Yes, I did search for the answer and did not see one.
I see posts where members state they have sites with thousands of pages of content, and more recently I saw a post where someone said they were testing a new strategy: they put up a site, threw in about 170 pages of content, and then started hitting it with the new strategy.
My question is: is there a trick to obtaining large amounts of non-duplicate content? Are they putting up 5 unique pages and then 165 spun duplicates? Just looking for some input.
I would love to "throw in 170 pages of content" on several of my sites. Should I just take a public domain article (or 1-5 unique ones that I write) and spin it/them at maximum settings?
Thank you in advance to anyone that can point me in the right direction.