Advanced SEO Question: coming up in the SERPs for millions of DB-generated pages

avatar33 · e-Hustler · Dec 5, 2009 · Calgary, AB
Suppose you have a website about tech stock prices, and you create a page that lets you extract the price of a stock at any point in time. So for instance the page would have 3 fields:

Stock name:

Date: (selector)

Price: (Displayed after submit button was clicked)

Now, the querying of the DB and the display of the result would happen on the same page, but suppose you want to come up on Google for searches like:

Tesla stock price January 18th 2013
Apple stock price March 9th 2001

...and all the millions of delicious longtail keyword permutations possible, how would you go about that? Do you need to create a page for each stock and each date, so 365 pages per stock per year? How the fuck do you do that anyway? And what do you do then? Stuff all the hyperlinks onto one page of the site? I'm sure we all agree that looks spammy and unprofessional. A hidden XML sitemap perhaps? How the hell do you do this without raising red flags?
 


URL rewrites. You have one template, so for example, domain.com/ticker/STOCK/DATE/

Anything sent to /ticker/ on the server gets picked up by a script, which extracts the stock & date and displays the appropriate info. Only one template is required. For example, domain.com/ticker/AAPL/2016-01-12/ would display Apple's share price on Jan 12th, 2016.

This is basic.
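The one-template approach above can be sketched in a few lines. This is a minimal illustration, not the poster's actual code: the URL pattern, the `PRICES` lookup table, and the price value are all hypothetical stand-ins for a real router and database.

```python
import re
from datetime import date

# One template for every /ticker/SYMBOL/YYYY-MM-DD/ URL: a single script
# parses the path, looks up the price, and fills the page.
TICKER_URL = re.compile(r"^/ticker/(?P<symbol>[A-Z.]{1,6})/(?P<day>\d{4}-\d{2}-\d{2})/$")

# Stand-in for the real database lookup (the value here is made up).
PRICES = {("AAPL", date(2016, 1, 12)): 99.96}

def render_ticker_page(path):
    """Return the page body for a ticker URL, or None for a 404."""
    m = TICKER_URL.match(path)
    if not m:
        return None
    symbol = m.group("symbol")
    day = date.fromisoformat(m.group("day"))
    price = PRICES.get((symbol, day))
    if price is None:
        return None  # unknown stock/date -> 404, so empty pages never get indexed
    return f"{symbol} closing price on {day:%b %d, %Y}: ${price:.2f}"

print(render_ticker_page("/ticker/AAPL/2016-01-12/"))
```

Returning a real 404 for unknown symbol/date combinations matters here: it keeps the crawlable URL space limited to pages that actually have data.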

And why aren't you over at apexforum.com?
 

Take the Yahoo API:
http://chartapi.finance.yahoo.com/instrument/1.0/tsla/chartdata;type=quote;range=10y/csv
http://chartapi.finance.yahoo.com/instrument/1.0/aapl/chartdata;type=quote;range=10y/csv

...and organize it into tables? Add links to news articles relevant to the stock from the respective dates? I guess if you make it too simple, Google is just going to show the prices itself.
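Turning that CSV feed into per-page tables could look something like this. The sample rows and the column order (date, close, high, low, open, volume) are assumptions about the old Yahoo chart API's format, and the numbers are invented for illustration:

```python
import csv
from io import StringIO

# Fake sample of daily-quote CSV rows, roughly in the old Yahoo chartdata
# layout: date,close,high,low,open,volume. Values are made up.
SAMPLE = """\
20130118,33.86,34.20,33.50,33.90,4500000
20130122,35.00,35.10,33.80,33.95,5100000
"""

def rows_to_table(text):
    """Build a {date_string: closing_price} lookup from the CSV body."""
    table = {}
    for row in csv.reader(StringIO(text)):
        day, close = row[0], float(row[1])
        table[day] = close
    return table

prices = rows_to_table(SAMPLE)
print(prices["20130118"])
```

From a table like this, each generated page can pull its one closing price plus surrounding context (the week's range, news links for that date) so the page carries more than a single number.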
 

TSLA Historical Prices | Tesla Motors, Inc. Stock - Yahoo! Finance
AAPL Historical Prices | Apple Inc. Stock - Yahoo! Finance

Apple Inc. (AAPL) Historical Prices & Data - NASDAQ.com
Tesla Motors, Inc. (TSLA) Historical Prices & Data - NASDAQ.com

I don't think you want to (or need to) generate a separate indexable page for each permutation; seems like a good way to get hit by Panda.
 
Here's a thought: Why would a search engine care about millions of pages on a website that have little unique content, and little external signalling that they're important?

(think about why this matters for 3rd party sites as well)
 
^Search engines SHOULD care, because I would be one of the only sites to offer this type of data. Trust me, the competition stinks in the niche I'm in. I used tech stocks as an example, but I'm in a whole other niche that is growing exponentially in popularity, and trust me, the type of data I have would be useful to a LOT of readers. It's just a matter of making a logical site structure that search engines would index and rank. So far I've always built sites where I got content writers to create each page. This is the first time I have a non-content site, which is almost entirely database driven, so I'm trying to figure out how to make this work and have millions of pages indexed (even if it takes years).
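On the "how do I get millions of pages indexed without a spammy link dump" question: the sitemaps.org protocol caps each sitemap file at 50,000 URLs, so large sites split their URLs across many sitemap files and tie them together with a sitemap index. A rough sketch of that split, with illustrative domain, symbols, and dates (not anyone's real site):

```python
import itertools

# The sitemaps.org protocol allows at most 50,000 URLs per sitemap file;
# a sitemap index file then lists all the individual sitemaps.
MAX_URLS = 50_000

def chunked(iterable, size):
    """Yield successive lists of up to `size` items."""
    it = iter(iterable)
    while chunk := list(itertools.islice(it, size)):
        yield chunk

def write_sitemaps(urls, prefix="sitemap"):
    """Return [(filename, xml_body), ...] plus the sitemap index XML."""
    files = []
    for i, chunk in enumerate(chunked(urls, MAX_URLS), start=1):
        body = "\n".join(f"  <url><loc>{u}</loc></url>" for u in chunk)
        # In production each body would be written to disk/gzip; returned here.
        files.append((f"{prefix}-{i}.xml", f"<urlset>\n{body}\n</urlset>"))
    index = "\n".join(f"  <sitemap><loc>https://example.com/{name}</loc></sitemap>"
                      for name, _ in files)
    return files, f"<sitemapindex>\n{index}\n</sitemapindex>"

urls = (f"https://example.com/ticker/AAPL/{d}/" for d in ("2016-01-12", "2016-01-13"))
files, index_xml = write_sitemaps(urls)
print(len(files), files[0][0])
```

Submitting only the index file to Search Console then exposes the whole URL space to crawlers without a single link-stuffed HTML page, though indexing millions of thin pages still depends on the content quality concerns raised above.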
 
Fixed

Advanced SEO Question: coming up in the SERPs for millions of DoucheBag-generated pages
 
This is super super easy level stuff even for basic web developers. I could build that in my sleep although I've graduated beyond projects like that and do real web apps now.

Pay some indian guy on upwork to do it, anyone on wickedfire will just steal your niche.
 

indian
quality

pick one.