Is Seaside/Squeak SEO-friendly? My website isn't showing up in any
search results on Google, Yahoo!, or MSN. Apparently, spiders have a hard
time scanning my website. Perhaps because of the "dynamic" nature of the
application?

Thanks,
Richard

2008/5/12, Richard Eng <[hidden email]>:
> Is Seaside/Squeak SEO-friendly?

No, in general anything that requires a session is not.

> My website isn't showing up in any search results on Google, Yahoo!, or MSN.

Due to its sheer market share, you can afford to concentrate on Google only.
To get more feedback, register for Google Webmaster Tools:
www.google.com/webmasters/tools

> Apparently, spiders have a hard time scanning my website. Perhaps because
> of the "dynamic" nature of the application?

Very likely; remember that each search result must be "linkable".

Cheers
Philippe

>> Is Seaside/Squeak SEO-friendly?
>
> No, in general anything that requires a session is not.

It can be made SEO-friendly, though. See Pier, with examples like
seaside.st, lukas-renggli.ch, etc.

Lukas

--
Lukas Renggli
http://www.lukas-renggli.ch

In reply to this post by Richard Eng
I'm having great difficulty loading Pier from SqueakMap. Lots of errors.
I've had a cursory look at whatever portion I have successfully loaded,
and it's very complicated. It would seem that it'll take me months to
understand the Pier code.

Is it possible to give me a few pointers that can quickly get me to what
I need to know? I don't have the time and resources to go wading through
Pier just to figure out how to make my app SEO-friendly.
Thanks,
Richard
> I'm having great difficulty loading Pier from SqueakMap. Lots of errors.
> I've had a cursory look at whatever portion I have successfully loaded,
> and it's very complicated. It would seem that it'll take me months to
> understand the Pier code.

Try the one-click image or start from a fresh 3.9 image; that usually works
without problems.

> Is it possible to give me a few pointers that can quickly get me to what
> I need to know? I don't have the time and resources to go wading through
> Pier just to figure out how to make my app SEO-friendly.

Have a look at #updateRoot: and #initialRequest: of WABrowser, which comes
with Seaside. This is simpler. What I wanted to point out with Pier is that
the websites are properly indexed by Google. The implementation is obscure
indeed, because I have to keep backward compatibility with existing code.

Lukas

--
Lukas Renggli
http://www.lukas-renggli.ch

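To make that concrete, here is a minimal sketch of those two hooks for a
Seaside 2.8-style root component. The /products path, #showProductNumbered:
and #selectedProductId are hypothetical names, and the sketch assumes the
Seaside 2.8-era WARequest>>url and WAUrl>>addToPath: messages; check the
selectors against your Seaside version before reusing it.

	MyRootComponent>>initialRequest: aRequest
		"Called for a request that arrives without a live session: rebuild the
		component state from the bookmarked or crawled URL, e.g. .../products/42.
		#showProductNumbered: is a hypothetical domain helper."
		| trailing |
		super initialRequest: aRequest.
		trailing := aRequest url copyAfterLast: $/.
		(trailing notEmpty and: [ trailing first isDigit ])
			ifTrue: [ self showProductNumbered: trailing asNumber ]

	MyRootComponent>>updateUrl: aUrl
		"Put the state back into every generated link, so each page keeps a
		stable, linkable URL instead of a purely session-relative one.
		#selectedProductId is a hypothetical accessor."
		super updateUrl: aUrl.
		self selectedProductId isNil ifTrue: [ ^ self ].
		aUrl addToPath: 'products'; addToPath: self selectedProductId printString

With both in place, a link that arrives without a valid session still ends
up on the right page, in a freshly created session.
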
In reply to this post by Richard Eng
When I try to load Pier into a fresh Squeak 3.9-7067 image, I always get a
warning that I need MADelegatorAccessor. I don't know where to get this
from, so I just proceed anyway. Then I get all kinds of errors.

I tried loading Magritte first, but it didn't help.

Richard

> When I try to load Pier into a fresh Squeak 3.9-7067 image, I always get a
> warning that I need MADelegatorAccessor. I don't know where to get this
> from, so I just proceed anyway. Then I get all kinds of errors.
>
> I tried loading Magritte first, but it didn't help.

There was something wrong with the package dependencies; I fixed that.
Ensure that you clear the cache (remove the sm-folder) before opening the
image, and that you "update the map from the net" before loading "Pier".

Although SqueakMap works here, I suggest loading from PackageUniverses.
That is usually more likely to get the dependencies right.

Lukas

--
Lukas Renggli
http://www.lukas-renggli.ch

In reply to this post by Lukas Renggli
I just looked at seaside.st and lukas-renggli.ch and all the internal links have session information.
Can you elaborate more on what it takes to make Pier SEO-friendly?
You can make entry points
http://www.hpi.uni-potsdam.de/swa/seaside//tutorial?_k=ryHLiysq&_s=cKcuDxudjEFewMRq#part4

marze!

On 05/10/2008, at 1:47, david54 wrote:
> I just looked at seaside.st and lukas-renggli.ch and all the internal
> links have session information.
> Can you elaborate more on what it takes to make Pier SEO-friendly?

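For readers wondering what an "entry point" looks like in code: a minimal
sketch, assuming the Seaside 2.8-era class-side registration method
#registerAsApplication:. ProductPage and the 'products' name are made up
for illustration.

	ProductPage class>>initialize
		"Register the component as its own application, so it gets a stable,
		session-free URL such as http://yourhost/seaside/products that
		crawlers and bookmarks can reach directly."
		self registerAsApplication: 'products'

In Seaside 3.0 and later the equivalent registration goes through WAAdmin
instead.
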
If I understand things correctly, using this technique will create a new session for each request.
-david

On Sat, Oct 4, 2008 at 7:01 PM, marze <[hidden email]> wrote:
> You can make entry points

In reply to this post by david54
> I just looked at seaside.st and lukas-renggli.ch and all the internal links
> have session information.
> Can you elaborate more on what it takes to make Pier SEO-friendly?

As a first step you need to make parts of your application restful using
#initialRequest: and #updateUrl:. This is something that comes built in
with Pier.

Then there are different possibilities:

1. Leave it like that. Google and other search engines will show links
with _s and _k from expired sessions, but since these links are restful
they silently create a new session at the given location.

2. Use the Google Webmaster Tools and make it strip _s and _k from indexed
results. This is what I am doing on my Pier site:
<http://www.google.com/search?q=site:lukas-renggli.ch>.

3. Provide a sitemap, an XML file listing all the restful entry points to
your application (a sketch of a generator follows below). AFAIK sitemaps
are considered by several major engines by now. Philippe wrote a sitemap
plugin for Pier.

HTH,
Lukas

--
Lukas Renggli
http://www.lukas-renggli.ch

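As a sketch of option 3: the sitemap format is plain XML, so it can be built
with ordinary stream code and served from any handler. #entryPointUrls is a
hypothetical helper answering the absolute, session-free URLs of your
restful pages; wiring the result up to /sitemap.xml is left to whatever
server or dispatcher you use.

	sitemapXml
		"Answer a sitemap listing the restful entry points, ready to be
		served as /sitemap.xml and submitted to the search engines."
		| stream |
		stream := WriteStream on: String new.
		stream
			nextPutAll: '<?xml version="1.0" encoding="UTF-8"?>'; cr;
			nextPutAll: '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">'; cr.
		self entryPointUrls do: [ :each |
			stream
				nextPutAll: '  <url><loc>';
				nextPutAll: each;
				nextPutAll: '</loc></url>'; cr ].
		stream nextPutAll: '</urlset>'.
		^ stream contents
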
In reply to this post by david54
2008/10/5, david54 <[hidden email]>:
> I just looked at seaside.st and lukas-renggli.ch and all the internal
> links have session information.
> Can you elaborate more on what it takes to make Pier SEO-friendly?

It works pretty well, actually:
http://www.google.ch/search?q=useful+site%3Aseaside.st

Cheers
Philippe

Excellent - I wonder why I keep seeing comments that insist that you need friendly URLs for SEO and bookmarking. Is that state information or is there something else?
On Sun, Oct 5, 2008 at 5:17 AM, Philippe Marschall <[hidden email]> wrote:
> 2008/10/5, david54 <[hidden email]>:

On Sun, 5 Oct 2008 10:22:18 -0500
"David Pennell" <[hidden email]> wrote: > Excellent - I wonder why I keep seeing comments that insist that you need > friendly URLs for SEO and bookmarking. Maybe because a) not all search engines allow parameters on indexed pages b) a) was true when the commenters learned their ways around SEO s. _______________________________________________ seaside mailing list [hidden email] http://lists.squeakfoundation.org/cgi-bin/mailman/listinfo/seaside |
On Oct 5, 2008, at 17:33, Stefan Schmiedl wrote:

> Maybe because
> a) not all search engines allow parameters on indexed pages, or
> b) (a) was true back when the commenters learned their way around SEO.

Unfortunately, it's not that simple. It's not that parameters are
unsupported, but that search engines have trouble dealing with them. If you
use parameters, the search engine cannot uniquely identify a page, because
URLs to the same page are always different (even inside the same session,
links to the same page always have a different URL because of the callback
keys).

If you look at http://www.google.ch/search?q=useful+site%3Aseaside.st you
see that the Trivia page occurs twice, even though Google is hiding similar
pages (if you make it show all results, you get a lot more duplicate
entries). This is problematic, not only because search results show up
multiple times, but also because

- having multiple URLs can dilute link popularity [1]
- Google may penalize the ranking of sites with duplicate content [2]

If you cannot get rid of the parameters, I think that what Lukas suggested
earlier in the thread makes sense (that is, provide a sitemap and use
Webmaster Tools to hide the parameters).

Cheers,
Adrian

[1] http://googlewebmastercentral.blogspot.com/2007/09/google-duplicate-content-caused-by-url.html
[2] http://www.google.com/support/webmasters/bin/answer.py?hl=en&answer=66359

In reply to this post by Lukas Renggli
Lukas Renggli wrote:
> 2. Use the Google Webmaster Tools and make it strip _s and _k from indexed
> results. This is what I am doing on my Pier site:
> <http://www.google.com/search?q=site:lukas-renggli.ch>.

OK, I'll bite (please excuse me dragging up an old post).

How exactly do you "use the Google Webmaster Tools and make it strip _s and
_k from indexed results"?

Nevin

In case you aren't aware, there is by now a still better way to deal with
duplicate content and Google indexing.

In reply to this post by Nevin Pratt
2009/4/26 Nevin Pratt <[hidden email]>:
> OK, I'll bite (please excuse me dragging up an old post).
>
> How exactly do you "use the Google Webmaster Tools and make it strip _s
> and _k from indexed results"?

IMO, the best is to use sitemaps (an XML plan of your site) and to submit
them to the search engines you like. I find it nice because you have
control over what gets indexed.

There's also the canonical option.

Cheers,

--
Cédrick

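The "canonical option" boils down to one extra tag in each page's <head>,
pointing at the session-free URL for that page. The sketch below only builds
the tag string, since how you add it to the head (for example from
#updateRoot:) differs between Seaside versions; #canonicalLinkTagFor: is a
hypothetical helper.

	canonicalLinkTagFor: aUrlString
		"Answer the canonical <link> tag for aUrlString, so that crawlers
		collapse the _s/_k variants of a page onto one indexed URL."
		^ '<link rel="canonical" href="', aUrlString, '"/>'

For example, (self canonicalLinkTagFor: 'http://www.example.com/products/42')
answers the tag for that page.
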