I'll have to play with it. I haven't looked through the code, but I'm pretty sure the default Lemmy UI isn't SEO-friendly either (though kbin probably is), so it would be cool to have a tool that fills that gap.
But maybe it doesn't make sense for it to be your tool. I think it would be cool to have a Lemmy feature where it serves a cached, static page when it detects a bot crawling it or a client that doesn't support JavaScript, and serves the regular site otherwise. Maybe your tool is the right option, or maybe someone should build a different one. IDK.
You're right, this tool isn't designed to address that problem and is ill-suited for it.
Lemmy should definitely render a static page and then "hydrate" it with JavaScript in the client. This is a common problem with modern JS apps. SSR (server-side rendering) is the solution, but it can be very complex. You really need to build the whole app with SSR in mind; it's not something you can easily bolt on as a feature at the end.
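For illustration, here's a minimal sketch of the render-then-hydrate pattern using React as a stand-in (lemmy-ui actually uses Inferno, but the idea is the same); the `App` component and route are made up:

```typescript
// server.ts — hypothetical Express route that renders the app to HTML.
import express from "express";
import { createElement } from "react";
import { renderToString } from "react-dom/server";
import { App } from "./App"; // hypothetical root component

const app = express();

app.get("*", (_req, res) => {
  // Produce plain HTML on the server so crawlers see real content.
  const html = renderToString(createElement(App));
  res.send(`<!DOCTYPE html>
<html>
  <body>
    <div id="root">${html}</div>
    <script src="/client.js"></script>
  </body>
</html>`);
});

app.listen(3000);
```

```typescript
// client.ts — attach event handlers to the server-rendered markup
// instead of re-rendering it from scratch.
import { createElement } from "react";
import { hydrateRoot } from "react-dom/client";
import { App } from "./App";

hydrateRoot(document.getElementById("root")!, createElement(App));
```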
I think we could get away with sniffing the user agent and serving static content instead when the request looks like it's from a bot.
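Something like this Express middleware sketch, for example; the bot regex and the `getCachedStaticPage()` helper are purely illustrative:

```typescript
// Rough sketch of user-agent sniffing in Express middleware.
import express from "express";

const BOT_PATTERN = /bot|crawler|spider|googlebot|bingbot|duckduckbot/i;

// Hypothetical lookup into a cache of pre-rendered static pages.
async function getCachedStaticPage(path: string): Promise<string | null> {
  return null; // placeholder: fetch from disk, Redis, etc.
}

const app = express();

app.use(async (req, res, next) => {
  const ua = req.get("User-Agent") ?? "";
  if (BOT_PATTERN.test(ua)) {
    const page = await getCachedStaticPage(req.path);
    if (page) {
      res.type("html").send(page); // serve the static version to crawlers
      return;
    }
  }
  next(); // humans (and cache misses) get the regular JS app
});
```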
I'm a full-stack dev by day, but everything I've worked on has been a web app that doesn't need SEO, so I'm not exactly sure how that works in practice. But presumably we could periodically generate and cache a basic web page for each post, for use by bots and perhaps for accessibility (e.g. very basic HTML that's just good enough to use with links or something). It would have the same content as the main page, but none of the JS or CSS.
It shouldn't be too hard to render with a templating library.
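As a rough sketch of what that could look like, here's a bare-bones post page rendered with Handlebars; the `Post` shape and field names are made up, not Lemmy's actual API:

```typescript
// Render a basic, JS- and CSS-free HTML page for a post.
// The Post interface is hypothetical, not Lemmy's real API shape.
import Handlebars from "handlebars";

interface Post {
  title: string;
  author: string;
  body: string;
  comments: { author: string; body: string }[];
}

const template = Handlebars.compile(`<!DOCTYPE html>
<html>
  <body>
    <h1>{{title}}</h1>
    <p>by {{author}}</p>
    <article>{{body}}</article>
    <ul>
      {{#each comments}}
      <li><strong>{{author}}</strong>: {{body}}</li>
      {{/each}}
    </ul>
  </body>
</html>`);

// The output could be regenerated periodically, cached, and served
// by the bot-detection middleware sketched above.
export function renderStaticPost(post: Post): string {
  return template(post);
}
```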