Hugo is a gateway for discovering neat Go libraries. Version 0.104.0 introduced a color extraction method with lots of use cases. An easy one is generating basic gradient placeholders for images. The browser already has its own deferred/lazy loading logic, so fancy image gradients (on a static site) only require a few lines of pre-generated styles.
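As a rough sketch, a partial template could pull an image's dominant colors and inline them as a gradient behind the lazily loaded image (the file path and markup here are made-up placeholders):

{{/* Sketch: gradient placeholder from an image's dominant colors. */}}
{{ $img := resources.Get "images/cover.jpg" }}
{{ $colors := $img.Colors }}
{{ if ge (len $colors) 2 }}
<div style="background: linear-gradient({{ index $colors 0 }}, {{ index $colors 1 }})">
  <img src="{{ $img.RelPermalink }}" loading="lazy" alt="">
</div>
{{ end }}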
Reverse pagination is a counter-intuitive strategy for making links to older pages immutable/cacheable and bookmark-friendly: pages are numbered starting from the oldest content, so once a page fills up, its URL and contents never change; only the newest page keeps shifting. It's difficult to explain concisely, and while searching for a visual explanation I eventually arrived at an old article on paging. Reverse pagination has its gotchas, but then again pagination itself is one big gotcha :-) Well, it depends on the use case really.
I was reading web.dev recently and couldn't help but think that in tech circles/articles online it's easy to get the impression that Firefox is a major competing browser. It actually doesn't even register: Firefox holds an estimated 3% of global market share and doesn't even show up in mobile market-share estimates. Users implicitly use Safari (Apple) and/or some derivation of Chromium (Google Chrome). When was the last time you saw someone using Firefox? Not recently, if you're outside the tech bubble.
It's kinda neat how simple CSS animation rules are. I just wish the animation-delay property also allowed delays between iterations/intervals instead of only at the start. Interval delays would allow for writing drastically fewer keyframe rules.
/* Two overlapping infinite animations with different durations drift
   in and out of phase, approximating varied per-iteration timing. */
text-animation[hang] {
  animation: tilt-rightward 1.3s infinite, tilt-leftward 1.8s infinite;
}
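The missing keyframes might look something like this hypothetical sketch (names match the rule above; the timings and angles are made up), where idling through most of the cycle is what fakes a delay between iterations:

@keyframes tilt-rightward {
  /* Hold still for 90% of the cycle to simulate an interval delay. */
  0%, 90% { transform: rotate(0deg); }
  95%     { transform: rotate(3deg); }
  100%    { transform: rotate(0deg); }
}

@keyframes tilt-leftward {
  0%, 90% { transform: rotate(0deg); }
  95%     { transform: rotate(-3deg); }
  100%    { transform: rotate(0deg); }
}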
The temptation to bring in a bundler is oh so very great. Deno bundle is obviously not designed to bundle JS directly for the browser, but you can get away with it up to a certain point.
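For what it's worth, the invocation is just this (app.ts being a made-up entry point):

$ deno bundle app.ts bundle.js

The output is a single self-contained ES module, which is why it happens to work in a browser at all.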
Here's a musing: linking to external sites is probably one of the harder parts of blogging. Pages can go 404 without you knowing exactly why, or change even though they're supposed to be immutable.
Blogging with pointers to disparate sources requires checking for dead links and verifying that content relevance hasn't changed. The solution is to either archive everything (hard) or to not link at all (easy). High mutability is one reason why people take pictures of online content — it just works.
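A crude dead-link check goes a long way, though; here's a sketch with curl (urls.txt being a hypothetical list of outbound links, one per line):

$ while read -r url; do
>   echo "$(curl -s -o /dev/null -w '%{http_code}' "$url") $url"
> done < urls.txt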
Of course, in my case the social media applications within the Fediverse aren't what's most interesting; it's the generality of the underlying protocol. ActivityPub appears to have an easier time accommodating different use cases than other protocols.
The link to a Fediverse server list in a previous post died, but fediverse.party also shows the diverse types of applications that use the ActivityPub protocol.
Out of all the front-ends for Mastodon/Pleroma, the one with the funniest name is Soapbox (repo).
Insularity is a term that shows up in the philosophy of the Fediverse every so often. Generally, the more insular a community, the more populous/extreme/niche. Some Fediverse indexes calculate insularity rates.
Imagine similar stats for the web at large. Like: what's the domain-linking insularity within/between Facebook and other small/large islands? That would be extraordinary, especially for journalism, which is pretty much dead.
This goes without saying but… any sufficiently ambitious group of business persons wants more than to just be on the Internet; they want to own it completely. The Fediverse is no exception :)
Back when I looked into JSON Feed, the spec didn't link to a schema, but there's one in SchemaStore/schemastore and sonicdoe/jsonfeed. You can validate a feed with a tool like check-jsonschema:
$ check-jsonschema --schemafile jsonfeed-v1.1.json feed.json
ok -- validation done
The upside of the JSON Feed is a lot more human than technical, in the sense that it's a mapping of RSS/Atom to the JSON world. There are many people who prefer JSON over XML.
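For reference, a minimal JSON Feed 1.1 document looks roughly like this (the URLs are placeholders):

{
  "version": "https://jsonfeed.org/version/1.1",
  "title": "Example Blog",
  "home_page_url": "https://example.com/",
  "feed_url": "https://example.com/feed.json",
  "items": [
    {
      "id": "https://example.com/posts/hello",
      "url": "https://example.com/posts/hello",
      "content_text": "Hello, world.",
      "date_published": "2022-10-01T00:00:00Z"
    }
  ]
}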