Commit 28f4d148 authored by Eugen Rochko

Update content/posts/2019-05-05_Blurhash/index.md

parent 9061c37e
Pipeline #731 passed with stages in 18 seconds
@@ -15,7 +15,7 @@ Mastodon allows you to put content warnings on post. These can be textual, hidin
Beyond providing visual protection against, say, co-workers looking over your shoulder to see something inappropriate on your screen, Mastodon also does not load said images or videos at all until you choose to unhide them, which helps if it's important that inappropriate content is not stored in your browser's cache. But there is a drawback. Every post with hidden media looks the same. They all blend together. Especially in public timelines, which provide a stream of all public posts that people use to explore Mastodon outside of their friend circle. As a result, posts with hidden media usually get fewer interactions.
-{{< figure src="blurhash-demo-cat.png" caption="Side-by-side comparison of the original picture and the generated blurhash" >}}
+{{< figure src="blurhash-demo-cat.png" caption="Side-by-side comparison of the original picture of Doris (cat) and the generated blurhash, which is the string `KJG8_@Dgx]_4V?xuyE%NRj`" >}}
Here comes Blurhash. Developed by [Dag Ågren][WAHa_06x36], who is behind the popular iOS app for Mastodon, [Toot!][toot], it is an algorithm that compresses a picture into a short string of letters. The string is so small that there is no problem with saving it in the database, instead of as an image file, and consequently sending it along with API responses. That means the string is available before any image files are loaded by the browser. You can see where this is going... When you decode the string back into an image, you get a gradient of the colors used in the original image.
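
To make the client-side step concrete, here is a minimal sketch of how a web client might turn such a string into a placeholder. It assumes the `blurhash` npm package and its `decode` function, and uses the example hash from the caption above; it is an illustration under those assumptions, not Mastodon's actual front-end code.

```ts
// Minimal sketch (not Mastodon's actual code): assumes the `blurhash`
// npm package, whose decode() returns raw RGBA pixels for a given hash.
import { decode } from "blurhash";

// Example hash from the figure caption above (Doris the cat).
const hash = "KJG8_@Dgx]_4V?xuyE%NRj";

// Decode at a small size; the browser scales the result up anyway,
// so a 32x32 placeholder is plenty.
const width = 32;
const height = 32;
const pixels = decode(hash, width, height);

// Paint the pixels onto a canvas that sits where the hidden image will go.
const canvas = document.createElement("canvas");
canvas.width = width;
canvas.height = height;
const ctx = canvas.getContext("2d");
if (ctx) {
  const imageData = ctx.createImageData(width, height);
  imageData.data.set(pixels);
  ctx.putImageData(imageData, 0, 0);
}
document.body.appendChild(canvas);
```

In practice the small canvas would simply be stretched over the media area with CSS, giving a blurred color impression of the picture while the real file is hidden or still downloading.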