It supports it, but it’s opt-in by apps.
Enabling compression is another option (though with a speed and size penalty); at least it’s user-visible.
made you look
There are different kinds of solar power generation: the photovoltaic panels that generate electricity directly, which we all know and love, and thermal solar. You’ll commonly see a small-scale version of the latter used on homes as a hot water system.
Scale it up, though, and you’ve got a system that can generate energy 24/7, as long as you’ve got enough thermal mass and sunlight.
I quite liked the locale in FC5, but the (nearly?) unavoidable captures the game would force on you when you did too much open world stuff annoyed the hell out of me.
Then I had the ending spoiled for me and I just got too annoyed at the story planners and never touched it again.
Then don’t get me started about how the www subdomain itself no longer makes sense. I get that the system was designed long before HTTP and the WWW took over the internet as basically the default, but if we had known that in advance it would’ve made sense to not try to push www in front of all website domains throughout the ’90s and early 2000s.
I have never understood why you can delegate a subdomain but not the root domain. I doubt it was a technical issue, because they added support for it recently via SVCB records (but maybe the technical concerns actually were fixed in the decades since).
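A minimal sketch of what the new record makes possible, assuming the third-party dnspython package (2.1 or later, which added SVCB/HTTPS support) and using example.com purely as a placeholder apex:

```python
import dns.resolver  # third-party: dnspython >= 2.1 is assumed for SVCB/HTTPS support

domain = "example.com"  # placeholder apex; substitute a zone that actually publishes an HTTPS record
try:
    # HTTPS is the SVCB-derived record type (65); an alias-mode record (priority 0)
    # points the bare apex at another name, which a CNAME at the apex was never allowed to do.
    for rdata in dns.resolver.resolve(domain, "HTTPS"):
        print(rdata)
except dns.resolver.NoAnswer:
    print(f"{domain} publishes no HTTPS record at the apex")
```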
Chromium had it behind a flag for a while, but if there were security or serious enough performance concerns then it would make sense to remove it and wait for the jpeg-xl encoder/decoder situation to change.
Adobe announced they were supporting it (in Camera Raw); that’s when the Chrome team announced they were removing it (due to a “lack of industry interest”).
They’re “file like” in the sense that they’re exposed as an fd, but they’re not exposed via the filesystem at all (unlike e.g. unix sockets), and the existing API is just mapped over the sockets one (i.e. write() instead of send(), read() instead of recv()). There’s also a difference in how you create them: you open() a file, but connect() a socket, etc.
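A minimal sketch of that asymmetry (Python on a Unix-like system; the host and the hand-written HTTP request are just placeholders): the connection has to be created the socket way, but once it exists as an fd the generic file calls work on it too.

```python
import os
import socket

# There's no path you can open() to get a TCP connection; it has to be made with the socket API.
sock = socket.create_connection(("example.com", 80))  # placeholder host
fd = sock.fileno()

# ...but once you have the fd, the generic calls are interchangeable with the
# socket-specific ones: write() instead of send(), read() instead of recv().
os.write(fd, b"GET / HTTP/1.0\r\nHost: example.com\r\n\r\n")
print(os.read(fd, 1024))

sock.close()
```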
(As an aside, it turns out Bash has its own virtual file-based wrapper around sockets, so you can do things like cat a remote port with Bash, something you can do natively in Plan 9.)
Really it just shows that “everything is a file” didn’t stand up in practice; there’s more stuff that needs special treatment than doesn’t (e.g. interacting with TTYs also has special APIs). It makes more sense to have a well-designed dedicated API than a generic catch-all one.
RFC 3339 is a simplified profile of ISO 8601 that only covers the YYYY-MM-DD style of formatting; if you only ever use that format and avoid things like “2024-W36”, the two are mostly interchangeable.
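A rough illustration of that shared subset (the values are made up):

```python
from datetime import datetime

# Shapes valid under both RFC 3339 and ISO 8601:
both = ["2024-09-05", "2024-09-05T13:45:30+00:00"]

# An ISO 8601-only shape (week date) that RFC 3339 deliberately leaves out:
iso_only = "2024-W36"

# Sticking to the shared subset means one strict format string covers your timestamps:
print(datetime.strptime(both[1], "%Y-%m-%dT%H:%M:%S%z"))
```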
Plan 9 even extended the “everything is a file” philosophy to networking, unlike everybody else that used sockets instead.
Existing JPEG files (which are the vast, vast majority of images currently on the web and in people’s own libraries/catalogs) can be losslessly compressed even further, with zero loss of quality. This alone means there are benefits to adoption, if nothing else for archival and serving old stuff.
Funny thing is, there was talk on the Chrome bug tracker of using just this ability transparently at the HTTP layer (like gzip/brotli compression), but they’re so set on pushing their AVIF format that they backed away from it.
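A sketch of the lossless recompression mentioned above, assuming the reference libjxl command-line tools (cjxl and djxl) are installed and using placeholder file names:

```python
import subprocess

# cjxl recompresses an existing JPEG losslessly by default, and djxl can later
# reconstruct the original JPEG bit for bit from the resulting .jxl file.
subprocess.run(["cjxl", "photo.jpg", "photo.jxl"], check=True)
subprocess.run(["djxl", "photo.jxl", "photo_restored.jpg"], check=True)
```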
You can’t do normal BitTorrent in browsers; there’s no support for the plain sockets you’d need to communicate with other peers. WebTorrent is technically a new protocol that implements the BitTorrent semantics over stuff the browsers do provide (so you can proxy between the different swarms; that’s the “hybrid” nodes in the image on the WebTorrent page).
But it turns out it’s all a moot point, since PeerTube removed WebTorrent support anyway in favour of their own P2P system
Edit: Ok, so I misunderstood, and it seems like it’s a bit complicated. The server can (it’s disabled by default) use WebTorrent to import videos; the client still uses the WebTorrent trackers to find peers, but uses a different protocol to actually share the video data.
There’s a tool that provides the ability to automatically seed videos, but development has stalled because no up-to-date client will ever make use of it.
I think the one remaining use is the “download as torrent” option, but even then that’s just using a web seed, so it’s just an alternative way to download the video.
Unfortunately WebTorrent isn’t compatible with normal BitTorrent, so unless you’re using a client that specifically supports it, you’re not helping out any PeerTube clients
His personal LLC is called “Excession”; considering some of the plot points in that book, I doubt he enjoyed it at all. It’s just “nerd set dressing”.
At the time it was just an ad-lib by Jason Isaacs; I’m guessing he wished on a monkey’s paw for it to make sense in context.
What’s the problem with that, though? Systems like that are pretty much guaranteed to be isolated from the internet.
Because things break down eventually, and when it comes time to buy replacement parts you discover that they’re effectively impossible to find. Then instead of having a nice, planned transition period you’ve got like a weekend to cobble together something to get it working again.
Yep, our center-left government recently announced plans to keep using natural gas for at least another 25 years
But it’s ok, because we’ll work out carbon capture in the future! Which is the exact same notion that our previous right wing government based their policy on.
Ideally you don’t directly ship the code it outputs; you use it as a starting point instead of rewriting from scratch, and then slowly clean it up.
Mozilla, for example, used it for the initial port of qcms (the colour management library they wrote for Firefox), then slowly edited the code into idiomatic Rust. Compare that to something like librsvg, which did a function-by-function port.
c2rust: Am I a joke to you?
It’s a tad out of date, but the Second Doctor claims he received a medical degree after studying under Joseph Lister in 1888.