

I realise you have to be somewhat off your rocker to be a billionaire CEO, but Pat is showing more of that than I expected here.


To make the desktop experience bearable: AltTab, Forklift, Rectangle, Ukelele, MonitorControl, Amphetamine, Firefox, Thunderbird, qView and duti to set the latter three up as the defaults.
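For what it’s worth, the duti part looks roughly like this; the bundle identifiers are from memory, so double-check them (e.g. with osascript -e 'id of app "Firefox"') before relying on it:

    # Firefox for web content and http/https links
    duti -s org.mozilla.firefox public.html all
    duti -s org.mozilla.firefox http
    duti -s org.mozilla.firefox https
    # Thunderbird for mailto links
    duti -s org.mozilla.thunderbird mailto
    # qView works the same way once you have looked up its bundle id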
As a package manager I’m pretty happy with nix-darwin: all the CLI tools come from there, and what isn’t packaged, Wireshark for example, I get through my nix-controlled Homebrew.
Coming from a Linux userland you might want to replace some coreutils packages with their GNU variants. For example, I ran into one case where GNU grep was much faster than the BSD version preinstalled in macOS.
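For reference, the relevant bits of such a config look roughly like this (written from memory, so treat it as a sketch rather than copy-paste material):

    # nix-darwin configuration snippet
    environment.systemPackages = with pkgs; [
      coreutils   # GNU coreutils
      gnugrep     # the faster grep mentioned above
      gnused
      findutils
    ];

    homebrew = {
      enable = true;
      casks = [ "wireshark" ];   # things nixpkgs doesn't cover well on darwin
    };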
What I haven’t found a good solution for yet is filesystem support: both NTFS and ext4 are missing. I currently have a Linux VM just for that. I think Paragon sells a driver; I’ve been meaning to look into it more, but haven’t.
Edit: To be fair to macOS, the bundled app called Preview is a pretty good PDF reader in my view.
PS: If you ever need to use dd on macOS, be aware that there are /dev/rdisk handles alongside the /dev/disk ones, for unbuffered access. It’s significantly faster for dd shoveling.
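Something like this, where diskN is whatever diskutil list shows for your target and image.img is a placeholder:

    diskutil list                               # find the right disk number
    diskutil unmountDisk /dev/diskN             # volumes must be unmounted for dd
    sudo dd if=image.img of=/dev/rdiskN bs=1m   # note the r: raw device, much faster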
PPS: You will probably have to turn off what they call “natural” scroll. macOS inverts the default for some reason.
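If you prefer flipping it from the terminal rather than System Settings, I believe the underlying preference is this one (may need a logout to fully apply):

    defaults write -g com.apple.swipescrolldirection -bool false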


No, 27.38 years
> when I look at Gnome I don’t doubt for a second where I want to be
Yeah me neither, from the other side, lol


I’m also not familiar with how these things work. But it looks like the problematic commit was reverted:


All these naysayers in the comments here… It’s obvious you have to keep the development pipeline moving. Just because we have one free codec at the stage of hardware support now does not mean the world can stop. There are always multiple codecs out there at various stages of adoption; that’s just normal.
For me that worked, about 9 hours ago. Maybe there is more load now that the Americas are awake?
Can NVENC do dual-pass encoding these days?


Yes, movie people complain that more than 24 fps looks like soap operas (because digital TV studio cameras moved to 60 fps first).


I think it’s all performative bullshit, not good policy.
Some decision maker has to appear innovative to his superiors, so he decides to have some number of locations assigned to a trial group and some bullshit installed. Even if it fails, as long as he finds the right moment to start appearing critical of the experiment, he can still pull off his play. After all, moving fast and failing fast are also virtues in modern corporate bullshit lingo.


By their admins setting the PreferCloudSaveLocations value under HKEY_CURRENT_USER\Software\Policies\Microsoft\Office\16.0\Common\General to 0 via GPO, probably.
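Untested, but if you wanted to set the same thing on a single machine outside of GPO, it should be something along these lines:

    reg add "HKCU\Software\Policies\Microsoft\Office\16.0\Common\General" /v PreferCloudSaveLocations /t REG_DWORD /d 0 /f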


Not sure about the encoding
Right click on video -> Stats for Nerds


> If you look at the desktop, there is AMD and there is Apple silicon
You can get workstations with Ampere Altra CPUs that use an ARM ISA. It’s not significant in the market, more of a server CPU put in a desktop for developers, but it provides a starting point, from which you could cut down the core count and try to boost the clocks.
There is also the Qualcomm Snapdragon X Plus with some laptops on the market from mainstream brands already (Asus Zenbook A14, Lenovo ThinkPad T14s Gen 6, Dell Inspiron 5441). That conversely could probably scale up to a desktop design fairly quickly.
You’re right that we’re not there, but I don’t think we’re that far off either. If Intel keeled over there would be a race to fill the gap and they wouldn’t leave the market to AMD alone.


Yeah if you build a RISC processor directly you can just save the area needed for instruction decode.


I don’t think the centralised approach works either. If you bake that grouping metadata of individual popular pages into Firefox you have an issue with keeping it current if page content changes. And you have a difficult trade-off between covering enough pages vs not blowing up the size too much. And the approach can’t work for deep web pages, e.g. anything people can only see when logged in.
Ignoring all that: the groupings you could pre-process would be static, determined over some assumed average user behaviour, not an actual clustering of a specific user’s themes. Take some hardcore Warhammer 40k fan with tabs on minis and painting techniques and rulebooks and fan media: apply the static grouping and it all goes into “Warhammer”. If you ran it locally, however, it might come up with “Painting”, “Figures”, “Rules”, “Fanart” or whatever. It would produce a more fine-grained clustering for someone who is deep into a specific niche interest, and a coarser one otherwise.
So I think fundamentally it’s correct to cluster locally and dynamically for a usable result. They need to make it opt-in, and efficient enough. Or better yet they could just abandon the idea because it’s ultimately not that much use compared to the required inference cost.


Sounds more like they are maybe using ML classifiers on all the communications they are spying on by conventional means. To me that’s not the same as using AI to spy but whatever.


Sure there are a few everywhere, but the big gaps are the issue.
For example, in your screenshot, if you zoom in on Poitiers you’ll see there are none there, only in the two northern neighbouring communes, Neuville-de-Poitou and Jaunay-Clan. Similarly for Nantes: none there, they are all in Saint-Sébastien-sur-Loire and Thouaré-sur-Loire; the centre and all the other suburbs have nothing.


> if that means where ever you go there will be free internet at hand that can be relied upon
Yeah if that were the case it could be useful. Unfortunately the map looks pretty bad: https://wifi4eu.ec.europa.eu/#/list-accesspoints


That was actually preinstalled by IT at my workplace! It’s a pretty nice little archiver. Seconded.