Pop has not received feature updates for years because the dev team is focused on implementing Cosmic.
Given the overall progress of Linux desktop environments, this might have led many users to switch away from Pop.
Wake me up when something happens
Interesting that you don't find the quality that good. I've found it remarkably good so far, and for my chinos, shirts, and sweaters I've switched almost entirely to Uniqlo.
Uniqlo has lots of different basics, including nice thick cotton ones.
I also recommend Deep Learning by Goodfellow. It has a long chapter on the mathematical foundations, which I found very insightful as supplementary reading during my studies. Also, the online version is free.
Ok that was dark.
They also love CPU.
For a dev machine, I wouldn’t go below 16GB RAM and at least a current-generation i5 (or Ryzen equivalent).
I was in the same boat as you, i.e., using the GPU during my studies. My approach is to optimise for the most frequent use case, i.e., deep learning.
IMO going with NVIDIA will save you so much worry and frustration that it clearly outweighs the downside of worse Wayland support compared with AMD.
When you have tough university assignments/projects, you want to focus on the actual problem instead of debugging/compiling libraries for use with AMD. I am sure that with a bit of work many libraries can be made to work with AMD, but apparently it is still a pain oftentimes.
So I strongly suggest choosing NVIDIA. Disclaimer: I have not used AMD for deep learning yet, but I have been monitoring the development of AMD support because I would like to switch to AMD.
Btw. I found Pop!OS to be very nice for both “regular” university work and all computer science tasks.
Soo…what’s the hash?
What exactly happened? The reasoning in the graphic does not tell me much. I only saw the summary, which listed him as second. He seemed to have finished his lap before the red flag, no?
I had the pleasure of conducting research into self-supervised learning (SSL) for computer vision.
What stood out to me was the simplicity of the SSL algorithms combined with the astonishing performance of self-supervised pre-trained models after supervised fine-tuning.
Also the fact that SSL works across tasks and domains, e.g., text generation, image generation, semantic segmentation…
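To illustrate that simplicity, here is a minimal sketch of one popular SSL objective for vision, a SimCLR-style contrastive (NT-Xent) loss. The function name, shapes, and temperature value are my own illustration, not anything from the original comments; real implementations add augmentation pipelines, projection heads, and GPU batching on top of this core idea.

```python
import numpy as np

def nt_xent_loss(z1, z2, temperature=0.5):
    """SimCLR-style contrastive loss (illustrative sketch).

    z1, z2: (N, D) embeddings of two augmented views of the same
    N images. Row i of z1 and row i of z2 form a positive pair;
    every other row in the batch serves as a negative.
    """
    # L2-normalise so dot products become cosine similarities
    z = np.concatenate([z1, z2], axis=0)                 # (2N, D)
    z = z / np.linalg.norm(z, axis=1, keepdims=True)
    sim = z @ z.T / temperature                          # (2N, 2N)
    np.fill_diagonal(sim, -np.inf)                       # exclude self-pairs

    n = z1.shape[0]
    # the positive partner of row i is row i+n (and vice versa)
    pos = np.concatenate([np.arange(n, 2 * n), np.arange(n)])

    # cross-entropy over each row: -log softmax(sim)[i, pos[i]]
    logsumexp = np.log(np.exp(sim).sum(axis=1))
    return float((logsumexp - sim[np.arange(2 * n), pos]).mean())
```

The whole "algorithm" is essentially a softmax cross-entropy over pairwise similarities, which is what made the field's results feel so surprising to me: no labels anywhere, yet the learned features transfer well to downstream supervised tasks.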
So nice to have Nico alongside Crofty. I love the insight he gives, really adds a lot to the race.