DefederateLemmyMl

  • Gen𝕏
  • Engineer ⚙
  • Techie 💻
  • Linux user 🐧
  • Ukraine supporter 🇺🇦
  • Pro science 💉
  • Dutch speaker
  • 2 Posts
  • 409 Comments
Joined 11 months ago
Cake day: August 8th, 2023

  • most PCs by that time had built-in MIDI synthesizers

    Built-in? You had AdLib cards for FM synthesis, but they were never built-in, and most PCs didn’t even have them. AdLib cards used the Yamaha OPL2 or OPL3 chip.

    Along came Creative Labs with their AWE32, a synthesizer card that used wavetable synthesis instead of FM

    You are skipping a very important part here: cards that could output digital audio. The early Sound Blaster cards were pioneers here (SB 1.0, SB 2.0, SB Pro, SB16). The SB16, for example, was waaaaay more popular than the AWE32 ever was, even though it still used OPL3-based FM synth for music. It’s the reason most soundcards in the 90s were “Sound Blaster compatible”.

    Digital audio meant that you could have recorded digital sound effects in games. So when you fired the shotgun in Doom to kill demons, it would play actual recorded sound effects of shotgun blasts and demon grunts instead of bleeps or something synthesized, and it was awesome. This was the game changer that made soundcards popular, not wavetable.

    The wavetable cards, I feel, were more of a sideshow. They were interesting, and a nice upgrade, especially if you composed music. They never really took off, though, and they soon became obsolete as games switched from MIDI-based music to digital audio. Quake 1, for example, already had its music as audio tracks on the CD-ROM, making wavetable synthesis irrelevant.

    BTW, I also feel like you are selling FM synthesis short. The OPL chips kinda sucked for plain MIDI, especially with the Windows drivers, and they were never good at reproducing real instrument sounds. But if you knew how to program them and treated the chip as its own instrument rather than as a tool to emulate real-world instruments, they were capable of producing beautiful electronic music with a very distinctive sound signature. You should check out some of the AdLib trackers, like AdTrack2, for examples. Many games also had beautiful FM-synthesized soundtracks, and I often preferred them over the AWE32 wavetable versions (e.g. Doom, Descent, Dune).
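
    A minimal sketch of what “treating the chip as its own instrument” looks like in practice, assuming a Borland-style DOS compiler (outportb, inportb and delay from dos.h); the register values are illustrative and not taken from any particular game or tracker:

        /* Program OPL2 channel 0 as a simple two-operator FM voice and key on
         * a note. The AdLib card sits at I/O ports 0x388 (address) and 0x389
         * (data); every register write needs a short settle delay. */
        #include <dos.h>

        #define OPL_ADDR 0x388
        #define OPL_DATA 0x389

        static void opl_write(unsigned char reg, unsigned char val)
        {
            int i;
            outportb(OPL_ADDR, reg);
            for (i = 0; i < 6; i++) inportb(OPL_ADDR);   /* ~3.3 us settle time */
            outportb(OPL_DATA, val);
            for (i = 0; i < 35; i++) inportb(OPL_ADDR);  /* ~23 us settle time */
        }

        int main(void)
        {
            /* Channel 0 uses operator register offsets 0x00 (modulator) and 0x03 (carrier). */
            opl_write(0x20, 0x01);  /* modulator: frequency multiplier 1 */
            opl_write(0x23, 0x01);  /* carrier:   frequency multiplier 1 */
            opl_write(0x40, 0x10);  /* modulator level: controls how bright the timbre is */
            opl_write(0x43, 0x00);  /* carrier at full volume */
            opl_write(0x60, 0xF0);  /* modulator attack/decay */
            opl_write(0x63, 0xF0);  /* carrier   attack/decay */
            opl_write(0x80, 0x77);  /* modulator sustain/release */
            opl_write(0x83, 0x77);  /* carrier   sustain/release */

            /* F-number 0x244 with block 4 is roughly 440 Hz on the OPL2 clock. */
            opl_write(0xA0, 0x44);  /* F-number low byte */
            opl_write(0xB0, 0x32);  /* key-on | block 4 | F-number high bits */

            delay(1000);            /* let the note ring for a second */
            opl_write(0xB0, 0x12);  /* key off */
            return 0;
        }

    An OPL2 tune is ultimately just a stream of register writes like these; the “instrument” is the modulator/carrier register set itself, which is why the chip rewards being programmed directly instead of through a generic MIDI patch map.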




  • if it’s good enough for the majority of historians

    It isn’t. Historians would love to have independent evidence of the existence and crucifixion of Jesus, but there isn’t any… so most historians refrain from taking a position one way or the other. The ones that do take a position have to make do with what little objective information they have, and the best they can come up with is: well, because of this embarrassing detail, it’s more likely that he did exist and was crucified than that he didn’t, because why would they make that up?

    That’s rather weak evidence, and far from “proof”.

    Not sure why you’d need more

    Well, for one, because the more prominent people who have studied this have a vested interest in wanting it to be true. For example, John P. Meier, who posited the criterion of embarrassment that I outlined in my previous comment, isn’t really a historian but a Catholic priest, a professor of theology (not history), and a writer of books on the subject.


  • We are talking about addresses, not counters. An inherently hierarchical one at that. If you don’t use the bits you are actually wasting them.

    Bullshit.

    I have a 64-bit computer that can address up to 18.4 exabytes, but it only has 32GB of RAM, so I will never use the vast majority of that address space. Am I “wasting” it?

    All the 128 bits are used in IPv6. ;)

    Yes, they are all “used”, but you don’t need them. We are not using 2^128 IP addresses in the world. In your own terminology: you are using 4 registers for a 2-register problem. That is much more wasteful in terms of hardware than using 40 bits to represent an IP address and wasting 24 bits.
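
    A quick sketch of the size arithmetic in this thread (the 18.4 exabytes figure and the words-per-address comparison); the 40-bit address type is hypothetical, used only for comparison:

        /* How far 64 bits of byte addressing goes, and how many 64-bit words
         * each address width needs. */
        #include <stdint.h>
        #include <stdio.h>

        int main(void)
        {
            /* 2^64 bytes expressed in decimal exabytes: about 18.4 EB. */
            printf("64-bit address space: %.1f exabytes\n",
                   (double)UINT64_MAX / 1e18);

            /* Storage per address on a 64-bit machine. */
            printf("IPv6 (128 bits): %d x 64-bit words\n", 128 / 64);
            printf("hypothetical 40-bit address: 1 x 64-bit word, %d bits unused\n",
                   64 - 40);
            return 0;
        }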


  • you are wasting 24 bits of a 64-bit register

    You’re not “wasting” them if you just don’t need the extra bits. Are you wasting a 32-bit integer if your program only ever counts up to 1000000?

    Even so, when you do start to need them, you can gradually make the other bits available in the form of more octets. You can simply define it so that a.b.c.d.e = 0.a.b.c.d.e = 0.0.a.b.c.d.e = 0.0.0.a.b.c.d.e (a parsing sketch follows at the end of this comment).

    Recall that IPv6 came out just a year before the Nintendo 64

    If you’re worried about wasting registers, it makes even less sense to switch from a 32-bit address space to a 128-bit one in one go.

    Anyway, your explanation is a perfect example of the “second-system effect” at work. You get all caught up in the mistakes of the first system, in this case the lack of addressing bits, and then you go all out to correct those mistakes in your second system, giving it all the bits humanity could ever need before the heat death of the universe, while ignoring the real-world implications of your choices. And now you are surprised that nobody wants to use your 128-bit abomination.
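
    A minimal sketch of the zero-extension idea above, assuming a hypothetical dotted format with a variable number of octets (not an existing protocol, just an illustration): “a.b.c.d.e” and “0.a.b.c.d.e” parse to the same 64-bit value.

        /* Parse up to 8 dot-separated octets into a uint64_t, least significant
         * octet last, so missing leading octets are implicitly zero. */
        #include <stdint.h>
        #include <stdio.h>
        #include <stdlib.h>
        #include <string.h>

        static int parse_addr(const char *s, uint64_t *out)
        {
            uint64_t value = 0;
            int octets = 0;
            char buf[64];
            char *tok;

            if (strlen(s) >= sizeof(buf)) return -1;
            strcpy(buf, s);

            for (tok = strtok(buf, "."); tok; tok = strtok(NULL, ".")) {
                long o = strtol(tok, NULL, 10);
                if (o < 0 || o > 255 || ++octets > 8) return -1;
                value = (value << 8) | (uint64_t)o;  /* leading octets default to zero */
            }
            *out = value;
            return 0;
        }

        int main(void)
        {
            const char *examples[] = { "192.168.1.1", "0.192.168.1.1", "5.192.168.1.1" };
            uint64_t a;
            size_t i;

            for (i = 0; i < sizeof(examples) / sizeof(examples[0]); i++)
                if (parse_addr(examples[i], &a) == 0)
                    printf("%-16s -> 0x%010llx\n", examples[i], (unsigned long long)a);
            return 0;
        }

    The first two inputs map to the same value, which is the backward-compatibility property described above: old 4-octet addresses simply live in the all-zero prefix of the wider space.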


  • Hmm, I can’t say that I’ve ever noticed this. I have a 16-core 3950X CPU and I often do video re-encoding with ffmpeg on all cores, and occasionally compile software on all cores too. I don’t notice it in the GUI’s responsiveness at all.

    Are you absolutely sure it’s not I/O related? A compile usually does a lot of random I/O as well. What kind of drive are you running this on? Is it the same drive your home directory is on?

    Way back when I still had a much weaker 4-core CPU, I had issues with window and mouse lag when running certain heavy jobs as well, and it turned out that using ionice helped me a lot more than using nice (a sketch follows at the end of this comment).

    I also remember that fairly recently there was a KDE/Plasma stutter bug caused by it constantly reading from ~/.cache. Brodie Robertson talked about it: https://www.youtube.com/watch?v=sCoioLCT5_o
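
    A minimal sketch of the ionice-over-nice point, assuming Linux and a process that demotes itself before doing heavy work; this is roughly what running `ionice -c3 nice -n19 <cmd>` from the shell does, not anyone’s actual setup:

        /* Lower both CPU and I/O priority of the current process, then exec the
         * heavy job. ioprio_set has no glibc wrapper, so it goes through syscall(). */
        #define _GNU_SOURCE
        #include <stdio.h>
        #include <sys/resource.h>
        #include <sys/syscall.h>
        #include <unistd.h>

        #define IOPRIO_CLASS_IDLE   3   /* only gets disk time nobody else wants */
        #define IOPRIO_CLASS_SHIFT  13
        #define IOPRIO_WHO_PROCESS  1
        #define IOPRIO_PRIO_VALUE(cls, data) (((cls) << IOPRIO_CLASS_SHIFT) | (data))

        int main(void)
        {
            /* CPU: lowest scheduling priority (niceness 19) for this process. */
            if (setpriority(PRIO_PROCESS, 0, 19) != 0)
                perror("setpriority");

            /* I/O: idle class, so random reads/writes don't starve the desktop. */
            if (syscall(SYS_ioprio_set, IOPRIO_WHO_PROCESS, 0,
                        IOPRIO_PRIO_VALUE(IOPRIO_CLASS_IDLE, 0)) != 0)
                perror("ioprio_set");

            /* Replace with the actual heavy job, e.g. a compile or an ffmpeg run. */
            execlp("make", "make", "-j16", (char *)NULL);
            perror("execlp");
            return 1;
        }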