I feel like I’m about as computer savvy as most of Gen Z. Born in ’91, but we were poor, so it was the family Dell (that I wasn’t allowed to do much with*) until 2008. I got my first laptop in 2009**, it broke almost immediately because poor and cheap, then I got my first smartphone (a T-Mobile G1) in 2010 and basically didn’t touch a laptop again until I started school in 2020.
I basically started over from scratch at that point. Now I run Fedora full time and made myself learn some basic stuff, but I’d still consider myself pretty tech illiterate.
*Because my brother was caught looking at porn, so computer time was severely cut back. Then I was caught sending sexy messages to someone. The final nail in the coffin was when I tried to dual-boot it with some Linux distro (I don’t remember which), borked it, and we had to wipe the hard drive.
**Technically I had a netbook before this, around ’07/’08, that I used Wubi to install Ubuntu on, and I loved it. But I never got more than browser-level into it.
Nope. We shed a lot of mentor-types in the great layoffs after Y2K, and a generation of nerds came up without any oral history and then passed that gap on to their successors.
What they don’t know they don’t know is not only what best practice is, but why it’s best practice. And there’s little demonstrated effort to adhere to it.
I look over installation docs that do very, VERY bad things, for instance: build processes with no artifact validation, a toxic cargo chain, builds in prod, and so much more.
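For the curious, "artifact validation" can be as simple as checking the thing you downloaded against a checksum the project publishes somewhere you trust, before you build or install it. A minimal sketch of the idea (the filename and checksum here are made up, not from any real project):

```python
# Minimal sketch: verify a downloaded release artifact against a
# published SHA-256 checksum before building/installing anything.
import hashlib
import sys

ARTIFACT = "release-1.2.3.tar.gz"   # hypothetical downloaded artifact
EXPECTED_SHA256 = "replace-with-the-checksum-the-project-publishes"

def sha256_of(path: str) -> str:
    """Hash the file in chunks so large artifacts don't blow up memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

if sha256_of(ARTIFACT) != EXPECTED_SHA256:
    sys.exit(f"checksum mismatch for {ARTIFACT}: refusing to install")
print("artifact verified; safe to proceed with the build")
```

Signature verification against the project's signing key is the stronger version of the same idea; a bare checksum only helps if you got it from somewhere other than the file you're checking.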
I can’t blame the devs, since they were never taught better. I blame the C-suites who canned the pricey, experienced nerds who were also raising their successors properly.
Now we get to re-learn all of that, at great pain, and hope to regain some of what we had before the next board of defectors guts another carefully rebuilt culture of adequacy.
Coding-wise, I’d hazard that younger generations are on par with or better than my generation. But “jack of all trades” is probably more our wheelhouse.