Q: From the readings and from your experience, is the technology industry a meritocracy (what does that actually mean)? If it is, then is that a good thing or a bad thing? If it is not, should it try to be?
A: The number of times I’ve heard the phrase, “but I’ve worked so hard to get here!”, has probably desensitized me to its impact. It’s not that the sentiment isn’t genuine, nor is it said with malicious or sarcastic intent. It’s usually quite the opposite, filled with compassion and authenticity, but that’s also exactly where the problem with considering the technology industry a meritocracy lies.
In the article “The capital of meritocracy is Silicon Valley, not Wall Street,” Timothy B. Lee grounds his opinion that the technology industry is indeed a meritocracy in a distinction between legitimate and fake criteria of merit. Lee contrasts the Ivy League’s and Wall Street’s reliance on SAT scores and the status of the Ivy League itself with Silicon Valley’s supposedly pure criterion: the ability to make great software. But I think naively treating software engineering ability as a pure criterion is dangerous. Like the introductory statement above, it rests on the same fallacy: it ignores that much of what exists and is possible today stands on a foundation built by many people in the past, and that a lot of success is circumstantial and comes down to luck.
A solid foundation sets up the possibility of hard work paying off toward whatever goal someone wants to achieve. I don’t think many people in this class have had exposure to others’ experiences of working hard, day and night, and never seeing their work come to fruition. If one is never exposed to that kind of experience, the view of the industry as a meritocracy becomes even more entrenched. For example, a person of high economic status will have more potential opportunities, more chances to meet the right people, and more time to develop the necessary skills than a kid of low economic status whose primary concerns center on surviving rather than living. I think when one is first exposed to technology matters too: whether it began in college (if college is even presented as an affordable opportunity), in a high school that offered an AP Computer Science class, or at a young age in a family that could afford a computer to experiment with and grow familiar with. A journey crossed on concrete bridges certainly seems more likely to stay stable than one crossed on an old, rickety wooden one.
In the article “Why hiring the ‘best’ people produces the least creative results,” Scott E. Page argues that there is no single metric for measuring a developer’s skill, and that trying to construct one is an impossible task. That rings true, because the needs being met are rarely going to be measured by a single criterion anyway. It’s also true that breakthroughs, which are what the industry needs and wants, don’t come from staying in one mindset, background, or frame of thought. They come from different backgrounds, not from measuring people against one standard.
Hard work does indeed matter; it can be the difference in skill among developers in an industry with ever-evolving challenges and changes. So I’m not dismissing the fact that hard work is needed to dive in, understand these topics, and enter the workforce. But I’m also not going to dismiss the fact that there are more barriers to entering it in the first place than one might think.