The most common question I get is "how will my job change with AI?" I wrote this post to help. Think of it as a handbook for what to expect as AI reshapes roles across tech!
In cybersecurity, AI now dominates deviation monitoring, detecting and triaging unexpected events orders of magnitude faster than manual processes. Security tickets are generated, analyzed, resolved, or escalated more rapidly than even highly skilled Tier 1 or 2 teams could achieve. My concern is: How will future Tier 3 personnel acquire the hands-on experience required to master their roles if AI automates foundational tasks?
Yeah this is exactly what I'm thinking of when I noted we need ways to keep incentivizing junior employee hiring. It's going to be a problem!
This formula stood out for me: value = creativity, strategic thinking, ethics, and problem-solving. Workforce optimization and repurposing are tricky, especially at scale. Suppose, as a society, we have achieved Universal Not-so-basic Income; as a leader, I can then do the same "meaningful" work with 10% of the people. Should I? I was told "no" in pretty direct ways.
Nice write up Nate! Thanks for sharing, this is an important topic 🙏
Glad you liked it! It's better to just talk about it than to avoid it lol
Someone once observed that Western cultures have traditionally valued specialization over generalization, as evidenced by the saying "Jack of all trades, master of none," while the Japanese have a saying: "I have many knives, and they are all sharp." Perhaps AI can be thought of as the universal knife sharpener, and we must now confront the challenge of learning to use many different knives. I like your analysis and guidance. Very timely too!
I have a boutique agency focused on creative advertising and finishing for the entertainment industry. I can see, in a relatively short period of time, not needing coordinators to manage workflow. With the introduction of "operators," it's almost possible today. I've been in this space for the last 25 years and have worked at really small agencies (5 people or fewer) and big ones with 300+ employees. I've always seen the value in being small and nimble as long as you remain effective. After COVID I was forced into taking my ideas and making them a reality, and with the proliferation of AI over the last 3 years my dreams are all coming to fruition. Scary and exciting, to say the least, but something I am prepared for.
AGI is a thought error: if you could capture all human probability in a function, you would violate quantum mechanics, and bad things would happen. And if it could answer every possible human question, then from the AI's perspective there would be no need for any human left alive.
In practice for all roles: "obsess over what you can ship."
Where do you see the role of the Engineering Manager going? Should one start bleeding into product more and more?
Visible deliveries with measurable impact on end users are good to focus on.