
In the Age of the Authority Big Bang, How Will AI Change Trust and Authority? [AI Ready] #4

The Asker
October 3, 2024


Questions organizations should ask themselves in the age of AI

 

Question 2.
Does my company earn trust and authority through transparent data?

1.
The end of the age of authority by authority

Invisible, unproven authority is hard to accept

 

One of the topics I've been asked to speak about a lot lately is how to design your company's culture and employer branding for Gen Z. Five years ago, there was a “kids these days” backlash inside companies because this generation behaved and thought so differently from older ones, but now that Gen Z makes up much of the workforce, its presence is impossible to ignore.

As one of “these kids” myself, I speak to corporate managers and leadership about what's changing. I believe the concept of authority in organizations is changing; more specifically, the way authority is earned is being transformed.

The dictionary defines authority as “a socially recognized and influential standing in a field.” In other words, it is social influence that comes from the respect of others.

In the past, the main source of authority in an organization was the position, rank, or seniority you were given. If someone was even slightly above you, they automatically carried authority: you assumed they had been loyal to the company for a long time and had built up internal and external networks and experience.

But now we're seeing a culture where that kind of authority alone is no longer enough to have real influence. Generational shifts in values are one factor, but an even bigger one is the advent of AI technology, which empowers inexperienced practitioners to solve problems on their own and leaves them less inclined to seek out a manager's approval or know-how.

I often refer to this shift as The End of Authority by Authority. Instead of resting on superficial authority, you have to keep proving that you have the right skills in order to hold real authority. For AI talent, this turns the concept of trust and authority upside down.

2.
Traditional trust and authority vs. AI-era trust and authority
“I'm sure the team leader has a reason.” That has become a hard thing to say.

How were trust and authority earned in the workplace environment we're used to?

For the most part, they were based on pre-established roles and responsibilities (R&R), with authority assigned according to position. Trust was built through interactions, intuition, and emotional connections between people in the course of work, and it was common to say, “I'm sure your team leader has a good reason for that,” even when a team made internal decisions you didn't understand.

However, this approach left a lot of room for personal bias and cognitive error. It was also common for authority to be granted simply on the basis of age or title, regardless of actual performance or ability, which sometimes undermined the organization's effectiveness and often left younger talent frustrated.

But in the age of AI, this may no longer work. As Sanofi CEO Paul Hudson noted in the story from Part 1, talent in the AI era is saying, “You don't know AI,” and arguing that younger generations have better insight when it comes to AI technology.

As a result, they want to report directly to their bosses rather than having to go through someone else for approval, which is why Sanofi, unlike in the past, has inverted its organizational pyramid and brought second- to fourth-year employees into its directors' meetings.

What's different about working in an AI-enabled company?

1. AI technology enables direct decision-making

-Workers will make decisions directly, without going through middle managers, shifting the traditional vertical decision-making structure toward a horizontal one.

-For instance, marketers will be able to use AI analytics tools to analyze the effectiveness of their campaigns in real time, and modify their strategies on the fly based on the results.

-Things that previously required multiple layers of reporting and approvals can be done more quickly and efficiently.

 

2. Data-driven performance with clear accountability

-In the age of AI, the data on who did what is all there, fostering a culture of accountability.

-Every decision and workflow can be recorded and tracked with data, making performance and accountability clear.

 

3. Transparent, data-driven feedback

-Being able to provide a clear rationale with data builds trust. Subjective feedback like “this is how it was when I did it a few years ago” will not be convincing.

-Instead, objective and specific feedback like “Based on this data, I think this will work better” will become more important.

-This will require a culture of evaluation and feedback based on real data and outcomes, not simply on seniority or experience.

To summarize, AI technology will enable employees to make decisions quickly and directly without waiting on their leaders, and because there will be clear data on who did what, only transparent performance and feedback will be trusted.

In a survey this past May, 2,827 Gen Z respondents voted the “performance stealer,” the colleague who takes credit for others' work, as their worst enemy, beating out bosses who gossip and boss people around. That will be harder to get away with in an AI-powered workplace, because actual performance and contributions will be clearly visible in the data.

3.
Are our organizations and leaders
ready to gracefully embrace this shift?

Industries that are adopting AI quickly and without much resistance include finance, academia, and law, all of which rely on processing and analyzing massive amounts of data. Because of the overwhelming efficiency and accuracy AI brings to the table, these fields have embraced change rapidly despite some concerns.

For example, JP Morgan has developed an AI-powered contract analysis tool called COiN, which uses machine learning algorithms to fast-track the review of thousands of pages of contracts. The company says that work which would take more than 360,000 hours of manual review of legal and financial contracts can be completed in seconds, with a significantly reduced chance of error. An article by AI solutions developer Imaginovation highlights a McKinsey & Co. study that found nearly a quarter of lawyers' work could be automated with AI technology.

The adoption of these AI tools is bound to reduce human input and revolutionize the speed of decision-making. Except for highly strategic, judgment-heavy decisions, the time once spent waiting for the boss's red pen will be better spent on a machine-learning review. And because everything will be data-driven, decisions backed by more transparent evidence will carry more weight. In the end, machines and data will earn the trust, and power will be rearranged accordingly.

On the other hand, not everyone is welcoming this change. In some cases, the very people who could be replaced by the technology fiercely oppose it in order to protect their jobs.

Here's a story I heard from a company that tried to use AI in its game development process and ran into resistance from its existing experts. Even though the results showed that AI could deliver features that had relied on specialized human expertise, the project was ultimately abandoned in the face of strong opposition.

This case illustrates that the adoption of AI is a complex process that involves organizational culture and human psychology, not just technology. In particular, senior members who are used to the old ways of doing things are likely to resist the change, as their years of experience and status are important sources of trust and authority.

So even when AI can objectively deliver better outcomes in cost and performance, how to overcome employees' psychological resistance has become an essential question.

Big data expert Dr. Gil Young Song's book, <Era Prediction: The Age of the Hyper Individual>, refers to this phenomenon as the “authority big bang,” or the “liquefaction of authority,” and states:

“As generative AI learns and adapts faster, new members of the workforce will come to see their superiors' 'I know because I've been there, done that' statements as noise and reject them.” - Gil Young Song, <Era Prediction: The Age of the Hyper Individual>

Such AI talent will struggle in an environment that runs on traditional authority, and they will find it difficult to see such a workplace as a place where they can freely propose innovations and fully use their capabilities.

So, is your company ready for this future? I suggest it's time to start asking questions and preparing for an environment where trust and empowerment are based on data rather than hierarchy or intuition, and where everyone can embrace that shift naturally.

Wrapping up

Organizations in the AI era need to execute innovation efficiently and to earn trust and authority through transparent data. This means more than simply adopting AI technology; it means changing the organization's decision-making structure, work processes, and culture.

These changes won't be easy, but they're necessary to become a workplace of choice for AI talent. It's time for your team to start asking itself, “What kind of culture does our company need to build to become an employer of choice for AI talent, and what are the roadblocks along the way?”

Over the course of this four-part series, we hope you've been able to imagine what you want your workplace to look like in the AI era and to see that picture a little more clearly.

In the next edition, we'll start talking about recruiting and employer branding, one of our specialties at nutilde. We'll also add some of our own flavor through our brand, The Asker, and create more interactive ways to communicate with you.

Stay tuned for more from The Asker :)

🔎 The Asker's Lens

Guiding Questions to Move from Insight to Action

1. In what ways are members currently trusted and empowered within our company? What changes do we need to make in the age of AI to ensure that decisions are not based on human subjectivity and superficial opinions?

2. Even if we are not adopting AI technology aggressively, what more needs to be done to embed and mature a data-driven decision-making culture? What challenges do we expect along the way?

3. Where might this conflict with existing hierarchies or experience-based authority? How can we address the resistance, if any, that we anticipate?

4. What are the biggest questions for me and my organization after reading parts 1 through 4? How do we prepare for the age of AI, both as an organization and as individuals?

 

Written by: The Asker

Writer: Dana Jeong | CEO of nutilde


Writer

Dana Jeong | CEO of nutilde
Founder and CEO of nutilde, a high-performance team-building partner for innovative organizations. Obsessed with organizations, she believes the best product is an outstanding team. Currently based in Bangkok, she leads the nutilde team in Seoul and publishes The Asker newsletter.