AI in the workplace

Across workplaces globally, younger workers are far more likely than older colleagues to adopt AI tools — including generative AI, which can help with tasks like writing, summarising, brainstorming, and data review.

A National Bureau of Economic Research study found that “workplace usage declined with age, from about 34 percent for workers under 40 to 17 percent for those 50 and older.” Meanwhile, Salesforce’s global research on generative AI states that “65 percent of generative AI users are Millennials or Gen Z, and 68 percent of non-users are Gen X or Baby Boomers.” This research also found that younger users are not just more likely to adopt AI; they are also more likely to trust it and integrate it into work tasks.

This gap reflects more than just familiarity with new technology. The research suggests that younger generations are not only experimenting with AI tools more frequently but are also developing greater confidence in how these tools can support everyday tasks. For example, the Salesforce study found that around 70% of Gen Z respondents report using generative AI, and over half (52%) say they trust the technology to help them make informed decisions. By contrast, non-users tend to be older and far less familiar with the technology. Salesforce found that 88% of people who have not used generative AI say they are unclear about how it will affect their lives, while 40% say they simply are not familiar enough with the technology to try it.

These trends are consistent with UK-specific data, which shows that the 18–24 age group has dramatically higher generative AI adoption than older demographics. A deeper dive into UK adoption patterns in the Generative AI Adoption report shows almost universal uptake among younger adults and much lower use among older generations.

AI and Nonprofits

Sector-wide nonprofit statistics echo the challenges found in charities: high interest, low readiness. Across the nonprofit sector, research consistently shows that organisations recognise the potential of AI but lack the structures, skills, and policies needed to adopt it effectively. A 2025 report from TechSoup and Tapp Network found that 85% of nonprofits express strong interest in using AI technologies such as generative AI and predictive analytics. However, most organisations are still in the early stages of adoption and lack the internal capacity to implement AI strategically.

The 2026 AI marketing and fundraising statistics for nonprofits, compiled by Nonprofit Tech for Good, show that:

  • 70 percent of nonprofits believe AI could reduce workload or improve communications.

  • 60 percent say they lack in-house expertise to assess tools.

  • Only 4 percent have AI-specific training budgets.

  • 40 percent say no one in their organisation has AI education or training.

This mirrors the workplace generational divide: interest is strong, but confidence, especially among those who did not grow up with AI-native tools, tends to lag. Research across the nonprofit sector consistently confirms this gap between enthusiasm and readiness.

Training gaps reinforce the divide, and as a result, AI experimentation frequently happens informally at the individual level rather than through structured learning or organisational strategy. In practice, this creates a dynamic that many nonprofit teams recognise: younger or digitally native staff are often the first to experiment with AI tools, while more experienced leaders remain responsible for evaluating risks such as data privacy, bias, and donor perception. This can slow adoption even when interest is high. Indeed, 70% of nonprofit professionals report concerns about data privacy and security when using AI, alongside worries about accuracy and bias.

Why Age Differences Matter in Charities

The generational divide isn’t just an academic point; it has real implications for UK charities and nonprofits. If younger staff independently adopt tools without governance or training, this can create risks around GDPR, data protection, and safeguarding. Organisations also risk having AI knowledge concentrated in a small group of staff rather than spread across the whole team, creating unnecessary skill silos. Older staff may be discouraged from engaging with new AI adoption ideas, which can limit effective collaboration.

Practical Steps you can take to ‘bridge the gap’

  1. Build a shared learning environment by running workshops where younger and older staff teach each other. A good starting point could be an open conversation about AI tool use versus ethical judgment.

  2. Develop collaborative guidance on responsible AI use aligned with UK law (such as GDPR).

  3. Build AI adoption into digital strategies that reflect your mission, risks, and policies.

  4. Track who is using AI and how they are using it, and identify what training or support might be needed.

Bridging the Generational Gap in AI Adoption in UK Nonprofits

The evidence is clear: younger workers are at the forefront of generative AI adoption, and UK charities are not immune to this trend. As in many other sectors, early experimentation with tools such as generative AI often begins with digitally confident staff who feel comfortable testing new technologies and integrating them into everyday workflows. In many nonprofit teams, this means that younger employees are frequently the first to explore how AI can support tasks such as drafting communications, analysing data, or generating ideas for campaigns and fundraising initiatives.

However, this pattern should not be interpreted as a simple generational divide between “AI users” and “non-users.” In practice, nonprofit organisations rely on a wide range of skills and experience. Senior staff often bring deep knowledge of programme delivery, governance, and ethical responsibility — areas that are essential when deciding how AI should and should not be used. Their caution around new technologies can reflect legitimate concerns about issues such as data protection, misinformation, bias, and the safeguarding of beneficiary and donor information.

For UK nonprofits, the challenge is not simply increasing AI adoption, but ensuring that adoption happens responsibly and inclusively. Rather than expecting every employee to become an expert in a rapidly evolving set of tools, organisations may benefit more from creating supportive environments where staff can learn gradually and experiment safely. This includes providing clear guidance on appropriate AI use, offering accessible training opportunities, and encouraging open conversations about both the opportunities and risks of AI.

Importantly, bridging generational differences in AI confidence can also become a source of organisational strength. Younger staff members may contribute curiosity, experimentation, and familiarity with emerging tools, while more experienced colleagues provide strategic oversight, ethical awareness, and institutional knowledge. When these perspectives are combined, organisations are better positioned to adopt AI in ways that align with their values and mission.

Ultimately, the goal for UK nonprofits is not technological adoption for its own sake. Instead, it is about harnessing AI thoughtfully to enhance mission delivery — whether by reducing administrative workload, improving communication with supporters, or helping teams focus more time and energy on the communities they serve. By fostering a culture of shared learning and responsible innovation, charities can ensure that staff across all generations feel supported, safe, and confident in using AI in ways that strengthen their impact.

Your Feedback Matters

What did you think of this text? Take 30 seconds to share your feedback and help us create meaningful content for civil society!


Disclaimers

This resource has been created as part of the AI for Social Change project within TechSoup's Digital Activism Program, with support from Google.org.

AI tools are evolving rapidly, and while we do our best to ensure the validity of the content we provide, sometimes some elements may no longer be up to date. If you notice that a piece of information is outdated, please let us know at content@techsoup.org.

"Bridging the Generational Divide in AI Adoption", by Joel Hogan, 2026, for Hive Mind is licensed under CC BY 4.0.