
GENERATIVE AI: WHERE TO START A ROADMAP


Audience: Senior management, business users, operations leads, project managers.


In today’s rapidly evolving technological landscape, understanding where your organisation stands in terms of Generative AI readiness is crucial. This article provides a clear and practical framework to help you assess your current capabilities and chart a strategic path forward. By leveraging a Maturity Model, you can identify key areas for improvement and take actionable steps to integrate AI effectively, ensuring your organisation remains competitive and innovative in the face of constant change.


Before you create a roadmap

Because the evolution of Generative AI is moving so rapidly, most organisations do not know where to start with tackling AI or the potential transformation it offers. Since even the AI industry is not sure where it is going, what we need for our roadmap is to:


  1. establish where you are now, and

  2. pick a direction to go in.


Common models for helping establish 'where we are' work through a combination of assessment and benchmarking against a known starting point (or level). For that, we can use a Maturity Model.


[Image: numbered stacked wooden blocks with figures climbing up. Caption: how do you build your model?]

Where we are going is a bit harder, but that's what establishing your model is all about.


Assessing yourself with a Maturity Model

Defining where you are starting is the first step of a journey. Leveraging a well-defined and credible Maturity Model can help you rapidly establish a framework to use in planning that journey. Creating a technology Maturity Model once took days of effort and refinement; establishing a credible baseline now takes minutes.


The following example Maturity Model definition was created in under a minute with a Generative AI service, using the initial prompt: “Define a Maturity Model for an organisation adopting Generative AI services. Use a classic maturity model segmentation to distinguish between different levels of readiness, capability and implementation, include items which can be readily used in a self-service checklist before embarking on rolling-out Generative AI capabilities within the organisation, or stepping up between levels. Present the model as a table suitable for senior management or leadership role.”


Table: simplified classic maturity model geared towards assessing Generative AI readiness (columns: Model level, Description, Key Characteristics, Checklist Items).

1. Initial

Description: Organisation is aware of Generative AI, but has no formal strategy or implementation.

Key Characteristics:

  • Limited understanding of Generative AI

  • No dedicated resources

  • Ad-hoc experimentation

Checklist Items:

  • Conduct basic Generative AI awareness training

  • Identify potential use cases

  • Assess current AI capabilities

2. Developing

Description: Organisation begins to explore Generative AI applications with some structure.

Key Characteristics:

  • Pilot projects initiated

  • Basic governance framework

  • Limited skills development

Checklist Items:

  • Establish an AI steering committee

  • Develop initial AI ethics guidelines

  • Conduct small-scale Generative AI pilots

  • Begin data readiness assessment

3. Defined

Description: Organisation has a clear strategy and is implementing Generative AI in specific areas.

Key Characteristics:

  • Formal AI strategy in place

  • Dedicated AI team

  • Defined processes for AI projects

Checklist Items:

  • Create comprehensive AI governance framework

  • Implement AI risk assessment procedures

  • Develop AI-specific data management policies

  • Establish partnerships with AI vendors/experts

4. Managed

Description: Organisation has integrated Generative AI into multiple functions with clear oversight.

Key Characteristics:

  • Cross-functional AI implementation

  • Robust monitoring and evaluation

  • Advanced AI skills across teams

Checklist Items:

  • Implement AI auditing and monitoring tools

  • Establish AI Center of Excellence

  • Develop AI-specific procurement guidelines

  • Create AI ethics review board

5. Optimized

Description: Organisation leverages Generative AI as a core capability, continuously improving and innovating.

Key Characteristics:

  • AI-driven culture in which all functions use AI systems to improve outcomes

  • Continuous improvement of AI systems

  • Leadership in AI adoption within the organisation's sector or industry

Checklist Items:

  • Implement AI innovation programs

  • Contribute to AI policy development

  • Establish cross-agency AI collaboration initiatives

  • Develop AI-specific performance metrics

While not exhaustive, this approach provides a framework for organisations to assess their current level of readiness for Generative AI and to identify the first steps needed to progress to higher levels of capability. The checklist items for each level offer concrete actions that organisations can take to advance their AI maturity.
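To show how the checklist could feed a self-service tool, here is a minimal sketch that encodes the table above as a plain data structure. The structure and function names (`MATURITY_MODEL`, `next_steps`) are hypothetical, invented purely for illustration, not part of any standard framework or product:

```python
# Illustrative sketch: the maturity model above encoded as a simple
# dictionary, so the checklist for any level can be looked up directly.
# All names here are hypothetical, chosen only for this example.
MATURITY_MODEL = {
    1: ("Initial", ["Conduct basic Generative AI awareness training",
                    "Identify potential use cases",
                    "Assess current AI capabilities"]),
    2: ("Developing", ["Establish an AI steering committee",
                       "Develop initial AI ethics guidelines",
                       "Conduct small-scale Generative AI pilots",
                       "Begin data readiness assessment"]),
    3: ("Defined", ["Create comprehensive AI governance framework",
                    "Implement AI risk assessment procedures",
                    "Develop AI-specific data management policies",
                    "Establish partnerships with AI vendors/experts"]),
    4: ("Managed", ["Implement AI auditing and monitoring tools",
                    "Establish AI Center of Excellence",
                    "Develop AI-specific procurement guidelines",
                    "Create AI ethics review board"]),
    5: ("Optimized", ["Implement AI innovation programs",
                      "Contribute to AI policy development",
                      "Establish cross-agency AI collaboration initiatives",
                      "Develop AI-specific performance metrics"]),
}

def next_steps(current_level: int) -> list[str]:
    """Return the checklist items associated with the given level."""
    _name, items = MATURITY_MODEL[current_level]
    return items

for item in next_steps(1):
    print("-", item)
```

Even a toy structure like this makes the point that the model is only a starting scaffold: the real work is replacing the generic items with actions specific to your organisation.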


As with any model or framework, organisations have to consider their specific context, resources, and regulatory environment, because AI is not magic. It is also important to note that progression through maturity levels should always be aligned with the organisation's overall strategic goals and public service objectives; otherwise there will be little or no reason for executives or employees to buy in to the investment required.


Obviously, the levels need fleshing out and the actions need more definition, but it's not bad for a 1-minute kick-start! It did take several minutes to formulate a usable prompt and get the context right - but not many.


How do AI Maturity Models differ?

While AI maturity models share some similarities with other technology frameworks, they are distinct in their focus on AI-specific capabilities, ethical considerations, and the need for rapid adaptation to keep pace with this fast-evolving field.


AI Maturity Models differ in several key ways:


  1. Focus on AI-specific capabilities: Unlike general technology or process maturity models, the AI Maturity Model specifically assesses an organisation's ability to develop, implement, and utilise artificial intelligence technologies. It evaluates AI-specific areas like data, analytics, intelligent automation, and AI governance.

  2. Rapid evolution: Generative AI is advancing far more quickly than other technology domains; as a result, AI maturity models need to be updated more frequently to keep pace with emerging capabilities and best practices.

  3. Emphasis on experimentation: Given the experimental nature of AI, these models place greater emphasis on the ability to conduct AI experiments, testing, and rapid prototyping than models for more established technology domains.

  4. Ethical and responsible AI: AI maturity models should incorporate elements focused on ethical use of AI, algorithmic bias, and responsible AI practices, which are not typically present in other technology maturity models.

  5. Integration of AI across the organisation: Advanced levels of AI maturity involve embedding AI capabilities throughout business processes and using AI to drive innovation, rather than just implementing isolated AI projects.

  6. Holistic approach: AI maturity models tend to take a more comprehensive view, assessing not just technical capabilities but also organisational culture, talent, governance, and strategic alignment around AI initiatives.

  7. Outcome-oriented evaluation: There is a growing emphasis on assessing AI maturity based on business outcomes and value creation, rather than just the presence of certain processes or technologies.

  8. Dynamic and flexible frameworks: With the diverse application of AI across industries, AI maturity models often need to be more adaptable to different organisational contexts compared to more standardised maturity models in other domains.


Where are you today?

Drilling down into the details of each level is the first step on the journey to adoption, but you start to diverge from the generic online responses by understanding what each level looks like for your organisation, i.e. which key characteristics identify your situation at Level 1 of the maturity model.


For example, most organisations starting out fit the following descriptions:


  1. Awareness and Interest: The organisation has a general understanding of Generative AI and its potential benefits. There is interest among employees and leadership, but no concrete steps have been taken to develop a strategy or implement AI solutions.

  2. Expertise: The organisation may lack in-house expertise in AI and machine learning. There might be a few individuals with some knowledge, but not enough to drive a comprehensive AI strategy.

  3. Ad-hoc Experimentation: Isolated instances of experimentation with Generative AI tools or technologies, often driven by individual employees or small teams. These efforts are not coordinated or aligned with broader organisational goals.

  4. Investment: No significant resources (budget, personnel, or time) are allocated to AI initiatives. Investment in AI is minimal and often exploratory in nature.

  5. Policies or Guidelines: There are no established policies, guidelines, or frameworks for the use of AI within the organisation. This includes a lack of ethical guidelines, data governance policies, and risk management strategies related to AI.

  6. R.O.I.: There is uncertainty about the return on investment (ROI) of AI initiatives, with a lack of clear business cases or success stories to justify a formal AI strategy.

  7. Data Sources: Data within the organisation is often siloed and not readily accessible for AI projects. There are often challenges related to data quality, integration, and availability.

  8. Culture (Barriers): A frequently missed barrier is cultural resistance to adopting new technologies, including AI. Employees may be wary due to concerns about job displacement or a lack of understanding of its benefits.

  9. Vendor support: The organisation will largely rely on external vendors or consultants for AI-related knowledge and services. This dependency can hinder the development of internal capabilities and strategy.

  10. Adoption Approach: The organisation reacts to the trends and developments rather than proactively planning and implementing AI initiatives. There is a lack of long-term vision and strategic planning for AI.


If you identify with 5 or more descriptions on this list, you are firmly at the beginning of the journey. Understanding the other (potential) areas identified above will help you build a plan of what to do.
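The "5 or more descriptions" rule of thumb can be expressed as a tiny self-assessment sketch. The shortened trait labels, the `at_initial_level` function, and the example answers below are all hypothetical, invented here for illustration only:

```python
# Illustrative self-assessment sketch for the Level 1 ("Initial") check.
# Trait labels are shortened from the ten descriptions in the article;
# the example answers are hypothetical, not real assessment data.
LEVEL1_CHARACTERISTICS = [
    "awareness and interest", "lack of expertise", "ad-hoc experimentation",
    "no significant investment", "no policies or guidelines",
    "uncertain ROI", "siloed data sources", "cultural barriers",
    "reliance on vendor support", "reactive adoption approach",
]

def at_initial_level(answers: dict[str, bool], threshold: int = 5) -> bool:
    """True if the organisation matches `threshold` or more Level 1 traits."""
    matches = sum(1 for trait in LEVEL1_CHARACTERISTICS if answers.get(trait))
    return matches >= threshold

# Example: an organisation that identifies with six of the ten descriptions
# is firmly at the beginning of the journey.
example = {trait: True for trait in LEVEL1_CHARACTERISTICS[:6]}
print(at_initial_level(example))  # True
```

The point of the sketch is simply that the assessment is a blunt count, not a score: it tells you which end of the model you are starting from, not how far along a level you are.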


What is the plan?

Once you know where you are today, you can move beyond it by prioritising and tackling key tasks. The characteristics of your organisation must change through the introduction of structure and intent. To move forward you need a plan of action. Following on from the example AI Maturity Model generated, you would need to consider:


  1. Structured Exploration: Take first steps with structured initiatives like pilot projects, designed to test the feasibility and potential benefits of AI in specific areas, operations or tasks.

  2. Initial Investment: A modest allocation of resources towards AI initiatives, including specific budget, personnel, and time. This investment is focused on pilot projects and initial explorations rather than full-scale implementations.

  3. Pilot Projects: Run one or more pilot projects to explore the use of Generative AI. Often limited in scope, they are used to gather insights and learnings that inform AI strategies.

  4. Build Internal Skills: While you may have some internal knowledge of AI, the organisation lacks comprehensive expertise. Employees involved in AI projects have basic or intermediate skills, and there is a need for further development and training.

  5. External Collaboration: There is potential to rely on external vendors, consultants, or partnerships to supplement internal capabilities. These external collaborators provide expertise and support for AI projects and the upskilling of specific business roles.

  6. Data Utilisation: There may be an effort to leverage existing data for AI projects. However, challenges related to data quality, integration, and accessibility can still exist. A focus on working towards improving data infrastructure as a whole is important.

  7. Policy Development: The organisation is beginning to develop policies and guidelines for AI use. This includes considerations for ethical AI, data governance, and risk management, although these policies will often still be in their infancy.

  8. Employee Engagement: Efforts to engage employees in AI initiatives may include workshops, training sessions, and communication about the potential benefits and opportunities of AI. However, a formal development program for staff may not exist.

  9. Monitoring and Evaluation: Actively monitoring and evaluating the outcomes of pilot projects is vital, and should include assessing the return on investment (ROI), identifying success stories, and learning from any challenges encountered.

  10. Strategic Vision: Internal recognition of the importance of AI in the organisation's long-term strategy. Leadership is beginning to articulate a vision for how AI can drive innovation and competitive advantage, even if a comprehensive strategy is not yet in place.


And after that?

The effort from this point forward is all in the execution and the defined outcomes for the organisation. Understanding the scope of what is required is specific to your situation. It is likely that you will already have at least some of the most prevalent AI tools and services, as Microsoft is embedding variations of Copilot into all its services and applications.


Establishing the detail around your organisation's structured exploration, pilot projects, governance frameworks and people development is more than just asking Gemini or Copilot to create one for you, but it's also not as worrying as today's media would have you believe. Don't be fooled though - there is a lot of learning ahead.


In other articles we will pick out Copilot for Microsoft 365 to illustrate areas like Structured Exploration from the Maturity Model, and the subsequent actions necessary for adoption. Because industries are only at the beginning of the changes introduced by AI, so are all the organisations assessing its potential use - hence the focus on leading with discovery to inform your actions.


Assessment

This article was developed using our internal governance framework and operational models for the author's (consultant's) assessment. Refinement of the supporting content used a combination of Perplexity.ai and Copilot for Microsoft 365 with the Spoke corpus.


Observation

Perplexity.ai provided a significant reduction in effort for creating an AI Maturity Model, while providing extensive references to authoritative sources, whereas Copilot for Microsoft 365, using a Work focus over the Spoke corpus, enabled significant speed in locating, summarising and evaluating elements of the plan from our historical content.


The baseline assessment of what the Initial maturity level looks like, and the development of the plan, are provided using our Senior Consultant's experience and expertise.


Compared to a similar requirement from Jan/Feb 2024, the net output in Aug 2024 is significantly more comprehensive and less prone to irrelevant/erroneous content being provided to the author. Impact on the effort for the article:


  1. Article: 2 hours' work, instead of 1+ days

  2. Image: 1/2 hour with Copilot Designer

  3. Summary quality and accuracy review: 1 experienced person, 1/2 hour.


Result: A practical (validated) outcome that's useful - based on known corpus and proven content.


Want to know what we know? Give us a call!

Looking for guidance in adopting Generative AI in a robust and useful manner? Interested in learning how to adopt Generative AI into day-to-day office productivity, or even just in picking up some of the tricks of the trade? Email us at hi@timewespoke.com


About the author: Jonathan Stuckey


