
Adopting and Driving AI Across an Organization by Orchestrating a Suitable Internal and External Ecosystem
Dr. Yves Gorat Stommel, Deputy Head of Function Evonik Digital, Evonik Digital


Digital transformation, an umbrella term covering the adoption of digital methodologies and technologies such as blockchain, the Internet of Things (IoT), Augmented and Virtual Reality (AR/VR), and Artificial Intelligence (AI), continues to gain relevance for an increasing number of companies. In the majority of cases, especially in the Business-to-Business (B2B) field, this adoption has not led to traditional non-tech organizations becoming fully or even partially dependent on offering digital products to their customers. Instead, these businesses continue to specialize in offering much the same products as before, but they rely on digital transformation enablers to do so efficiently, effectively, economically, and in line with customer expectations.
While some of these enablers are relevant only for parts of an organization, others apply more broadly. Take AI: making use of data to generate (actionable) insights, drive automation, and create new value can be beneficial across an organization. Procurement, research & development, production, marketing & sales, supply chain, finance, human resources, … all stand to gain from deploying AI. This broad applicability brings with it not just advantages but also challenges: AI demystification, AI skillsets, data governance, data infrastructure, AI tools, organizational alignment, and common processes for developing and deploying applications are just a few of the building blocks that need to be addressed. To manage the complexity of this endeavor, Evonik, a global specialty chemicals company, has implemented a strategic framework approach that brings together stakeholders across functions and divisions. Together they work on an increasingly seamless landscape for learning about, developing, implementing, and making use of AI for processes, products, and business models. Leveraging not just internal but also external know-how has been critical to making progress, as in many cases the wheel does not need to be reinvented. Help in many different shapes and sizes is out there for those who want it.
Accordingly, Evonik has established a broad AI ecosystem, one that needs to be continuously adapted as needs evolve, the technology develops, and the supplier landscape changes.
While the left-hand side of the AI ecosystem (shown in the figure) supports learning, exchange, and driving the internal strategic framework for enabling AI at Evonik, the right-hand side focuses on implementing actual applications to derive value.
Starting from the left, strategic partnerships with large (AI) technology companies allow long-term planning, joint strategic efforts, and learning. Continuously discussing and evaluating technological developments, and finding ways to cooperate on some of them within a trusting and stable relationship rather than through isolated ad-hoc projects, leads to mutual understanding and increasingly targeted results.
Peer exchanges focus mostly on conversations around technology, technology deployment, and organizational set-up between industry peers who often face similar challenges. By discussing, for example, how to make use of specific AI techniques, how to develop and organize internal resources, or what issues are encountered when operationalizing applications, peer exchanges support industry-specific learning and collaboration.
While peer exchanges take place within an industry, consortia make it possible to address applications, methods, and the development of joint standards within business ecosystems, and they are often supported by grants. Consortia comprise not just peers but also, for example, suppliers of hardware and software.
Key to all of the above ecosystem building blocks are the internal resources (data scientists, data engineers, domain experts, and others) who understand current and future benefits and challenges, collaborate with the ecosystem, translate learnings for the home organization, and drive their transfer into the company.
The same internal resources are key to building applications in-house and to sourcing external ones. While building applications in-house (often making use of open-source code) allows for customized solutions and independence from the supplier market, a few challenges become increasingly visible as the number of AI-enabled applications rises. For example, the rapid pace of AI research and development may require continuous updates to internally built applications. And because AI learns from data, the re-training of models has to be taken into account, as does the detection of model drift and bias. Furthermore, operationalizing an AI application and sufficiently covering cybersecurity requirements have proven challenging even for established software companies. As a result, alongside building custom applications in-house, an increasing focus will likely be placed on external providers that offer commercial solutions or develop applications.
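To illustrate what monitoring for model drift can look like in practice, the following minimal Python sketch compares the distribution of each input feature in recent production data against the original training data using a two-sample Kolmogorov-Smirnov test. It is a generic illustration only; the feature names, file names, and significance threshold are assumptions, not a description of Evonik's tooling.

import pandas as pd
from scipy.stats import ks_2samp

def detect_drift(train_df, live_df, features, alpha=0.01):
    # Flag a feature as drifted when the two-sample KS test rejects the
    # hypothesis that training and production values share a distribution.
    report = {}
    for col in features:
        _, p_value = ks_2samp(train_df[col].dropna(), live_df[col].dropna())
        report[col] = p_value < alpha
    return report

# Hypothetical usage: compare a training snapshot with the last 30 days of data.
# train = pd.read_csv("training_snapshot.csv")
# live = pd.read_csv("last_30_days.csv")
# print(detect_drift(train, live, ["temperature", "pressure", "flow_rate"]))

A drifted feature does not automatically mean the model is wrong, but it is a common trigger for the kind of re-training and re-validation effort described above.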
Where economically viable, making use of commercial suppliers of ready-to-go, secure, managed, and continuously updated applications can significantly reduce the effort required of internal resources. The expectation is that, as with regular software, more and more commercial solutions will become available. And as was the case with conventionally programmed software in the past, the mere fact that internal resources could in principle develop AI applications similar to those offered by commercial suppliers will likely no longer be sufficient justification for actually doing so.
For more explorative or niche applications, working with start-ups can be an attractive option, especially in cases with shifting goalposts that require a high degree of agility. Both Venture Capital (VC) investments and externally organized industry/start-up matchmaking formats are used for this purpose.
Finally, for prospective AI applications that directly support the core competencies and related Unique Selling Propositions (USPs) of an organization, investing in AI research can make economic sense, always keeping in mind that the resulting applications will most likely provide an advantage over the competition for only a few months. As most traditional non-tech companies do not have the means (e.g., data, tooling, manpower, and skillsets) to carry out AI research in-house, academic partnerships can be suitable vehicles. These partnerships also offer learning opportunities for current AI personnel, as well as recruiting options for future AI resources.
The described AI ecosystem of Evonik does not by itself guarantee success. But neither does the ‘go-it-alone’ approach. With a technology seeing as much hype, investment, and development as AI, only a few companies have both the resources and the strategic necessity to keep everything in-house. Chances are that your organization is not part of that exclusive club. An AI ecosystem suited to your needs might be the way to go.