Purpose is a word that is often talked about. But what does it mean and how does it impact your business?

Purpose is not just about what you sell, but the value you bring; not just the products and services you produce, but your company ethos. It covers the impact of what you sell, how you make it, and the effect your company has on the wider community.

Purpose impacts everything from buying decisions to employee productivity. Purpose, therefore, drives not only what you do but how you do it.

A purpose gets to the why of the company. Why do your products or services exist? What value does your company bring? Beyond making a profit, what are you providing?

But why does this matter?

Customers are not just concerned with the performance of the products they buy, but with who is providing them. Interest in a company’s carbon footprint, the working conditions of its employees, and even the charities it donates to all influence a customer’s decision to buy.

What you stand for affects whether people buy from you. So you need to ensure you can demonstrate the value you bring: what your “purpose” is.

Your brand purpose is also important for your employees.

It helps build strategy, focus goals and guide decision making. It needs to be more than a PR statement: an ethos that can help drive the company, something an employee can look to and ask, does the work I am doing contribute to our purpose?

It promotes an aligned and connected organisation. Purpose enhances performance and creates value.

Brand purpose is an important keystone for any business. Brand purpose should influence everything from your strategy to messaging to recruitment.

Your company may already have a brand purpose. If you do, it’s good to revisit it and see if it still reflects the ethos and goals of your business.

There are key questions about your business that your purpose should answer.

Questions to ask when looking at brand purpose:

  • What does our business help people achieve?
  • How does our business impact the societies we operate in?
  • What values do we uphold while building our products and services?

Customers are becoming more concerned about what a brand stands for. More likely to check whether your business is doing what you say it does. Performative actions without substance (such as greenwashing) will not go down well with potential customers. Purpose can be aspirational, but it must be honest.

Examples from major companies:

  • Coca-Cola. Our purpose: Refresh the world. Make a difference.
  • HP. We are a technology company born of the belief that companies should do more than just make a profit. They should make the world a better place. Our efforts in climate action, human rights and digital equity prove that we are doing everything in our power to make it so.
  • Nike. Our mission: Bring inspiration and innovation to every athlete* in the world. (*If you have a body, you are an athlete.)

Purpose goes beyond a slogan.

Purpose should tell whoever encounters it what your company stands for and what it provides.

Purpose is a factor in a business’s reach and awareness within a market. It can determine whether a customer buys from you or not. It is a driver for success.

Act with purpose.

 


Neurodivergent people have many talents that can add value to the IT industry. Autistic people can be strong logical thinkers, highly focused, detail-oriented, reliable and loyal. People with ADHD tend to have high energy and strong imagination. Dyslexic people can bring out-of-the-box thinking and pattern recognition. Dyspraxic people tend to excel at strategic thinking and problem solving, and are highly motivated.

There is no question that neurodivergent talent can add value in a variety of technical and business roles in the IT industry. So, we don’t need to ask, “What can neurodivergent people do for the IT industry?” The answer, unequivocally, is a lot.

We need to change the perspective. We need to ask ourselves what we can do better to attract and retain these talents in IT buyer and supplier organisations.

In a previous blog, I talked about how cities should think of how to become autism friendly, including through the intelligent application of technology. In this piece, I’m reflecting on how the technology industry itself can make the workplace more autism friendly.

Making the IT Industry Autism Friendly

According to Digital Scotland, 10% of Scottish people are neurodivergent, but many of them are unemployed. In fact, the UK Office for National Statistics’ research shows that just 22% of autistic people are in any kind of employment, but many more are eager to work. That’s a lot of wasted talent for the IT industry at a time when there’s a dire shortage of talent.

According to our surveys, around 74% of European organisations find it difficult or very difficult to hire for technology roles, whether in the line of business or in IT. Just as importantly, that’s a lot of unrealised self-fulfilment and happiness for autistic people.

The good news is that the IT industry has started to pay attention. On the technology buyer side, the Israeli army recruited autistic soldiers for a highly specialised visual intelligence unit. On the supplier side, IBM established the ND@IBM BRG (Business Resource Group), which includes neurodivergent employees and allies in IBM offices across the globe.

SAP, Microsoft, DXC and EY have invested to raise awareness both among their employees through internal webinars and training, and for the overall industry by sponsoring Autism at Work Summits. There are even companies that make neurodivergent talent their core asset, such as Auticon, which provides quality assurance, testing, data science and cybersecurity services with a delivery workforce that includes around 400 autistic consultants in its 20 offices.

There’s a long way to go, but these examples show that a different perspective on autism at work is possible for the IT industry. Companies embracing this new perspective need to consider that matching the skillset of neurodivergent people with the right projects and activities, and raising awareness, are only the first steps in the process.

Success comes from changing recruiting and hiring processes by finding alternatives to one-to-one interviews, which can be a barrier for people with gaps in their social skills. For example, companies could combine cognitive written tests with week-long workshops, where psychologists bring candidates together for group work and meals to evaluate their individual soft skills.

Workspaces need to be adapted. Just as an employee in a wheelchair may need a ramp, an autistic person may find a low-light, low-noise environment more conducive to concentrating. Psychologists need to be retained as job coaches to help prevent situations that cause anxiety, based on each individual’s profile, and to facilitate interaction with clients.

Dress codes need to be relaxed for autistic people who may be hypersensitive to touch and therefore can’t wear certain fibres. Neurotypical employees must be immersed in teams with neurodivergent people to learn how to interact.

A simple change of language from “I need this deliverable to be completed ASAP” to “I need this deliverable to be completed by tomorrow at 5pm” makes an immense difference for an autistic person. The former quite simply does not make sense, and just creates anxiety. The latter provides a clear deadline that an autistic person can meet.

Neurodivergent talent can bring a different perspective to help IT buyers and suppliers avoid bias when tackling business and technical problems in our fast-paced industry. All we need is a change in mindset to make the IT industry a good place to work for neurodivergent talent.

Massimiliano Claps - Research Director - IDC

Massimiliano (Max) Claps is the research director for the Worldwide National Government Platforms and Technologies research in IDC's Government Insights practice. In this role, Max provides research and advisory services to technology suppliers and national civilian government senior leaders in the US and globally. Specific areas of research include improving government digital experiences, data and data sharing, AI and automation, cloud-enabled system modernization, the future of government work, and data protection and digital sovereignty to drive social, economic, and environmental outcomes for agencies and the public.

AMD made major changes to its CPU and GPU portfolios in 2022. On August 29, AMD unveiled its Ryzen 7000 series of desktop processors powered by the Zen 4 architecture and the new socket AM5 platform. These processors have been available globally since September 27. The flagship 16-core AMD Ryzen 9 7950X processor is currently offered at a manufacturer’s suggested retail price (MSRP) of $699.

On November 3, AMD unveiled the Radeon RX 7900 series of graphics cards powered by the RDNA 3 architecture. The cards have been available on AMD’s website since December 13, 2022. Leading AMD board partners, including ASRock, ASUS, Biostar, Gigabyte, MSI, PowerColor, Sapphire, Vastarmor, XFX, and Yeston, have offered the cards since mid-December. The Radeon RX 7900 XTX has a starting price of $999, while the AMD Radeon RX 7900 XT carries an $899 price tag.

AMD offered samples of the Radeon RX 7900 XTX along with the Ryzen 9 7950X processor, a socket AM5 motherboard, and a DDR5 memory kit. The samples allowed IDC to test how well AMD’s flagship technology platform performed as a completely new system or upgrade over an existing system. IDC also gauged how well the new samples performed over prior generations in terms of performance per watt and computations per U.S. dollar.

What’s New with the AMD Ryzen 7000 Series Desktop CPUs?

The core compute die of the Ryzen 7000 series processor is built on TSMC’s 5nm process node. The input/output die — which now includes basic RDNA 2 graphics capabilities for 2D workloads or diagnostic/troubleshooting purposes — is based on TSMC’s 6nm process.

AMD made a significant change by moving from the previously long-lived AM4 socket platform to the new LGA AM5 socket. Since the pins have moved from the CPU to the socket, there is less risk of damage during installations or upgrades. The AM5 platform also supports dual-channel DDR5 memory at up to 5200 megatransfers per second (MT/s), in line with JEDEC standards.
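As a rough sanity check (an illustrative calculation, not an AMD-published figure), the theoretical peak bandwidth of dual-channel DDR5-5200 follows from the transfer rate and channel width:

```python
# Theoretical peak bandwidth of dual-channel DDR5-5200 (illustrative).
mt_per_s = 5200          # megatransfers per second, per channel
bits_per_transfer = 64   # DDR5 channel width in bits (2 x 32-bit subchannels)
channels = 2

# MB/s = (transfers/s) * (bits per transfer / 8 bits per byte), over both channels
bandwidth_mb_s = mt_per_s * bits_per_transfer / 8 * channels
print(f"Peak bandwidth: {bandwidth_mb_s / 1000:.1f} GB/s")  # 83.2 GB/s
```

Real-world throughput will be lower, but this is the ceiling the JEDEC-standard speed implies before EXPO overclocking profiles are applied.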

AMD has invested in a memory configuration standard it has branded EXPO. EXPO profiles are pre-configured settings, stored on the DIMMs, designed specifically for AMD memory controllers. These profiles simplify overclocking, enabling greater bandwidth and tighter timings for faster responses and lower latencies.

The new socket AM5 motherboards have up to 24 PCIe 5.0 lanes and are available with AMD X670E chipsets. In addition to offering the highest overclocking capabilities, the enthusiast-grade X670E and B650E motherboards support the PCIe 5.0 standard for graphics and storage components. However, PCIe 5.0 support is an optional feature on the mainstream B650 motherboards.

AMD has pledged to support AM5 motherboards through 2025 and at least two subsequent CPU generations. However, support may be provided even longer if the long history of the AM4 series is anything to go by.

What’s New with the AMD Radeon RX 7900 Series GPUs?

The AMD Radeon RX 7900 XTX and Radeon RX 7900 XT graphics cards are the first gaming cards to feature an advanced chiplet design. Like the Ryzen 7000 series, the AMD RDNA 3 generation architecture’s chiplet design combines 5nm and 6nm process nodes. The main graphics compute die (GCD) is built on TSMC’s 5nm process and provides the compute units for rasterization and ray tracing. AMD includes up to six TSMC 6nm memory cache dies (MCDs), each with 16MB of second-generation AMD Infinity Cache technology and a 64-bit wide memory interface. AMD invested in a novel interconnect to link the GCD and MCD chiplets together, enabling bandwidth of up to 5.3TB/s.

The AMD Radeon RX 7900 XTX has 96 compute units. Each has four texture units (for 384 in total), one ray accelerator (96 total), and two artificial intelligence (AI) units (192 total). According to AMD, the performance of the ray tracing units in RDNA 3 architecture is nearly double that of the corresponding units in RDNA 2, while the new AI instructions are nearly three times as many as those in the previous generation.

The AMD Radeon RX 7900 XTX features six MCDs that support 24GB of GDDR6 memory running at 20Gbps over a 384-bit bus. The Radeon RX 7900 XTX also supports USB-C, DisplayPort 2.1, and HDMI 2.1a connectivity with UHBR 13.5, allowing displays with high refresh rates to be connected (up to 480Hz refresh rates on 4K panels and 165Hz refresh rates on 8K panels). A dual media engine supports simultaneous encode or decode streams up to 8K60 for HEVC, as well as the new AV1 codec.
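The quoted memory speed and bus width imply the card’s peak GDDR6 bandwidth; a quick back-of-the-envelope calculation (derived from the figures above, not a spec quoted in this piece):

```python
# Peak GDDR6 bandwidth implied by the specs quoted above.
data_rate_gbps = 20    # Gbps per pin
bus_width_bits = 384   # memory bus width

# GB/s = (Gbps per pin * bus width in bits) / 8 bits per byte
bandwidth_gb_s = data_rate_gbps * bus_width_bits / 8
print(f"Peak memory bandwidth: {bandwidth_gb_s:.0f} GB/s")  # 960 GB/s
```

This raw GDDR6 figure sits alongside the much higher 5.3TB/s chiplet interconnect and the Infinity Cache, which together feed the compute units.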

The Test Platform

The test PC hardware components included the AMD Ryzen 9 7950X processor, the Radeon RX 7900 XTX graphics card, the Gigabyte X670E Aorus Master motherboard, and a G.SKILL Trident Z5 Neo 2x16GB DDR5-6000 EXPO memory kit. The Windows 11 main drive was a 1TB GIGABYTE AORUS NVMe Gen4 solid-state drive.

A be quiet! Silent Loop 2 280mm water cooler was installed for the processor, which was paired with a be quiet! STRAIGHT POWER 11 Platinum 850W power supply. A 34″ Dell Gaming S3422DWG monitor — a Quad-HD 3440×1440 display with a 144Hz refresh rate, FreeSync, 10-bit colors, and high dynamic range support — was also utilized.

The reviewer utilized the motherboard’s optimal default settings, set the memory profile to EXPO 6000, and made sure that smart access memory was enabled. No special tuning, optimization, or overclocking was carried out for the tests.

Synthetic Benchmarks and Productivity Performance

PCMark 10 is a comprehensive benchmarking tool that covers the wide variety of tasks performed in the modern workplace. Web browsing, videoconferencing, spreadsheet and word-processing workloads, photo and video editing, and rendering and visualization are some of the tasks tested by PCMark 10.

The gaming test focused on real-time graphics and physics engines of the platform hardware. The 9,186 score the platform achieved was better than 99% of all results produced by PCMark 10.

Blender Benchmark 3.4.0 was used to test the rendering performance of the graphics card. Thanks to HIP (the Heterogeneous-compute Interface for Portability), AMD’s compute language for GPUs that Blender Benchmark now uses in place of OpenCL, the Radeon RX 7900 XTX ranked in the top 10% of all benchmarks, delivering an excellent result.

IndigoBench, an OpenCL benchmark based on Indigo 4’s advanced rendering engine, was used to measure the performance of the Radeon RX 7900 XTX. The AMD Radeon RX 7900 XTX achieved 21.500 M samples/s in the bedroom rendering test and 49.599 M samples/s in the supercar rendering test. According to the published test results, the graphics card was 56% faster than the Radeon Pro W6800 and 51% faster than the 6900 XT.

3DMark Port Royal is a dedicated real-time ray tracing benchmark for gamers. The system’s score of 16,319 was better than 93% of all results and almost double the 8,784 score of the older AMD Ryzen 9 5900X and Radeon RX 6800 XT system.

The system’s 3DMark Time Spy Extreme 4K score of 14,242 was better than 95% of all results. The graphics score of 14,593 was 48% higher than what was achieved by the AMD Radeon RX 6800 XT on the same Ryzen 9 7950X platform.

Gaming Performance

Various old and new video games were tested on the platform, including next-gen versions.

Shadow of the Tomb Raider

This game ran at an average 111 frames per second (fps) at 1440p, registering a minimum of 79fps. The highest graphical settings, as well as AMD’s FidelityFX CAS package, were enabled.

Far Cry 6

Far Cry 6 ran at an average 140fps at 1440p, registering a minimum of 123fps. All DirectX Raytracing (DXR) and FidelityFX Super Resolution (FSR) features were enabled during testing.

Cyberpunk 2077

Cyberpunk 2077 ran at an average 67fps at 1440p, registering a minimum of 49fps. Ultra-ray tracing presets and FSR 2.1 features were automatically enabled.

The Witcher 3 Wild Hunt Next-Gen

The Witcher 3 Wild Hunt Next-Gen ran at an average 75fps at 1440p, registering a minimum of 60fps. Ultra-ray tracing presets and FSR 2.1 features were automatically enabled.

Fortnite

The latest Fortnite game runs on Unreal Engine 5.1 and makes use of several next-gen features (such as Nanite, Lumen, Virtual Shadow Maps, and Temporal Super Resolution). For example, the Lumen Global Illumination and Lumen Reflection features can exploit the Radeon GPU’s hardware-accelerated ray-tracing capabilities. The updated game ran at an average 82fps, with a 65fps minimum.

IDC Opinion and Conclusion

Ryzen 7950X Performance, Power Consumption, and Heat Considerations

With the Ryzen 7000 generation of desktop CPUs, AMD has taken a new approach to delivering maximum performance. Rather than limiting power, AMD allows its CPUs to consume as much power as needed within the socket and reach their thermal throttling temperature (for the Ryzen 9 7950X, this is 95°C). With a much higher CPU thermal design power of 170W, compared with 105W for the Ryzen 9 5950X, the 7950X reaches much higher single-thread and all-core boost clocks, increasing overall performance and speeding up challenging productivity tasks.

However, enhanced cooling is needed to maintain the performance of the processor (this means increased fan noise). The system and cooler’s fans will quickly ramp up to high speeds when the processor is executing a demanding workload.

AMD provides several tools for performance and efficiency customizations. For example, Precision Boost Overdrive (PBO) is a feature that has dramatic impacts on power consumption, heat, and noise as it enables users to set overall socket power. On the test system, PBO had a major impact on the power consumption of the Ryzen 9 7950X, with only minimal drawbacks in terms of performance.

For example, the Ryzen 9 7950X running at 105W delivered 93% of the performance and used just 46% of the power of a higher 230W system setting. When AMD’s suggested 65W “eco mode” setting was used, the test platform delivered 73% of the performance and used 38% of the power of the 230W setting. While this configuration flexibility is a welcome change, AMD should make it simple and easy for users to change PBO socket power profiles on the fly. AMD could work with Microsoft to enable this capability via the power management features of Windows 10 or 11.

Radeon RX 7900 XTX Considerations

On the GPU side, AMD has increased the GPU power draw to 355W. This is a 55W increase over the 300W draw of the Radeon RX 6900 XT and RX 6800 XT but only a 20W increase over the Radeon RX 6950 XT. Overclocking delivered small gains of around 200 points in 3DMARK tests and slightly boosted gaming fps, but at the expense of increased power consumption, heat, and fan noise. The RDNA 3-based Radeon RX 7900 XTX can hit frequencies of up to 3GHz, with additional power available on AIB partner cards that include a third 8-pin PCIe power connector and significantly enhanced cooling systems compared to AMD reference card designs.

As with the Ryzen 9 7950X, reducing the GPU power limit and undervolting the graphics card significantly reduces overall temperatures and noise for only a few percentage drops in gaming performance.

AMD Ryzen 7000 Series Pricing

AMD faced some criticism about the Ryzen 7000X series and AM5 platform at launch. Although the pricing of the CPUs was competitive, the initial AM5 motherboards based on the X670E chipset were limited in choice and very expensive. This was compounded by the high price of DDR5 RAM (in comparison to DDR4).

AMD has since lowered the price of the Ryzen 7000X series and introduced more cost-effective, lower-power non-X series CPUs (the 6-core Ryzen 5 7600, 8-core Ryzen 7 7700, and 12-core Ryzen 9 7900). AMD board partners have also introduced a wider range of AM5 motherboards based on the X670, B650E, and B650 chipsets. Volume production of DDR5 has also resulted in more cost-competitive memory options.

AMD Radeon 7900 XTX Pricing

The European street price of the Radeon 7900 XTX is around €1,130, which is higher than the American $999 MSRP listed on amd.com, due to fluctuating exchange rates and taxes (e.g., value-added tax). The cards from board partners are in some cases more expensive, chiefly due to their beefed-up coolers, uprated power systems, and the additional margins of the board manufacturers and their channel partners.

It does not make sense for AMD to lower its profit margins, especially on niche flagship products. AMD’s key competitor asks higher prices for cards in the same category. At the same time, customers may not actually benefit from lower prices in some instances due to intermediaries such as channel partners or scalpers.

It’s getting harder to find a bargain these days, but with an MSRP of $999, the RX 7900 XTX reference card is actually $100 cheaper than the previous-generation RX 6950 XT, despite having more compute units, more RAM, a wider RAM interface, and significantly better performance. This should come as good news for high-end gamers.

User Experience

There are a few user experience challenges with the AMD AM5 platform, some of which are created by partners. For example, updating the motherboard BIOS may be difficult for some users. In fact, updating the BIOS of the motherboard in the test system failed a few times; only after a bit of reading and research did the reviewer discover that the older BIOS first needed to be reset to its default state before an upgrade could be made. Fortunately, new standard features (e.g., BIOS flashback) simplify the recovery process when experiments with settings make a system unbootable and clearing CMOS memory does not work.

Another example: the reviewer found that when games such as Hell Let Loose, Fortnite, or Genshin Impact were launched, a sudden surge in frame rates could dramatically spike power draw and voltage. Fortunately, AMD has countered this issue with safety mechanisms that shut the device down when such surges are detected. After carrying out various tests and checking the health of system components, the reviewer found that these problems can be solved by setting a frame rate cap (e.g., 240fps) in Radeon Software.

Casual users who encounter such issues may feel displeased with the overall experience, particularly if they have to spend hours troubleshooting with systems integrators or manufacturers. They may attribute the problem to the AMD platform rather than to the game developer’s failure to cap frame rates.

AMD Advantage

The AMD Advantage framework combines Ryzen 7950X processors, Radeon RX 7900 XTX graphics cards, the AMD Software: Adrenalin Edition application, and smart technologies to deliver the best experiences to gamers and creators. The company works closely with prominent system integrators, including CSL, Cyberpower, eBuyer, Falcon Northwest, Maingear, Origin PC, and XIdax, to bring stable and enhanced experiences to gamers, streamers, and content creators. Customers who want to avoid the headache of troubleshooting issues during PC buildouts or upgrades will be pleased with AMD’s investments in additional testing, certification, and support.

Performance

AMD is innovating on multiple levels. The company is enhancing its core architecture for instructions-per-clock (IPC) gains, improving chiplet design for scalable manufacturing, making cache enhancements for memory-intensive workloads, and supporting process and design improvements for clock speed increases. As a result of these efforts, its new products are 30–50% faster than previous generations. The company is not holding back or milking the market with small, incremental gains. Competition is one of the biggest drivers of innovation, and the performance improvements of AMD CPUs and GPUs are benefiting the entire market.

The move to PCIe 5.0 and DDR5 will future-proof the new platform. The inclusion of technologies such as EXPO memory configuration profiles will make performance improvements simpler and more accessible to the wider PC market.

AMD is also investing in backend software development with ISVs. AMD is collaborating with Adobe, Blender, and OBS to improve hardware acceleration across CPUs and GPUs for content creation, productivity, and encoding, decoding, and streaming use cases, thereby adding productivity value to the gaming cards.

Final Words and Conclusion

According to IDC’s monitor tracker, 1440p is the fastest-growing screen resolution. In addition, 3% of users buy 4K and higher-resolution displays. As these users are the target customers for high-end hardware, AMD is on the right track.

However, AMD has increased the power draw of its CPUs and GPUs to deliver a high level of performance. Per IDC tests, power consumption can be dropped significantly in many cases while still maintaining over 90% of peak performance. In turn, this drop can lower noise and energy bills. AMD can provide users with the best of both worlds by making it simple to switch performance profiles depending on the particular workload.

In conclusion, AMD has certainly taken a big step forward with its new generation of CPUs and GPUs. AMD’s 2023 desktop platform is taking 1440p gaming mainstream and making 120–140Hz refresh rates the norm. The new platform also makes 4K gaming possible at 60fps with the highest graphical settings achievable at native resolutions or with FSR. For this and other reasons, including AMD’s well-earned reputation for remedying early issues and for its platforms aging very well, we have no reservations about recommending the platform.

Mohamed Hakam Hefny - Senior Program Manager - IDC

Mohamed Hefny leads market research in EMEA on professional workstation PCs and solutions. He also reports on professional computing semiconductors, processors, and accelerators (CPUs and GPUs), as well as breakthroughs and trends related to the market. In addition, Mohamed is actively involved in AI PC taxonomy and research. He participates in business development projects, contributes to consulting activities, and provides IDC customers with analysis, opinions, and advice.


Introduction

Organizations in all industries are struggling to attract talent. The shortage of potential employees is a problem that has plagued the IT sector for years but has possibly never been worse than it is now. In this blog IDC explores opportunities for dealing with this shortage.

Employee Benefits

The most obvious perspective to consider is that of salary. Benchmarking employee expenses will allow your organization to match your peers and stop losing employees over salary competition.

Another benefit of benchmarking salary costs is that it can settle internal tugs of war over budget increases: an independent benchmark report is often what convinces senior management that additional budget is required.

However, employees are not motivated by salary alone. For many, satisfaction also comes from working on cutting-edge technology, something that only some IT organizations allow an employee to do. In contrast, maintaining legacy systems at less competitive organizations may not appeal to IT professionals who love working with new technology. In a benchmark, the technologies maintained by the IT staff are closely examined and compared with peers. IDC identifies key areas for innovation in your business’s digital transformation, keeping IT staff engaged at the same time.

Optimize Your Current Environment

Another perspective to take is optimizing the existing situation. If finding new IT talent is challenging, IT management must consider ways to maximize the use of existing employees. With talent being as scarce as it is, management must be fully aware of possible optimizations.

A benchmark will show how teams are performing in terms of productivity and where potential exists.

Because IDC’s data collection methods dive deep into your IT administration and governance, the gaps that no doubt exist are discovered and reported on. The results of a benchmark will uncover where your automation is lacking and whether your end users are trained to market-conforming levels. All of these insights will allow you to deliver more and better IT with the resources you already have available.

Rationalizing and consolidating your IT environment has many benefits and generally offers an attractive business case. That said, possibly the most interesting result is simply reducing the amount of IT that needs to be managed by the talent that is so scarce. The size and complexity of the IT environment is a large factor in our benchmarks, be it the complexity of the networks, the size of the datacenter services, the setup of the end user workplace, or the amount of contract management and governance required. IDC reports on all of these components and shows the way to reducing unnecessary complexity and size.

Is Outsourcing the Way?

Finally, if the options of increasing budgets, optimizing teams, and reducing complexity are exhausted, outsourcing more of the IT services can be considered. Outsourcing can be a relatively quick answer to a suboptimal internal IT team, but it does not come without its share of challenges. The first step is deciding which IT domains are attractive candidates to place under a contract. In other words, an organization needs a sourcing strategy. This strategy will determine how each part of the IT organization should be sourced and what a fitting roadmap to get there should look like. Prioritization of rationalization projects is also considered, as well as the potential to supplement existing teams with external talent from an IT supplier.

Cost is, of course, an important factor in deciding which sourcing scenarios are feasible for the organization. IDC will provide so-called “landing zones”, in which the future costs of a sourcing scenario are modeled based on the current IT market. This is essentially a virtual benchmark of your IT organization as if parts of it were outsourced.

If IT is outsourced in some way, the existing organization should also change. External contract governance and service management capabilities need to evolve and a future organizational model needs to be constructed. When transforming the organization, one must also consider whether it is attractive to re-educate the existing teams into roles that are needed in the new organization.

The ongoing war for talent is challenging. This blog, however, has hopefully shown that the tools to navigate this challenge exist. IDC continues to help organizations daily, and to us, the current market offers new and exciting ways to help CIOs globally.

 

Generative AI is the buzzword of the day. More specifically, ChatGPT, the OpenAI model that is trained to interact in a “conversational way”.

The dialogue format enables ChatGPT to answer follow-up questions, admit its mistakes and challenge incorrect premises. Of course, like many geeks in the ICT industry and beyond, I have tried it.

It’s quite impressive. Well, besides the fact that it took me a couple of attempts to find a time of day when traffic was low enough not to cripple access. I asked a couple of questions about my passion, mountaineering and climbing.

The answers were correct, although a bit conservative. For example, when I asked which multipitch routes in Western Canada I could climb with my level of experience, the model provided only two options, both exactly in line with my multipitch skills. Instead, I would have appreciated a wider variety of options, some easier and some harder than my skill level, so that I could make a choice.

The model also told me to consult local guides for more information, which indicates that careful ethical principles, like personal safety, are embedded in the design of the algorithm. I then asked about who I should vote for in the upcoming primary to elect the new secretary of the Italian Democratic Party. The answer was that the model can’t express a political opinion, but that it could provide me with the list of candidates.

That’s fair enough, and further proof that ethics are taken into account. So, I asked for the list of candidates and their programmes. The answer was that the model is trained on historical data available until 2021, so it’s not up-to-date on events between 2022 and early 2023. This is understandable, but I would expect it to be quasi real time in the future.

Regardless, fascinating.

Embracing the Augmentative AI Vision

I’ve not done enough research (yet) to say how good the model is and for what use cases. Many of my IDC colleagues are developing thought-leadership research and collecting in-depth data into how generative AI will affect enterprise and consumers.

What I’m thinking about is the societal implications of generative AI. This was triggered yesterday during our first meeting with the 2023 IDC Government Xchange Advisory Board. Gwendolyn Carpenter, a member of the Advisory Board, who has kids in school, said she’d heard about students using it to cheat on their homework.

My colleague Matt Ledger has already written a quick take on this matter too. As Matt noted in his piece, there are a range of opinions on this, from schools that believe ChatGPT can be very valuable as a learning tool, to those that are uncertain about the impact and have temporarily banned usage, to educators that believe that generative AI could make redundant our ability to write, learn and eventually think. This, paradoxically, could hamper our ability to invent brilliant new tools such as ChatGPT itself.

I am no Luddite. I don’t think we should stop progress. But I think generative AI is a great case in point for the ongoing debate about whether we should design AI that can replace human abilities versus AI that can augment human abilities.

For example, I don’t want generative AI to replace my writing, just because it’s much faster and more elegant than I am at synthesising available knowledge. I’m having a lot of fun expressing my opinions in this blog because, in a way, I’m creating it while I write it!

But I would definitely like to have a tool that can critique my writing. A tool that could, for instance, highlight where my piece is biased or where I could consider additional sources of data and literature to enrich my perspective. Sort of a much smarter version of the spell checker that tells me if there’s a typo or if I didn’t use punctuation correctly or if I used too many passive forms. This augmentative AI tool would push my brain to think more, not less. And I’d still be able to make my own choices on whether to apply the advice or not.
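To make the idea concrete, here is a toy sketch of such a critique tool. The heuristics are deliberately crude stand-ins: a real augmentative assistant would use a proper NLP parser, but even a simple pass can surface points for the writer to reconsider, without rewriting anything on their behalf:

```python
import re

# Toy writing critic: flag likely passive constructions and overlong sentences.
# It only raises questions; the writer decides whether to act on them.

# Auxiliary verb followed by a word ending in -ed/-en, e.g. "was approved".
PASSIVE = re.compile(r"\b(?:is|are|was|were|been|being|be)\s+\w+(?:ed|en)\b",
                     re.IGNORECASE)

def critique(text, max_words=30):
    findings = []
    # Naive sentence split on terminal punctuation followed by whitespace.
    for sentence in re.split(r"(?<=[.!?])\s+", text.strip()):
        if not sentence:
            continue
        if PASSIVE.search(sentence):
            findings.append(("passive voice?", sentence))
        if len(sentence.split()) > max_words:
            findings.append(("long sentence", sentence))
    return findings

for label, sentence in critique("The budget was approved in a hurry. I fixed it."):
    print(f"{label} -> {sentence}")
```

The point of the design is in the return value: a list of questions attached to sentences, not a rewritten text. That is the augmentation-over-replacement distinction in miniature.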

Policymakers need to think about how they can shape the new norms to maximise the benefits and tackle the risks of AI. They could, for instance, recommend (or mandate) a machine-readable label that helps recognise whether a piece of content was generated by AI, for example in the case of government-regulated certifications. But regulation is not enough.
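As an illustration of what such a label might look like, here is a hypothetical sketch. The schema and field names are invented for this post and are not an existing standard (real work in this area includes C2PA-style content credentials):

```python
import hashlib

# Hypothetical machine-readable provenance label: the content is wrapped with
# metadata declaring AI origin, plus a digest that ties the label to the text.

def label_content(text, model_name, timestamp):
    """Wrap a piece of content with a provenance label and an integrity digest."""
    label = {
        "ai_generated": True,
        "model": model_name,
        "generated_at": timestamp,
        "sha256": hashlib.sha256(text.encode("utf-8")).hexdigest(),
    }
    return {"content": text, "provenance": label}

def is_ai_generated(wrapped):
    """A verifier-side check: does the label claim AI origin and match the text?"""
    label = wrapped.get("provenance", {})
    digest = hashlib.sha256(wrapped["content"].encode("utf-8")).hexdigest()
    return bool(label.get("ai_generated")) and label.get("sha256") == digest

doc = label_content("Draft certificate text.", "example-model", "2023-01-30T12:00:00Z")
print(is_ai_generated(doc))
```

The digest matters: without it, a label could simply be copied onto different content. A production scheme would go further and cryptographically sign the label, so that stripping or forging it is detectable.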

Without such norms, and an industry commitment that goes beyond regulation, AI will fail to meet the high expectations that it can be a positive force in the future. In fact, according to our Future Enterprise Resilience and Spending Survey (Wave 11, December 2022), only 25% of government executives worldwide think the promise of AI has completely lived up to their organisation’s expectations.

The future of generative AI (and the AI market in general) will depend on whether users and suppliers embrace the human augmentation narrative, in both the B2B and B2C worlds. We need to ask ourselves what kind of AI solutions we want — solutions that replace humans or augment humans. And then design and engineer them in a way that reflects that purpose.

I look forward to discussing more about the power of innovation, and how we can use it at scale to make a positive and ethical impact on society, at our Government Xchange.

Massimiliano Claps - Research Director - IDC

Massimiliano (Max) Claps is the research director for the Worldwide National Government Platforms and Technologies research in IDC's Government Insights practice. In this role, Max provides research and advisory services to technology suppliers and national civilian government senior leaders in the US and globally. Specific areas of research include improving government digital experiences, data and data sharing, AI and automation, cloud-enabled system modernization, the future of government work, and data protection and digital sovereignty to drive social, economic, and environmental outcomes for agencies and the public.

Customers’ raised expectations, government policies, a spike in fuel prices and technology innovation are converging to enable convenient, affordable, safe and environmentally sustainable mobility as a service (MaaS). MaaS solutions help connect the different phases of the door-to-door mobility experience, from planning to booking, payment, navigation and information queries, with seamless integrations across modes of transportation.

MaaS is not new, but it has been plagued by technical interoperability challenges and difficulty in finding the right business models that can push mobility ecosystem stakeholders — transit authorities, car OEMs, payment providers, transport network companies — to collaborate and share data.

Good Practices for MaaS Ecosystem Innovation

IDC research shows that MaaS is reaching an inflection point. Best practices are emerging among public transportation authorities and transportation operators to deliver on the promise of enabling customers to travel in a convenient way, when it suits them and at a reasonable cost.

At the same time, MaaS is enabling transport operators and planners to optimise the use of capital-intensive asset capacity, launch new revenue-generating services and encourage a modal shift to public modes of transport among citizens.

It all starts with the customer. User-centric MaaS apps enable travellers to build their unique mobility profile based on personal preferences, financial profile, physical characteristics and past behaviour. Service providers must recognise, serve and safeguard the individual preferences of each user to deliver truly personalised MaaS offerings.
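To make the profile idea concrete, it could be modelled as a small data structure like the sketch below. The field names and filtering logic are illustrative assumptions for this post, not any vendor's actual MaaS API:

```python
from dataclasses import dataclass, field

# Illustrative traveller mobility profile: preferences, a financial constraint,
# an accessibility requirement, and a trip history for personalisation.

@dataclass
class MobilityProfile:
    preferred_modes: list            # e.g. ["metro", "bike_share"]
    max_fare_eur: float              # budget ceiling for a single journey leg
    needs_step_free_access: bool     # physical accessibility requirement
    past_trips: list = field(default_factory=list)

    def acceptable(self, mode, fare_eur, step_free):
        """Check a candidate journey leg against the traveller's profile."""
        if mode not in self.preferred_modes:
            return False
        if fare_eur > self.max_fare_eur:
            return False
        if self.needs_step_free_access and not step_free:
            return False
        return True

profile = MobilityProfile(["metro", "bike_share"], max_fare_eur=5.0,
                          needs_step_free_access=True)
print(profile.acceptable("metro", 2.40, step_free=True))
```

A real service would layer learned preferences from `past_trips` on top of these hard constraints, but the principle holds: the profile, not the operator, drives which journey options are surfaced.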

Cities such as Genoa have deployed mobile-first user apps that provide a single point of access to information and services while on the move.

For travellers to book and pay for their journeys directly in the MaaS app, without needing to switch to a transport operator app, stakeholders must share data and define contractual models that benefit the whole ecosystem. In Spain, train operator Renfe has launched a door-to-door booking MaaS solution (the dōcō app) underpinned by a platform that enables actors across the mobility ecosystem to collaborate openly, from micromobility service providers, to ride-sharing apps, to technology manufacturers and payment system providers.

To enable rapid innovation and scale these MaaS data platforms to process, store, integrate and analyse vast swathes of data, transportation ecosystem companies such as Entur in Norway are moving away from monolithic, legacy systems to cloud-native solutions that enable data sharing at scale and agile innovation. 

Once data is aggregated and information is made accessible through platforms, transportation authorities can use it to build a mobility digital twin of the city that can help with traffic forecasting and simulation, traffic/city planning, infrastructure maintenance and asset management, and logistics resource planning. Data sharing can also support the development of new services and businesses. 

 

Further reading:

IDC PeerScape: Practices to Successfully Implement Mobility as a Service

Massimiliano Claps - Research Director - IDC


This is the second blog in IDC’s series focusing on the implications of the EU’s updated Security of Network and Information Systems directive, NIS2. The directive comes into force in January 2023, after which Member States have 21 months to transpose it into their national law – by October 2024.

The broad aim of NIS2 is to engender a high common level of cybersecurity in the EU, across all Member States, in the long term.

The first blog looked at the regional and national entities that are tasked with transposing and implementing the new directive, as well as some of the mechanisms that are being put into place to effect improved cybersecurity across the bloc.

This second instalment looks at which organizations NIS2 will apply to and what will be required of them.

Expanding the Reach

The first NIS directive introduced a clear focus on improving cybersecurity and risk management at critical infrastructure in Europe: energy (electricity, oil, and gas), transportation, drinking water supply and distribution, healthcare, banking and finance, and digital infrastructure (Internet Exchange Points, DNS service providers, and Top-Level Domain (TLD) name registries). These were defined as operators of essential services (OESs).

The volume and frequency of cyberattacks since the first directive came into force has driven home the message that cybersecurity safeguards and improvements need to be more far-reaching. Industry sectors that may not be viewed as critical may supply components or services to critical infrastructure, from electrical equipment to medical devices. Disruption of food production and distribution or waste management can have a major impact on the function of society. Digital providers such as search engines and online marketplaces are recognized for their universal value.

Consequently, the NIS2 directive extends coverage into all these segments and more. A full list of sectors defined as high criticality or critical is below:

High Criticality Sectors

  • Energy.
  • Transport.
  • Banking.
  • Financial market infrastructures.
  • Health.
  • Drinking water.
  • Waste water.
  • Digital infrastructure.
  • ICT service management (B2B).
  • Public administration.
  • Space.

Other Critical Sectors

  • Postal and courier services.
  • Waste management.
  • Manufacture, production and distribution of chemicals.
  • Food production, processing and distribution.
  • Manufacturing (medical devices, computer, electronic and optical products, electrical equipment, motor vehicles, transport equipment).
  • Digital providers (online marketplaces, search engines and social networks).
  • Research organisations.

Furthermore, it is recognized that it is not only large enterprises that represent a target for cybercriminals or are fundamental to critical services. Consequently, the NIS2 directive also extends its scope to cover medium-sized organizations: those with 50 or more employees or annual turnover of €10 million or more.

The To-Do List

So, if your organization falls within the sectors covered by NIS2, what requirements are coming your way in the next two years? There are two major aspects to this, detailed in Chapter 4 of the directive, “Cybersecurity risk-management measures and reporting obligations”.

Article 21 of the directive covers the cybersecurity risk management measures and lists the following 10 areas as the minimum recommendation:

  • Policies on risk analysis and information system security
  • Incident handling
  • Business continuity and crisis management
  • Supply chain security
  • Security in network and information systems acquisition, development and maintenance
  • Policies and procedures to assess the effectiveness of cybersecurity risk-management measures
  • Basic cyber hygiene practices and cybersecurity training
  • Policies and procedures regarding the use of cryptography and, where appropriate, encryption
  • HR security, access control policies and asset management
  • MFA, continuous authentication, and secure communications where appropriate

It is likely that most entities within critical infrastructure sectors will already have many of these technologies and measures in place, to some degree. The question will be in the level of detail or prescriptiveness that member states go to when transposing this article into their national legislation.
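As a back-of-the-envelope illustration (our own sketch, not part of the directive), a security team tracking its readiness against the ten Article 21 areas might keep a simple gap list:

```python
# Self-assessment sketch for the ten minimum Article 21 areas. The status
# values and scoring are our own illustration, not mandated by NIS2.

ARTICLE_21_AREAS = [
    "risk analysis and information system security policies",
    "incident handling",
    "business continuity and crisis management",
    "supply chain security",
    "secure acquisition, development and maintenance",
    "effectiveness assessment policies and procedures",
    "cyber hygiene practices and cybersecurity training",
    "cryptography and encryption policies",
    "HR security, access control and asset management",
    "MFA, continuous authentication and secure communications",
]

def gap_report(status_by_area):
    """Return the areas not yet self-assessed as 'implemented': the to-do list."""
    return [area for area in ARTICLE_21_AREAS
            if status_by_area.get(area, "missing") != "implemented"]

status = {area: "implemented" for area in ARTICLE_21_AREAS}
status["supply chain security"] = "partial"
print(gap_report(status))
```

Even a list this simple forces the useful question: for each area, is there evidence of implementation, or only an assumption that it is covered? The national transpositions will determine how much evidence is actually required.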

The directive emphasizes that the implementation of these measures should take into account the state of the art; relevant European and international standards; the cost of implementation; the degree of the entity’s exposure to risks; the entity’s size; and the likelihood of occurrence of incidents and their severity, including their societal and economic impact. These considerations should be used to determine appropriate and proportionate measures.

Article 23 of the directive covers reporting obligations and requires that, in the case of any incident that has a significant impact on the provision of their services, essential and important entities notify their CSIRT or competent authority. An early warning should be submitted within 24 hours of the organization becoming aware of a significant incident, and a more comprehensive incident notification should be submitted within 72 hours.
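The notification timeline is simple enough to encode. A minimal sketch (the function and field names are our own, and a real incident-response runbook would track far more):

```python
from datetime import datetime, timedelta, timezone

# Given the moment an organization becomes aware of a significant incident,
# compute the early-warning (24h) and full incident-notification (72h)
# deadlines stated in Article 23.

EARLY_WARNING = timedelta(hours=24)
FULL_NOTIFICATION = timedelta(hours=72)

def reporting_deadlines(aware_at):
    """Return the two Article 23 submission deadlines for an incident."""
    return {
        "early_warning_due": aware_at + EARLY_WARNING,
        "incident_notification_due": aware_at + FULL_NOTIFICATION,
    }

aware = datetime(2024, 11, 4, 9, 30, tzinfo=timezone.utc)
deadlines = reporting_deadlines(aware)
print(deadlines["early_warning_due"].isoformat())
print(deadlines["incident_notification_due"].isoformat())
```

The clock starts at awareness, not at occurrence, which is why incident-detection capability matters as much as the reporting process itself.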

Further reporting obligations are detailed within the directive and it will be necessary for all organizations covered by NIS2 to familiarize themselves with these obligations once they have been transposed into their national law.

Conclusion

It is still early days for NIS2, and much will depend on the work done over the next 21 months. Nevertheless, the cyberthreats driving this directive will not wait, and the benefits of improved cybersecurity measures will outweigh the costs.

Regardless of the final wording of the local versions of the directive, organizations can benefit from getting up to speed with NIS2 and engaging with the existing cybersecurity authorities within their countries to develop their strategies.

Mark Child - Associate Research Director, European Security - IDC

Associate Research Director Mark Child of IDC’s European Security Group leads the group's Endpoint Security and Identity & Digital Trust (IDT) research for both Western Europe and Central & Eastern Europe. He monitors developments in security technologies and strategies as organizations address the challenges of evolving business models, IT infrastructure, and cyberthreats. Mark's coverage includes in-depth security market studies, end-user research, white papers, and custom consulting.

November 2022 was a busy month for the European Commission, with two major pieces of legislation passed that aim to bolster the cybersecurity and cyber resilience of Member States and of organisations across the bloc.

The first was the Digital Operational Resilience Act (DORA), which covers the finance sector and companies that provide ICT services and infrastructure to financial sector entities. The second was the long-awaited update of the Security of Network and Information Systems (NIS) directive, known as NIS 2.

The broad aim of NIS 2 is to engender a high common level of cybersecurity in the EU, across all Member States, in the long term.

This is the first in a two-part IDC blog series that will focus on the implications of NIS 2.

The Clock is Ticking

The full text of the NIS 2 directive was published in the official journal of the European Union on December 27, 2022, and enters into force 20 days after that (January 16, 2023). Thereafter, Member States will have 21 months to transpose the directive into their national law (by October 17, 2024). What happens between now and then?

Building the Frame(work)

The next 21 months will be critical for the success of NIS 2 as regional and national bodies get to work on transposing the articles of the directive into their national legislation. Who will be responsible for this part of the process?

The prime mover in this respect will be the NIS Cooperation Group, which was established in 2017 to support the first NIS directive. The Cooperation Group comprises representatives of all the EU Member States, the European Commission and the EU Agency for Cybersecurity (ENISA).

The group will provide guidance to the national authorities of the Member States on transposing and implementing the directive. It will also provide guidance, advice and cooperation on numerous related areas including cybersecurity policy initiatives, capacity building, training and awareness, exchange of information and best practices, and vulnerability disclosure. It will also be responsible for defining standards and technical specifications, as well as maintaining a central register of essential and important entities in each country.

A second key group will be a network of computer security incident response teams (CSIRTs) across all the Member States. At least one CSIRT in each country will be designated as a competent authority for various roles including international cooperation and coordination, threat monitoring and analysis, and the provision of incident response and assistance to essential entities.

The third key entity is the European Cyber Crisis Liaison Organisation Network (EU-CyCLONe). Its task is to support coordinated management of large-scale cybersecurity incidents and crises at an operational level. It will also ensure regular exchange of information among Member States and relevant entities within the union. EU-CyCLONe’s role will really crank up once the directive is in place.

Key responsibilities will include:

  • Developing shared situational awareness for large-scale cybersecurity incidents
  • Assessing the impact of large-scale cybersecurity incidents and proposing potential mitigation measures
  • Coordinating the management of large-scale cybersecurity incidents and supporting decision making at the political level

Between them, these organisations, along with the Member States themselves, will be tasked with ensuring that when NIS 2 comes into force at the national level, it is appropriately transposed into national law and the countries are able to put in place the necessary structures and resources.

Kicking the Tyres

One criticism of the first NIS directive was that it lacked teeth. The EC is striving to establish NIS 2 more firmly throughout the bloc, and one measure through which it seeks to do this is peer reviews. These are aimed at assessing, at a national level, each country’s conformity with the directive and its progress and readiness. For example, peer reviews will assess:

  • The level of implementation of cybersecurity risk management measures and reporting obligations
  • The level of capabilities, including available financial, technical and human resources
  • The operational capabilities of the country’s CSIRTs
  • The level of implementation of cybersecurity information-sharing arrangements

Peer reviews are to be carried out by designated cybersecurity experts from at least two Member States, at a maximum of once every two years. The experts conducting the reviews are expected to provide reports including recommended improvements on any of the reviewed aspects. Those reports will be submitted to the Cooperation Group and the CSIRTs network where relevant.

Conclusion

These entities and processes should ensure that at a regional and national level the EU and its Member States can develop a higher level of cybersecurity and resilience by adhering to the NIS 2 directive.

The second instalment of this blog series will look at which organisations NIS 2 will apply to and what will be required of them.

Mark Child - Associate Research Director, European Security - IDC


“Humanity has a choice: cooperate or perish. It’s either a Climate Solidarity Pact — or a Collective Suicide Pact”.

COP27, held in Sharm El Sheikh, Egypt, in November 2022, began with this sobering opening statement from UN Secretary-General António Guterres. It set the mood for the two-week conference, which fell well short of meeting its targets. According to the Economist, “There is no way Earth can now avoid a temperature rise of more than 1.5°C. There is still hope that the overshoot may not be too big, and may be only temporary, but even these consoling possibilities are becoming ever less likely.”

Governments need to keep investing to tackle climate change, but they now also need to invest to increase our collective resilience. Since COP26 in 2021, not only has the geopolitical environment changed significantly, but the increase in global temperatures, causing wildfires and flooding, has reminded us of the heavy cost of inaction.

While people expect decisive action from their governments, their leaders seem overwhelmed with different priorities and planned investments.

A Real Test of Leadership

This year, 130 developing countries succeeded in their attempt to add the notion of “loss and damages” to the official COP27 agenda. But with COP now over for another year, that looks like the only success in 2022. Even that still needs to be ironed out, however, and it should also be remembered that it only tackles the consequences and not the causes.

Mahmoud Mohieldin, UN Climate Change High Level Champion for Egypt, reminded us that global warming is not only about changing the way we produce and consume energy, but also about the way we produce food. “Transforming food systems could release back the $12 trillion the world spends on the hidden cost of food, from transportation to fertilisers,” he said. “We could also eliminate nearly all of the 8.5% of emissions that come from agriculture.”

There are many reasons why such important matters were not intensively discussed at COP27, but we believe one of them was the lack of global leadership.

If no leader stands out when there is so much to coordinate and activate, the transformation must come from cooperation and from greater transparency about the promises made to lower our emissions and our dependence on fossil fuels.

COP28: Climate Data for the Common Good

Next year’s COP will coincide with the first global assessment since the Paris Agreement of 2015, as the final biennial reports for developed countries are multilaterally assessed to complete the final International Assessment and Review (IAR) cycle during 2023–2024. It’s hard to believe that the direction set in 2015, to limit global warming to well below 2°C and preferably to 1.5°C, will be on track by then. It’s also hard to think that we will have concrete data to rely on by then.

Some initiatives with data transparency at their core have already been implemented, such as the Climate Data Steering Committee, the EU’s Corporate Sustainability Reporting Directive and the One Data Hub. By the time these reporting mechanisms are live, there will be more data to track and report, including the loss and damages funds agreed at COP27.

These reports do not yet share the same KPIs and data formats, however. One goal for government executives will be to agree on a data format for each component of climate change, which will need to be transparent for citizens so that they can hold their governments to account.

Philosopher Günther Anders once described the notion of the Promethean gap: the growing divide between what humans can produce and what they can imagine and comprehend, including the dangers they create. At the beginning of 2022, IDC revealed that the number two challenge for governments when attempting to become more sustainable was the lack of IT tools to measure the impact, which was almost as challenging as the lack of funds. If we need concrete data before we take action, it’s time to understand that when it comes to “cooperate or perish” it is not too late to make the right choice.

Remi Letemple - Senior Research Analyst, IDC Government Insights - IDC

Remi Letemple leads IDC’s Worldwide Sustainable Transportation and Smart Vehicles Strategies service, where he provides strategic guidance and thought leadership on the future of mobility and transportation. Operating at a global level, he is recognized as a subject matter expert in smart mobility and transportation technologies—including connected, autonomous, shared, and electric mobility—enabled by software-defined vehicle (SDV) architectures, over-the-air (OTA) updates, cloud and edge platforms, and AI, including generative AI.

This year’s Enlit Europe, which took place between November 29 and December 1 in Frankfurt, attracted almost 18,000 visitors and 1,000 exhibitors from 100 countries — proving once again to be a reference point for the European (if not worldwide) utility sector.

Sessions on flexibility, energy transition, and digitalization, as well as numerous hub sessions, provided a great opportunity for knowledge sharing during the three-day event. Here are our key takeaways from discussions and debates with technology providers and utilities.

  • European power DSOs are feeling the pinch due to accelerating demand for electrification and distributed generation. One DSO from the DACH region we talked to said it expects requests for PV connections to increase fourfold this year over 2021 in power terms. A Scandinavian operator said it needed to deploy as much capacity by the end of the decade as it had built over the past century. This was expected, of course, as distribution is where most of the energy system transformation is taking place. But things have now spread to a large and diverse cross-section of the power distribution world and DSOs don’t want to become the bottleneck of the energy transition. Distributors urgently need tools to shed light on the LV level of their grid — for planning, operations, and maintenance purposes — and marketplaces to access and procure flexibility in coordination with fellow DSOs and TSOs.
  • Despite the events of the past 10 months, consumers still appear to be an afterthought for most energy suppliers and utilities (and numerous governments) across Europe. With energy and related energy costs top of mind for most customers, it was a great opportunity for companies to create awareness and educate customers on the energy transition, and the critical role they play in making it a reality. But that opportunity has been squandered, with companies failing to deliver on what matters most to customers: high-bill alerts and personalized, meaningful energy efficiency advice. Due to skyrocketing energy prices, energy suppliers are significantly worse off than before in terms of customer satisfaction and net promoter scores. By failing to support customers at a time of need, utilities have failed to change the narrative around them and become trusted energy advisors in the energy transition.
  • As the energy transition accelerates, partnering and co-innovation are becoming critical tools to accelerate the development of solutions designed to respond to this acceleration. These are no longer buzzwords on slideware. Co-innovation between utilities and solution providers is happening on the ground and it is slashing time to market by a factor of three on average. There are hardly any strategic product initiatives by established software providers in this space that are not driven in cooperation with a carefully curated group of end users, leveraging design thinking and agile principles. Partnering between the incumbent enterprise and operational software vendors in the utilities space and their specialty counterparts has also accelerated significantly, offering a new procurement paradigm that combines what we call a platform approach to operations with a new wave of best-of-breed.
  • The industry mantras of electrification, decarbonization, and energy transition continue to be recited despite the impact of the ongoing energy crisis. While the criticality of climate change can’t be neglected, it appears to some extent that the energy crisis has dampened the urgency for some companies and the industry as a whole to invest in making grids reliable for what’s to come. This is a concern, as some areas are already at risk of bottlenecks as uptake of EVs, heat pumps, etc. increases. There are numerous European initiatives to foster electrification, such as “Fit for 55,” which will end the sale of new CO2-emitting cars in Europe by 2035, and “REPowerEU,” which aims to install 50 million heat pumps by 2030. But this raises the question: Where are we going to get all this power from?

The overall impression is that of an industry chugging along, conscious that it can’t do it alone and increasingly reliant on its partners and innovation with other sectors. We have seen pockets of real disruptive innovation, but for the most part the industry feels a bit weary, and understandably so.

Here’s to brighter times when we meet in Paris at next year’s Enlit Europe.