Performance Intensive Computing

Capture the full potential of IT

Why M&E content creators need high-end VDI, rendering & storage


Content creators in media and entertainment need lots of compute, storage and networking. Supermicro servers powered by AMD EPYC processors help by delivering faster rendering, high-speed storage and the horsepower behind virtual desktop infrastructure.

 


When content creators at media and entertainment (M&E) organizations create videos and films, they’re also competing for attention. And today that requires a lot of technology.

Making a full-length animated film involves no fewer than 14 complex steps, including 3D modeling, texturing, animating, visual effects and rendering. The whole process can take years. And it requires a serious quantity of high-end compute, storage and software.

From an IT perspective, three of the most compute-intensive activities for M&E content creators are VDI, rendering and storage. Let’s take a look at each.

* Virtual desktop infrastructure (VDI): While content creators work on personal workstations, they need the kind of processing power and storage capacity available from a rackmount server. That’s what they get with VDI.

VDI separates the desktop and associated software from the physical client device by hosting the desktop environment and applications on a central server. These assets are then delivered to the desktop workstation over a network.

To power VDI setups, Supermicro offers a 4U GPU server with up to 8 PCIe GPUs. The Supermicro AS -4125GS-TNRT server packs a pair of AMD EPYC 9004 processors, Nvidia RTX 6000 GPUs, and 6TB of DDR5 memory.

* Rendering: The last stage of film production, rendering is where the individual 3D images created on a computer are transformed into the stream of 2D images ready to be shown to audiences. This process, conducted pixel by pixel, is time-consuming and resource-hungry. It requires powerful servers, lots of storage capacity and fast networking.
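To get a feel for why rendering is so resource-hungry, consider a rough back-of-envelope count of the per-pixel work involved. The resolution, frame rate and sample count below are illustrative assumptions, not figures from any actual production pipeline:

```python
# Rough count of per-pixel rendering work for a feature film.
# All numbers below are illustrative assumptions, not studio figures.

def total_shading_samples(width: int, height: int, fps: int,
                          seconds: int, samples_per_pixel: int) -> int:
    """Pixels per frame x frames x samples traced per pixel."""
    return width * height * fps * seconds * samples_per_pixel

# A 90-minute film at 4K (3840x2160), 24 fps, 64 samples per pixel:
ops = total_shading_samples(3840, 2160, 24, 90 * 60, 64)
print(f"{ops:.3e} shading samples")  # prints "6.880e+13 shading samples"
```

At tens of trillions of shading samples for a single film, it's easy to see why studios lean on render farms rather than individual workstations.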

For rendering, Supermicro offers its 2U Hyper system, the AS -2125HS-TNR. It’s configured with dual AMD EPYC 9004 processors, up to 6TB of memory, and your choice of NVMe, SATA or SAS storage.

* Storage: Content creation involves creating, storing and manipulating huge volumes of data. So the first requirement is simply having a great deal of storage capacity. But it’s also important to be able to retrieve and access that data quickly.

For these kinds of storage challenges, Supermicro offers Petascale storage servers based on AMD EPYC processors. They can pack up to 16 hot-swappable E3.S (7.5mm) NVMe drive bays. And they’ve been designed to store, process and move vast amounts of data.

M&E content creators are always looking to attract more attention. They’re getting help from today’s most advanced technology.


Need help turning your customers’ data into actionable insights?


Your customers already have plenty of data. What they need now are insights. Supermicro, AMD and Cloudera are here to help.


Data just sits there, taking up costly storage and real estate. But actionable insights can help your customers strengthen their overall business, improve their business processes, and create new products and services.

Increasingly, these insights are based on data captured at the edge. For example, a retailer might collect customer and sales data using the point-of-sale terminals in its stores.
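As a sketch of what acting on edge-captured data looks like in code, here's a minimal running aggregation over point-of-sale events. The event schema and store IDs are hypothetical; a production deployment would use a streaming platform rather than a hand-rolled loop:

```python
from collections import defaultdict

# Minimal sketch of edge-side streaming aggregation. The event schema and
# store IDs are hypothetical; real deployments would use a streaming platform.
def running_sales_by_store(events):
    """Yield (store, running total) as each sale event arrives."""
    totals = defaultdict(float)
    for event in events:  # events are processed as they arrive, not in batch
        totals[event["store"]] += event["amount"]
        yield event["store"], totals[event["store"]]

stream = [
    {"store": "NYC-01", "amount": 19.99},
    {"store": "NYC-02", "amount": 5.50},
    {"store": "NYC-01", "amount": 3.25},
]
for store, total in running_sales_by_store(stream):
    print(store, round(total, 2))  # an insight per event, not per batch load
```

The point of the pattern: each event updates an answer immediately, instead of waiting for the data to land in a central warehouse.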

Supermicro is here to help. Its edge systems, including the latest WIO and short-depth servers powered by AMD processors, have been designed to collect data at the business edge.

These servers are powered by AMD’s EPYC 8004 Series processors. Introduced in September, these CPUs extend the company’s ‘Zen4c’ architecture into lower-core-count processors designed for edge servers and form factors.

GrandTwin too

For more insights, tell your customers to check out Supermicro’s GrandTwin servers. They’re powered by AMD EPYC 9004 processors and can run Cloudera Data Flow (CDF), a scalable, real-time streaming analytics platform.

The Supermicro GrandTwin systems provide a multi-node rackmount platform for cloud data centers. They come in 2U with 4 nodes for optimal deployment.

These systems offer AMD’s 4th Gen EPYC 9004 Series of general-purpose processors, which support DDR5-4800 memory and PCI Express Gen 5 I/O.

Distributed yet united

If you’re unfamiliar with Cloudera, the company’s approach is based on a simple idea: single clouds are passé. Instead, Cloudera supports a hybrid data platform, one that can be used with any cloud, any analytics and any data.

The company’s idea is that data-management components should be physically distributed, but treated as a cohesive whole with AI and automation.

Cloudera’s CDF solution ingests, curates and analyzes data for key insights and immediate actionable information. That can include issues or defects that need remediating. And AI and machine learning systems can use the data to suggest real-time improvements.

More specifically, CDF delivers flow management, edge management, streams processing, streams management, and streaming analytics.

The upshot: Your customers need actionable insights, not more data. And to get those insights, they can check out the powerful combination of Supermicro servers, AMD processors and Cloudera solutions.


Supermicro celebrates 30 years of business



Supermicro Inc. is celebrating its 30th year of research, development and manufacturing.

At the company, formed in 1993, some things remain the same. Founder Charles Liang remains Supermicro’s president and CEO. And the company is still based in California’s Silicon Valley.

Of course, in 30 years a lot has changed, too. For one, AI is now a critical component. And Supermicro, with help from component makers including AMD, is offering a range of solutions designed with AI in mind. Also, Supermicro has stated its intention to be a leader in the newer field of generative AI.

Another recent change is the industry’s focus on “green computing” and sustainability. Here, too, Supermicro has had a vision. The company’s Green IT initiative helps customers lower data-center TCO, take advantage of recyclable materials, and do more work with lower power requirements.

Another change is just how big Supermicro has grown. Revenue for its most recent fiscal year totaled $7.12 billion, a year-on-year increase of 37%. Looking ahead, Supermicro has told investors it expects an even steeper 47% revenue growth in the current fiscal year, for total revenue of $9.5 billion to $10.5 billion. 

All that growth has also led Supermicro to expand its manufacturing facilities. The company now runs factories in Silicon Valley, Taiwan and the Netherlands, and it has a new facility coming online in Malaysia. All that capacity, the company says, means Supermicro can now deliver more than 4,000 racks a month.

Top voices

Industry leaders are joining the celebration.

“Supermicro has been and continues to be my dream work,” CEO Liang wrote in an open letter commemorating the company’s 30th anniversary.

Looking ahead, Liang writes that the company’s latest initiative, dubbed “Supermicro 4.0,” will focus on AI, energy saving, and time to market.

AMD CEO Lisa Su adds, “AMD and Supermicro have a long-standing history of delivering leadership computing solutions. I am extremely proud of the expansive portfolio of data center, edge and AI solutions we have built together, our leadership high-performance computing solutions and our shared commitment to sustainability.”

Happy 30th anniversary, Supermicro!


Research Roundup: Edge, channel sales, insider risk, AI security, wireless LANs


Catch up on the latest IT market research, surveys and forecasts. 


Edge computing is strategic. The IT channel is huge. Insider cyber risks deserve more attention. AI can be used to oversee AI. And enterprises are buying more wireless LANs.

That’s the latest from top IT market research. And here’s your Performance Intensive Computing roundup.

All hail the edge

More than 8 in 10 C-level executives (83%) believe that to remain competitive in the future, their organizations will need to implement edge computing.

Nearly as many (81%) believe that if they fail to act quickly on edge computing, they could be locked out from enjoying the technology’s full benefits.

Those figures come from a new study by Accenture. The report is based on a poll, conducted by the consulting firm late last year, of 2,100 C-suite execs—including 250 CEOs—across 18 industries and 16 countries.

There’s plenty of room for progress on the edge, the Accenture poll finds. Just under two-thirds (65%) of companies use edge computing today. And among these adopters, only half have integrated edge into their digital core.

Edge systems can be enhanced with the cloud. Indeed, Accenture finds that nearly 8 in 10 respondents (79%) say they’ll fully integrate edge with cloud in the next three years.

Channel rules

How important is the IT channel? Very, according to market watcher Canalys.

Canalys expects that this year, partner-delivered IT technologies and services worldwide will total more than $3.4 trillion, or about 70% of the global addressable IT market.

And the market is rising, despite ongoing economic issues. Canalys predicts the worldwide IT market will rise 3.5% this year, for a full-year total of $4.7 trillion.

Some of the biggest growth opportunities this year are coming in cybersecurity (with sales forecast to rise 11%), network infrastructure (14%) and public cloud (7.5%), according to Canalys.

There are also big implications for the IT hardware, software and services suppliers that rely on the channel.

“Given the importance of the channel,” says Canalys chief analyst Matthew Ball, “the success of vendors will increasingly rely on their resell, co-sell, co-marketing, co-retention, co-development and co-innovation strategies.”

Insider risk rising

Here’s a new reason to worry: The average annual cost of insider cyber risk has risen 40% over the last four years, reaching $16.2 million. And the average time it takes to contain an insider incident is now about three months (86 days).

That’s according to a new study conducted by the Ponemon Institute on behalf of Dtex Systems, a supplier of risk-management software. Their new joint report is based on a recent survey of 1,075 security and line-of-business professionals at nearly 310 organizations worldwide.

Despite this risk, the survey finds that most organizations are dedicating only about 8% of their overall cybersecurity budget—the equivalent of $200 per employee—to insider threats.

What’s more, about 90% of the insider-risk budget gets spent after an insider incident has occurred, the survey found. These after-incident costs include containment, remediation, investigation, incident response and escalation.

AI vs. AI?

AI-powered risks may be so stealthy, only another AI system can fight them off.

That’s the sentiment revealed by a new Gartner survey. The research firm finds that about 1 in 3 organizations (34%) now use AI application security tools to mitigate the risks of generative AI. Over half (56%) are exploring such approaches for the future.

These numbers come from Gartner’s most recent Peer Community survey, conducted in April. Gartner collected responses from 150 IT and cybersecurity leaders at organizations that use either GenAI or foundational models.

When asked which risks of GenAI worry them the most, nearly 6 in 10 respondents (57%) said leaked secrets in AI-generated code. About the same number (58%) said they’re concerned about AI generating incorrect or biased outputs.

“Organizations that don’t manage AI risk will witness their models not performing as intended,” says Gartner analyst Avivah Litan. “In the worst case [AI] can cause human or property damage.”

Enterprise wireless LAN heats up

Looking for a new growth market? Consider the enterprise segment of wireless local area networking. In this year’s second quarter, sales in this sector grew 43%, reaching a total of $3 billion, according to market intelligence firm IDC.

The growth rate was even higher in both the United States and Canada. In both countries, Q2 sales of wireless LANs to enterprises rose nearly 80% year-on-year, IDC says.

By contrast, the consumer end of the wireless LAN market declined by 14% year-on-year in Q2, according to IDC.

Driving the enterprise sales are a couple of factors, including an easing of both components shortages and supply-chain disruptions, says IDC researcher Brandon Butler. Another growth factor is the rapid adoption by enterprises of the new Wi-Fi 6 and 6E standards.

 


Tech Explainer: What is the intelligent edge? Part 2


The intelligent edge has emerged as an essential component of the internet of things. By moving compute and storage close to where data is generated, the intelligent edge provides greater control, flexibility, speed and even security.


The Internet of Things (IoT) is all around us. It’s in the digital fabric of a big city, the brain of a modern factory, the way your smart home can be controlled from a tablet, and even the tech telling your fridge it’s time to order a quart of milk.

As these examples show, IoT is fast becoming a must-have. Organizations and individuals alike turn to the IoT to gain greater control and flexibility over the technologies they regularly use. Increasingly, they’re doing it with the intelligent edge.

The intelligent edge moves command and control from the core to the edge, closer to where today’s smart devices and sensors actually are installed. That’s needed because so many IoT devices and connections are now active, with more coming online every day.

Communicating with millions of connected devices via a few centralized data centers is the old way of doing things. The new approach is a vast network of local nodes that can collect, process and analyze IoT data, and make decisions on it, as close to its origin as possible.

Controlling IoT

To better understand the relationship between IoT and intelligent edge, let’s look at two use cases: manufacturing and gaming.

Modern auto manufacturers like Tesla and Rivian use IoT to control their industrial robots. Each robot is fitted with multiple sensors and actuators. The sensors report their current position and condition, and the actuators control the robot’s movements.

In this application, the intelligent edge acts as a small data center in or near the factory where the robots work. This way, instead of waiting for data to transfer to a faraway data center, factory managers can use the intelligent edge to quickly capture, analyze and process data—and then act just as quickly.

Acting on that data may include performing preventative or reactive maintenance, adjusting schedules to conserve power, or retasking robots based on product configuration changes. 
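In code, that kind of local decision-making can be as simple as a rules check run on the edge node itself. The sensor fields and thresholds here are hypothetical, chosen purely to illustrate the pattern:

```python
# Rule-based decision-making on the edge node itself.
# Sensor fields and thresholds are hypothetical, for illustration only.
def maintenance_action(reading: dict) -> str:
    """Decide locally, without a round trip to a faraway data center."""
    if reading["motor_temp_c"] > 90 or reading["vibration_mm_s"] > 7.0:
        return "stop_and_service"      # reactive: act on the fault now
    if reading["hours_since_service"] > 500:
        return "schedule_service"      # preventative: plan downtime
    return "ok"

print(maintenance_action(
    {"motor_temp_c": 95, "vibration_mm_s": 2.1, "hours_since_service": 120}))
# prints "stop_and_service"
```

Because the check runs next to the robot, the response is immediate; only the summarized result needs to travel upstream.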

The benefits of a hyper-localized setup like this can prove invaluable for manufacturers. Using the intelligent edge can save them time, money and person-hours by speeding both analysis and decision-making.

For manufacturers, the intelligent edge can also add new layers of security. That’s because data is significantly more vulnerable when in transit. Cut the distance the data travels and the use of external networks, and you also eliminate many cybercrime threat vectors.

Gaming is another marquee use case for the intelligent edge. Resource-intensive games such as “Fortnite” and “World of Warcraft” demand high-speed access to the data generated by the game itself and a massive online gaming community of players. With speed at such a high premium, waiting for that data to travel to and from the core isn’t an option.

Instead, the intelligent edge lets game providers collect and process data near their players. The closer proximity lowers latency by limiting the distance the data travels. It also improves reliability. The resulting enhanced data flow makes gameplay faster and more responsive.
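The physics behind that latency win is easy to estimate. Light in optical fiber travels at roughly two-thirds the speed of light in a vacuum, about 200,000 km per second, so distance alone sets a hard floor on round-trip time:

```python
# Propagation-delay floor for a round trip over optical fiber.
# Light in fiber covers roughly 200,000 km/s (about two-thirds of c);
# real latency adds routing, queuing and processing on top of this floor.
FIBER_KM_PER_S = 200_000

def rtt_floor_ms(distance_km: float) -> float:
    return 2 * distance_km / FIBER_KM_PER_S * 1000

print(f"core 2,000 km away: {rtt_floor_ms(2000):.1f} ms")  # 20.0 ms
print(f"edge 20 km away:    {rtt_floor_ms(20):.1f} ms")    # 0.2 ms
```

Cutting the distance by two orders of magnitude cuts the unavoidable floor by the same factor, which is exactly what twitch-sensitive gameplay needs.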

Tech at the edge

The intelligent edge is sometimes described as a network of localized data centers. That’s true as far as it goes, but it’s not the whole story. In fact, the intelligent edge infrastructure’s size, function and location come with specific technological requirements.

Unlike a traditional data center architecture, the edge is often better served by rugged form factors housing low-cost, high-efficiency components. These components, including the recently released AMD EPYC 8004 Series processors, feature fewer cores, less heat and lower prices.

The AMD EPYC 8004 Series processors share the same 5nm ‘Zen4c’ core complex die (CCD) chiplets and 6nm AMD EPYC I/O Die (IOD) as the more powerful AMD EPYC 9004 Series.

However, the AMD EPYC 8004s offer a more efficiency-minded approach than their data center-focused cousins. Nowhere is this better illustrated than the entry-level AMD EPYC 8042 processor, which provides a scant 8 cores and a thermal design power (TDP) of just 80 watts. AMD says this can potentially save customers thousands of dollars in energy costs over a five-year period.

To deploy the AMD silicon, IT engineers can choose from an array of intelligent edge systems from suppliers, including Supermicro. The selection includes expertly designed form factors for industrial, intelligent retail and smart-city deployments.

High-performance rack mount servers like the Supermicro H13 WIO are designed for enterprise-edge deployments that require data-center-class performance. The capacity to house multiple GPUs and other hardware accelerators makes the Supermicro H13 an excellent choice for deploying AI and machine learning applications at the edge.

The future of the edge

The intelligent edge is another link in a chain of data capture and analysis that gets longer every day. As more individuals and organizations deploy IoT-based solutions, an intelligent edge infrastructure helps them store and mine that information faster and more efficiently.

The insights provided by an intelligent edge can help us improve medical diagnoses, better control equipment, and more accurately predict human behavior.

As the intelligent edge architecture advances, more businesses will be able to deploy solutions that enable them to cut costs and improve customer satisfaction simultaneously. That kind of deal makes the journey to the edge worthwhile.

Part 1 of this two-part blog series on the intelligent edge looked at the broad strokes of this emerging technology and how organizations use it to increase efficiency and reliability. Read Part 1 now.


Tech Explainer: What is the intelligent edge? Part 1


The intelligent edge moves compute, storage and networking capabilities close to end devices, where the data is being generated. Organizations gain the ability to process and act on that data in real time, without having to first transfer it to a centralized data center.


The term intelligent edge refers to remote server infrastructures that can collect, process and act on data autonomously. In effect, it’s a small, remote data center.

Compared with a more traditional data center, the intelligent edge offers one big advantage: It locates compute, storage and networking capabilities close to the organization’s data collection endpoints. This architecture speeds data transactions. It also makes them more secure.

The approach is not entirely new. Deploying an edge infrastructure has long been an effective way to gather data in remote locations. What’s new with an intelligent edge is that you gain the ability to process and act on that data (if necessary) in real time—without having to first transfer that data to the cloud.

The intelligent edge can also save an organization money. It makes particular sense for organizations that spend a sizable chunk of their operating budget transferring data from the edge to public and private data centers, or to a cloud infrastructure (often referred to as “the core”). Reducing bandwidth in both directions, along with storage charges, helps them control costs.
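A quick, entirely hypothetical example shows how the arithmetic works. Assume a site generates 50TB a month, transfer costs $0.09 per GB, and local processing means only 5% of the data still needs to reach the core (all three numbers are invented for illustration):

```python
# Hypothetical bandwidth-savings arithmetic (all figures invented
# for illustration; actual transfer pricing varies by provider).
def monthly_transfer_cost(tb_per_month: float, usd_per_gb: float) -> float:
    return tb_per_month * 1024 * usd_per_gb

RAW_TB = 50        # data generated at the edge each month
KEPT = 0.05        # fraction still sent to the core after local processing
RATE = 0.09        # assumed $/GB transfer price

before = monthly_transfer_cost(RAW_TB, RATE)
after = monthly_transfer_cost(RAW_TB * KEPT, RATE)
print(f"${before:,.0f}/month -> ${after:,.0f}/month")  # "$4,608/month -> $230/month"
```

The exact prices don't matter; what matters is that filtering at the edge shrinks the transfer bill in direct proportion to the data you no longer ship.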

3 steps to the edge

Today, an intelligent edge typically gets applied in one of three areas:

  • Operational Technology (OT): Hardware and software used to monitor and control industrial equipment, processes and events.
  • Information Technology (IT): Digital infrastructure—including servers, storage, networking and other devices—used to create, process, store, secure and transfer data.
  • Internet of Things (IoT): A network of smart devices that communicate and can be controlled via the internet. Examples include smart speakers, wearables, autonomous vehicles and smart-city infrastructure.

The highly efficient edge

There’s yet another benefit to deploying intelligent edge tech: It can help an organization become more efficient.

One way the intelligent edge does this is by obviating the need to transfer large amounts of data. Instead, data is stored and processed close to where it’s collected.

For example, a smart lightbulb or fridge can communicate with the intelligent edge instead of contacting a data center. Staying in constant contact with the core is unnecessary for devices that don’t change much from minute to minute.

Another way the intelligent edge boosts efficiency is by reducing the time needed to analyze and act on vital information. This, in turn, can lead to enhanced business intelligence that informs and empowers stakeholders. It all gets done faster and more efficiently than with traditional IT architectures and operations.

For instance, imagine that an organization serves a large customer base from several locations. By deploying an intelligent edge infrastructure, the organization could collect and analyze customer data in real time.

Businesses that gain insights from the edge instead of from the core can also respond quickly to market changes. For example, an energy company could analyze power consumption and weather conditions at the edge (down to the neighborhood), then predict whether a power outage is likely.

Similarly, a retailer could use the intelligent edge to support inventory management and analyze customers’ shopping habits. Using that data, the retailer could then offer customized promotions to particular customers, or groups of customers, all in real time.

The intelligent edge can also be used to enhance public infrastructure. For instance, smart cities can gather data that helps inform lighting, public safety, maintenance and other vital services, which could then be used for preventive maintenance or the allocation of city resources and services as needed.

Edge intelligence

As artificial intelligence (AI) becomes increasingly ubiquitous, many organizations are deploying machine learning (ML) models at the edge to help analyze data and deliver insights in real time.

In one use case, running AI and ML systems at the edge can help an organization reduce the service interruptions that often come with transferring large data sets to and from the cloud. The intelligent edge keeps things running locally, giving distant data centers a chance to catch up. This, in turn, can help the organization provide a better experience for the employees and customers who rely on that data.
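As a minimal stand-in for a trained ML model, here's the sort of lightweight statistical check an edge node can run locally: flag any reading more than three standard deviations away from a rolling window of recent values. The window size and threshold are arbitrary choices for illustration:

```python
from collections import deque
from statistics import mean, stdev

# Lightweight local anomaly check: flag readings more than 3 standard
# deviations from a rolling window. Window size and threshold are arbitrary.
def is_anomaly(window: deque, value: float, z_threshold: float = 3.0) -> bool:
    if len(window) < 2:
        return False                     # not enough history yet
    mu, sigma = mean(window), stdev(window)
    return sigma > 0 and abs(value - mu) / sigma > z_threshold

window = deque(maxlen=30)
flagged = []
for v in [10.1, 9.9, 10.0, 10.2, 9.8, 10.1, 25.0]:  # last reading is a spike
    if is_anomaly(window, v):
        flagged.append(v)
    window.append(v)
print(flagged)  # prints "[25.0]"
```

Only the flagged readings need to leave the site; the routine ones never consume backhaul bandwidth at all.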

Deploying AI at the edge can also help with privacy, security and compliance issues. Transferring data to and from the core presents an opportunity for hackers to intercept data in transit. Eliminating this data transfer deprives cyber criminals of a threat vector they could otherwise exploit.

Part 2 of this two-part blog series dives deep into the biggest, most popular use of the intelligent edge today—namely, the internet of things (IoT). We also look at the technology that powers the intelligent edge, as well as what the future may hold for this emerging technology.


Supermicro introduces edge, telco servers powered by new AMD EPYC 8004 processors


Supermicro has introduced five Supermicro H13 WIO and short-depth servers powered by the new AMD EPYC 8004 Series processors. These servers are designed for intelligent edge and telco applications.


Supermicro is supporting the new AMD EPYC 8004 Series processors (previously code-named Siena) on five Supermicro H13 WIO and short-depth telco servers. Taking advantage of the new AMD processor, these new single-socket servers are designed for use with intelligent edge and telco applications.

The new AMD EPYC 8004 processors enjoy a broad range of operating temperatures and can run at lower DC power levels, thanks to their energy-efficient ‘Zen4c’ cores. Each processor features from 8 to 64 simultaneous multithreading (SMT) capable ‘Zen4c’ cores.

The new AMD processors also run quietly. With a TDP as low as 80W, the CPUs don’t need much in the way of high-speed cooling fans.

Compact yet capacious

Supermicro’s new 1U short-depth servers are designed with I/O at the front and a form factor that’s compact yet still offers room for three PCIe 5.0 slots. They also offer the option of running on either AC or DC power.

The short-depth systems also feature a NEBS-compliant design for telco operations. NEBS, short for Network Equipment Building System, is an industry requirement for the performance levels of telecom equipment.

The new WIO servers use Titanium power supplies for increased energy efficiency, and Supermicro says that will deliver higher performance/watt for the entire system.

Supermicro WIO systems offer a wide range of I/O options to deliver optimized systems for specific requirements. Users can optimize the storage and networking alternatives to accelerate performance, increase efficiency and find the perfect fit for their applications.

Here are Supermicro’s five new models:

  • AS -1015SV-TNRT: Supermicro H13 WIO system in a 1U format
  • AS -1115SV-TRNT: Supermicro H13 WIO system in a 1U format
  • AS -2015SV-TNRT: Supermicro H13 WIO system in a 2U format
  • AS -1115S-FWTRT: Supermicro H13 telco/edge short-depth system in a 1U format, running on AC power and including system-management features
  • AS -1115S-FDWTRT: Supermicro H13 telco/edge short-depth system in a 1U format, this one running on DC power

Shipments of the new Supermicro servers supporting AMD EPYC 8004 processors start now.


Meet the new AMD EPYC 8004 family of CPUs


The new 4th Gen AMD EPYC 8004 family extends the ‘Zen4c’ core architecture into lower core-count processors with TDPs as low as 80W. The processors are designed especially for edge-server deployments and form factors.


AMD has introduced a family of EPYC processors for space- and power-constrained deployments: the 4th Generation AMD EPYC 8004 processor family. Formerly code-named Siena, these lower core-count CPUs can be used in traditional data centers as well as for edge compute, retail point-of-sale and running a telco network.

The new AMD processors have been designed to run at the edge with better energy efficiency and lower operating costs. The CPUs enjoy a broad range of operating temperatures and can run at lower DC power levels, thanks to their energy-efficient ‘Zen4c’ cores. These new CPUs also run quietly. With a TDP as low as 80W, the CPUs don’t need much in the way of high-speed cooling fans.

The AMD EPYC 8004 processors are purpose-built to deliver high performance and energy efficiency in an optimized, single-socket package. They use the new SP6 socket. Each processor features from 8 to 64 simultaneous multithreading (SMT) capable ‘Zen4c’ cores.

AMD says these features, along with a streamlined memory and I/O feature set, let servers based on this new processor family deliver compelling system cost/performance metrics.

Heat-tolerant

The AMD EPYC 8004 family is also designed to run in environments with fluctuating and at times high ambient temperatures. That includes outdoor “smart city” settings and NEBS-compliant communications network sites. (NEBS, short for Network Equipment Building System, is an industry requirement for the performance levels of telecom equipment.) What AMD is calling “NEBS-friendly” models have an operating range of -5 C (23 F) to 85 C (185 F).

The new AMD processors can also run in deployments where both the power levels and available physical space are limited. That can include smaller data centers, retail stores, telco installations, and the intelligent edge.

The performance gains are impressive. Using the SPECpower benchmark, which measures power efficiency, the AMD EPYC 8004 CPUs deliver more than 2x the energy efficiency of the top competitive product for telco. This can result in 34% lower energy costs over five years, saving organizations thousands of dollars.
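To put those percentages in dollar terms, here's a rough sketch of five-year energy cost arithmetic. The average power draw and electricity price are illustrative assumptions; the 34% saving is AMD's own claim from its telco comparison:

```python
# Rough five-year energy cost per server. Average draw and electricity
# price are illustrative assumptions; the 34% saving is AMD's claim.
def five_year_energy_cost_usd(avg_watts: float, usd_per_kwh: float) -> float:
    hours = 5 * 365 * 24                       # five years of 24/7 operation
    return avg_watts / 1000 * hours * usd_per_kwh

baseline = five_year_energy_cost_usd(160, 0.15)  # assumed baseline draw
saving = baseline * 0.34
print(f"baseline ${baseline:,.0f}, saving ${saving:,.0f} per server")
```

Multiply that per-server figure across a rack, or a telco footprint of hundreds of servers, and the "thousands of dollars" claim is easy to reach.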

Multiple models

In all, the AMD EPYC 8004 family currently offers 12 SKUs. Those ending with the letter “P” support single-CPU designs. Those ending in “PN” support NEBS-friendly designs and offer broader operating temperature ranges.

The various models offer a choice of 8, 16, 24, 48 or 64 ‘Zen4c’ cores; from 16 to 128 threads; and L3 cache sizes ranging from 32MB to 128MB. All the SKUs offer 6 channels of DDR memory with a maximum capacity of 1.152TB; a maximum DDR5 frequency of 4800 MHz; and 96 lanes of PCIe Gen 5 connectivity. Security features are offered by AMD Infinity Guard.
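Two of those published figures can be sanity-checked directly: SMT yields two threads per core, and the 1.152TB memory ceiling spread across six channels works out to 192GB per channel:

```python
# Sanity-checking two of the published EPYC 8004 figures.
cores = [8, 16, 24, 48, 64]
threads = [2 * c for c in cores]     # SMT: two threads per 'Zen4c' core
print(threads)                       # prints "[16, 32, 48, 96, 128]"

max_memory_gb = 1152                 # the 1.152TB ceiling, in GB
channels = 6
print(max_memory_gb // channels, "GB per channel")  # prints "192 GB per channel"
```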

Selected AMD partners have already announced support for the new EPYC 8004 family. That includes Supermicro, which has introduced new WIO servers based on the new AMD processors for diverse data center and edge deployments.


Research Roundup: AI chip sales, AI data centers, sustainability services, manufacturing clouds, tech-savvy or not



Sales of AI semiconductors are poised for big growth. AI is transforming the data center. Sustainability services are hot. Manufacturers are saving big money with cloud. And Americans are surprisingly lacking in tech understanding.

That’s some of the latest IT market research. And here’s your Performance Intensive Computing roundup.

AI chip sales to rise 21% this year

Sales of semiconductors designed to execute AI workloads will rise this calendar year by 20.9% over last year, reaching a worldwide total of $53.4 billion, predicts research firm Gartner.

Looking further ahead, Gartner expects worldwide sales of AI chips in 2024 to reach $67.1 billion, a 25% increase over the projected figure for this year.

And by 2027, Gartner forecasts, those sales will top $119 billion, or more than double this year’s market size.
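Those Gartner figures imply the growth math below. This is just a sanity check of the percentages quoted above, computed from the dollar forecasts, not additional data:

```python
def cagr(start, end, years):
    """Compound annual growth rate between two endpoint values."""
    return (end / start) ** (1 / years) - 1

# Gartner's forecast figures, in billions of dollars
sales_2023 = 53.4
sales_2024 = 67.1
sales_2027 = 119.0

print(f"2023-2024 growth: {sales_2024 / sales_2023 - 1:.1%}")      # ~25.7%
print(f"2027 vs. 2023:    {sales_2027 / sales_2023:.2f}x")         # more than double
print(f"2023-2027 CAGR:   {cagr(sales_2023, sales_2027, 4):.1%}")  # ~22.2% per year
```

The numbers line up: 2024 growth of roughly 25%, and a 2027 total more than double this year's market size.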

What’s behind the rapid rise? Two main factors, says Gartner: Generative AI, and the spread of AI-based applications in data centers, edge infrastructure and endpoint devices.

AI transforming data centers

Generative AI is transforming the data center, says Lucas Beran, a market analyst with Dell’Oro Group. Last month, his research group predicted that AI infrastructure spending will propel data center CapEx past a half-trillion dollars by 2027, a compound annual growth rate of 15%. (That figure is larger than Gartner’s because it includes more than just chips.) Now Dell’Oro says AI is ushering in a new era for data center physical infrastructure.

Here’s some of what Beran of Dell’Oro expects:

  • Due to the substantial power consumption of AI systems, end users will adopt intelligent rack power distribution units (PDUs) that can remotely monitor and manage power consumption and environmental factors.
  • Liquid cooling will come into its own. Some users will retrofit existing cooling systems with closed-loop assisted liquid cooling systems. These use liquid to capture heat generated inside the rack or server, then blow it into a hot aisle. By 2025, global sales of liquid cooling systems will approach $2 billion.
  • A lack of power availability could slow AI adoption. Data centers need more energy than utilities can supply. One possible solution: BYOP – bring your own power.

Sustainability services: $65B by 2027

Speaking of power and liquid cooling, a new forecast from market researcher IDC has total sales of environmental, social and governance (ESG) services rising from $37.7 billion this year to nearly $65 billion by 2027, for a compound annual growth rate (CAGR) of nearly 15%.
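IDC’s “nearly 15%” CAGR can be reproduced from the two endpoint figures (assuming a 2023 base year, which gives a four-year horizon):

```python
# IDC's endpoints: $37.7B this year (2023), rising to roughly $65B by 2027
start, end, years = 37.7, 65.0, 4

# Compound annual growth rate: (end / start) ^ (1 / years) - 1
esg_cagr = (end / start) ** (1 / years) - 1
print(f"Implied CAGR: {esg_cagr:.1%}")  # ~14.6%, i.e. "nearly 15%"
```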

For its forecast, IDC looked at ESG services that include consulting, implementation, engineering and IT services.

These services include ESG strategy development and implementation, sustainable operations consulting, reporting services, circularity consulting, green IT implementation services, and managed sustainability performance services. What they all share is the common goal of driving sustainability-related outcomes.

Last year, nearly two-thirds of respondents surveyed by IDC said they plan to allocate more than half their professional-services spending to sustainability services. Looking ahead, IDC expects that share to rise to 60% by 2027.

“Pressure for [ESG] change is more prescient than ever,” says IDC research analyst Dan Versace. “Businesses that fail to act face risk to their brand image, financial performance, and even their infrastructure due to the ever-present threat of extreme weather events and resource shortages caused by climate change.”

Manufacturers finally see the cloud

For manufacturers, IT is especially complicated. Unlike banks and other purely digital businesses, manufacturers have to tie IT systems and networks to physical plants and supply chains.

That’s one reason why manufacturers have been comparatively slow to adopt cloud computing. Now that’s changing, in part because, as a new report from ABI Research points out, manufacturers that switch to cloud-based systems can cut data-storage overhead costs by up to 60%, says James Iversen, an ABI industry analyst.

Iversen predicts that industrial cloud platform revenue in manufacturing will enjoy a nearly 23% CAGR for the coming decade.

Another benefit for manufacturers: The cloud can eliminate the data fragmentation common with external data warehouses. “Cloud manufacturing providers are eliminating these concerns by interconnecting applications bi-directionally,” Iversen says, “leading to sharing and communication between applications and their data.”

How tech-savvy are your customers?

If they’re like most Americans, not very.

A Pew Research Center poll of about 5,100 U.S. adults, conducted this past spring and just made public, found that fewer than a third (32%) knew that large language models such as ChatGPT produce answers from data already published on the internet.

Similarly, only about one in five (21%) knew that U.S. websites are prohibited from collecting data on minors under the age of 13.

Fewer than half of those polled (42%) knew what a deepfake is. And only 48% could identify an example of two-factor authentication.

What tech info do they know? Well, 80% of respondents correctly identified Elon Musk as the boss of Tesla and Twitter (now X). And nearly as many (77%) knew that Facebook had changed its name to Meta.

 


Research Roundup: spending rises on global IT, public cloud and cybersec; 8 in 10 finance firms breached


Catch up on the latest market research on IT spending, public cloud and cybersecurity.


The worldwide IT market this year will top $4.7 trillion. Spending on public cloud rose nearly 23% in Q1. And although security spending rose nearly 12% earlier this year, nearly 8 in 10 financial-services firms have suffered a cyber breach.

That’s some of the latest tech market research. And here’s your Performance Intensive Computing roundup.

Worldwide IT market

How big is the worldwide IT market? Big indeed—about $4.7 trillion. That’s the forecast for this year from advisory firm Gartner.

Assuming Gartner’s right, that would mark a 4.3% increase over last year’s spending.

Some sectors are growing faster than others. Take software. Gartner expects the global software spend will rise 13.5% this year over last, for a worldwide total of $911 billion. Looking ahead to next year, Gartner expects more of the same: software spending in 2024 will rise 14%, exceeding $1 trillion.

The second-fastest growing sector is IT services. For this sector, Gartner predicts spending will rise nearly 9% this year over last, for a 2023 global total of $1.4 trillion. And next year, Gartner expects, services spending will rise by an even higher 11%, totaling $1.58 trillion worldwide.

How about spending on the new hot technology, generative AI? Surprisingly, Gartner says it has not yet made a significant impact. Instead, says Gartner analyst John-David Lovelock, “most enterprises will incorporate generative AI in a slow and controlled manner through upgrades to tools already built into their IT budgets.”

Public cloud

Public-cloud spending is on a tear. Last year, according to market watcher IDC, worldwide revenue for public-cloud services rose nearly 23% over 2021’s level, for a total of $545.8 billion.

The largest segment by revenue was SaaS applications, accounting for more than 45% of the total, or about $246 billion. It was followed by IaaS (21% market share), PaaS (17%) and SaaS system infrastructure software (16%), IDC says.
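A quick cross-check of IDC’s segment math, using the shares and the $545.8 billion total reported above:

```python
total = 545.8  # worldwide public-cloud revenue, $B, 2022

# IDC's reported segment shares of that total
shares = {"SaaS apps": 0.45, "IaaS": 0.21, "PaaS": 0.17, "SaaS infra": 0.16}

saas_revenue = shares["SaaS apps"] * total
print(f"SaaS applications revenue: ${saas_revenue:.0f}B")  # ~$246B
```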

By vendor, just 5 suppliers—Microsoft, AWS, Salesforce, Google and Oracle—collectively captured more than 40% of the 2022 global public-cloud market. The No. 1 spot was held by Microsoft, with a market share of nearly 17%.

Being on top is important. “Most organizations,” says Lara Greden, an IDC researcher, “rank their public-cloud provider as their most strategic technology partner.”

Finance cyber breaches

Cyber breaches used to be rare events. No more. A new report finds that nearly 8 in 10 financial-services organizations (78%) have experienced a cyber breach, cyber threat or data theft.

The report was compiled by Skyhigh Security, a cloud-native security vendor that worked with market researcher Vanson Bourne to poll nearly 125 IT decision-makers in 9 countries, including the U.S. and Canada. Respondents all worked for large financial-services organizations with at least 500 employees.

Why is financial services such a big target for cybercrooks? Because, as Willie Sutton reportedly quipped when asked why he robbed banks, “that’s where the money is.”

Skyhigh’s survey also found that about 6 in 10 financial-services firms store sensitive data in the public cloud, although Skyhigh didn’t correlate that with the high percentage of companies that have been cybercrime targets. But one way to secure cloud data, using a cloud access security broker, is employed by fewer than half the respondents (44%).

Also, more than 8 in 10 survey respondents believe that “shadow IT”—the practice of non-IT business units acquiring tech hardware, software and services without the IT department’s approval or knowledge—impairs their ability to keep data secure.

Cyber spending

All those attacks are certainly not due to a lack of spending. Indeed, global spending on cybersecurity products and services rose by 12.5% year-on-year in this year’s first quarter, according to market watcher Canalys.

Spending growth was fastest among midsize organizations, those with 100 to 499 employees, Canalys finds. Within this group, cybersec spending in Q1 rose 13.5% year-on-year.

Spending rose almost as fast for large organizations, those with 500 or more employees: an increase of 13.3%. For small businesses, those with 10 to 99 employees, cybersec spending in Q1 rose just 7.5%, Canalys says.

Market concentration is evident here, too. Nearly half of all cybersec spending (48.6%) went to just 12 vendors, Canalys finds. Three in particular dominated during Q1: Palo Alto Networks (8.7% market share), Fortinet (7%) and Cisco (6.1%).

By region, North America was the largest market for cybersecurity products and services in Q1, at $9.7 billion, Canalys finds. But both EMEA and Latin America saw faster sales growth: 13.4% for EMEA and 15.2% for LatAm, compared with 12.3% for North America.

 
