Home News AI & AR are Driving Data Demand – Open Source Hardware is Meeting the Challenge

AI & AR are Driving Data Demand – Open Source Hardware is Meeting the Challenge


Data is the lifeblood of the digital economy, and as new technologies emerge and evolve, the demand for faster data transfer rates, lower latencies, and higher compute power at data centers is increasing exponentially. These technologies are pushing the boundaries of information transmission and processing, and adopting open source technologies can help data center operators maximize their current operations and prepare for the long term. Here are some examples of technologies driving the demand for high-performance compute, and the ways in which open source technology, communities, and standards are helping address this demand at scale in a sustainable way.

Artificial Intelligence and Machine Learning

Artificial intelligence (AI) and machine learning (ML) technologies are revolutionizing domains such as natural language processing, computer vision, speech recognition, recommendation systems, and self-driving cars. AI and ML enable computers to learn from data and perform tasks that normally require human intelligence.

However, AI and ML also require massive amounts of data and compute power to train and run complex models and algorithms. For example, GPT-3, one of the most advanced natural language models in the world, has 175 billion parameters and was trained on 45 terabytes of text data. To process such large-scale data sets and models efficiently, AI and ML applications need high-performance computing (HPC) systems that can deliver high-speed data transfer rates, low latencies, and high compute power.
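As a rough illustration of the scale involved, the back-of-the-envelope sketch below (plain Python, using assumed values such as 2 bytes per parameter for 16-bit weights and illustrative link speeds) estimates the memory footprint of a 175-billion-parameter model and how long a 45 TB training corpus would take to move over common data center links.

```python
# Back-of-the-envelope estimate of model size and data transfer time.
# Assumptions (illustrative, not from the article): 16-bit (2-byte) weights,
# decimal units (1 TB = 1e12 bytes), and idealized link utilization.

PARAMS = 175e9          # GPT-3 parameter count
BYTES_PER_PARAM = 2     # fp16 weights (assumption)
CORPUS_BYTES = 45e12    # 45 TB of training text

model_bytes = PARAMS * BYTES_PER_PARAM
print(f"Model weights alone: ~{model_bytes / 1e9:.0f} GB")  # ~350 GB

# Time to move the training corpus at different link speeds (bits per second).
for label, bps in [("10 Gbps", 10e9), ("100 Gbps", 100e9), ("400 Gbps", 400e9)]:
    seconds = CORPUS_BYTES * 8 / bps
    print(f"45 TB over {label}: ~{seconds / 3600:.1f} hours")
```

Even under these idealized assumptions, the model's weights alone outgrow the memory of any single accelerator, and moving the training data is measured in hours, which is why interconnect speed matters as much as raw compute.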

One of the emerging trends in HPC is the use of specialized processors such as GPUs or TPUs that are optimized for the parallel processing and matrix operations common in AI and ML workloads. For example, NVIDIA's Grace CPU is a new Arm-based processor designed specifically for HPC and AI applications; working in tandem with NVIDIA's GPUs, it is designed to deliver up to 10 times the performance of current x86-based systems on the largest AI and HPC workloads. Grace also supports fast interconnects such as NVLink that enable high-speed data transfer between CPUs and GPUs.
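The core operation these processors accelerate is dense matrix math. A minimal sketch (Python with NumPy; the tensor sizes are arbitrary illustrations, not taken from any particular model) shows the kind of batched matrix multiplication that dominates neural-network workloads; frameworks such as PyTorch or JAX dispatch the same operation to a GPU or TPU instead of the CPU.

```python
import time
import numpy as np

# A transformer-style layer is dominated by matrix multiplications.
# Sizes below are arbitrary but in a plausible ballpark for one layer.
batch, seq_len, hidden = 8, 1024, 4096

activations = np.random.rand(batch, seq_len, hidden).astype(np.float32)
weights = np.random.rand(hidden, hidden).astype(np.float32)

start = time.perf_counter()
out = activations @ weights          # (8, 1024, 4096) @ (4096, 4096)
elapsed = time.perf_counter() - start

# Each output element needs `hidden` multiply-adds, i.e. 2*hidden FLOPs.
flops = 2 * batch * seq_len * hidden * hidden
print(f"{flops / 1e9:.1f} GFLOPs in {elapsed:.3f}s "
      f"(~{flops / elapsed / 1e12:.2f} TFLOP/s on this CPU)")
```

A CPU typically sustains a fraction of a TFLOP/s on this kind of operation, while modern GPUs and TPUs sustain hundreds, which is what makes them attractive for AI and ML workloads.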

Augmented Reality and Virtual Reality

The Apple Vision Pro made waves during its unveiling. Augmented reality (AR) and virtual reality (VR) are two of the most immersive and interactive technologies transforming industries such as entertainment, education, health care, and manufacturing. AR overlays digital information on top of the real world, while VR creates a fully simulated environment that users experience through a headset.

However, these technologies also pose significant challenges for data transfer and processing. Because of its recent announcement, details about the Apple Vision Pro are still pending. Other VR headsets have been available for some time, however, so we can make some assumptions. For example, VR headsets such as the Oculus Quest 2 require a high-speed connection to a PC or a cloud server to stream high-quality video and audio content, as well as tracking and input data from the headset and controllers. The video bitrate, which is the amount of data transferred per second, depends on the speed at which the GPU can encode the signal on the PC or server side and the speed at which the Quest 2 processor can decode the signal on the headset side.

According to Oculus, the recommended bitrate for VR streaming is between 150 Mbps and 500 Mbps, depending on the resolution and frame rate. This means that VR streaming requires a much higher data transfer rate than other online activities such as web browsing or streaming music. Furthermore, VR streaming also requires low latency, which is the time it takes for a signal to travel from one point to another. High latency can cause laggy or jittery gameplay, which breaks immersion and can cause motion sickness.
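To see why hardware encoding and decoding matter so much here, the sketch below (plain Python, assuming the Quest 2's published 1832×1920-per-eye resolution, an assumed 72 Hz refresh rate, and 24 bits per pixel) compares the raw, uncompressed video rate against the 150-500 Mbps streaming range Oculus recommends.

```python
# Raw video rate for a Quest 2-class headset vs. the recommended stream bitrate.
# Panel resolution is the published per-eye figure; refresh rate and bit depth
# are illustrative assumptions.

width, height = 1832, 1920      # pixels per eye
eyes = 2
fps = 72                        # Quest 2 supports 72/90/120 Hz; 72 assumed here
bits_per_pixel = 24             # 8-bit RGB, before compression

raw_bps = width * height * eyes * bits_per_pixel * fps
print(f"Uncompressed video: ~{raw_bps / 1e9:.1f} Gbps")

for stream_mbps in (150, 500):
    ratio = raw_bps / (stream_mbps * 1e6)
    print(f"At {stream_mbps} Mbps the encoder must compress ~{ratio:.0f}x in real time")
```

The uncompressed signal works out to roughly 12 Gbps, so even at the top of the recommended range the video must be compressed by more than an order of magnitude, frame by frame, in real time.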

Latency depends on several factors, such as network speed, the distance between the devices, and the encoding and decoding algorithms. According to Oculus, the ideal latency for VR streaming is below 20 milliseconds. However, achieving this level of performance is not easy, especially over wireless connections such as Wi-Fi or 5G.
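A simple latency budget makes the 20 millisecond target concrete. The sketch below (plain Python, with purely illustrative values for the render, encode, decode, and wireless stages, and assuming light travels at roughly 200,000 km/s in fiber) adds up the main contributors and checks them against the target.

```python
# Rough motion-to-photon budget for wireless VR streaming (all values in ms).
# Stage timings are illustrative assumptions, not measurements.

TARGET_MS = 20.0

def propagation_ms(distance_km: float) -> float:
    """One-way delay through fiber at ~200,000 km/s."""
    return distance_km / 200_000 * 1000

budget = {
    "capture + render": 5.0,                      # GPU renders the frame
    "encode (server)": 4.0,                       # hardware video encoder
    "network propagation": propagation_ms(100),   # ~100 km to an edge site
    "Wi-Fi / 5G air time": 3.0,                   # local wireless hop
    "decode (headset)": 4.0,                      # headset SoC decodes the frame
    "display scan-out": 2.0,
}

total = sum(budget.values())
for stage, ms in budget.items():
    print(f"{stage:22s} {ms:5.1f} ms")
print(f"{'total':22s} {total:5.1f} ms  (target {TARGET_MS} ms)")
```

Even with these optimistic figures the budget is nearly exhausted, which is why streaming from a distant cloud region rather than a nearby edge site quickly pushes past the 20 millisecond target.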

Open Source Technologies for Data Center Optimization

As new technologies drive the demand for faster data transfer rates, lower latencies, and higher compute power at data centers, data center operators face several challenges, such as increasing power consumption, demanding new cooling requirements, space utilization, operational costs, and a rapid pace of hardware innovation and refresh. To address these challenges, data center operators need to optimize their current infrastructure and adopt new standards and technologies that can enhance their efficiency and scalability.

This is the goal of the Open19 Project, a Sustainable and Scalable Infrastructure Alliance (SSIA) initiative that is now part of the Linux Foundation. The Open19 Project is an open standard for data center hardware that is based on common form factors and provides next-generation, highly efficient power distribution, reusable componentry, and opportunities for emerging high-speed interconnects. The SSIA's mission and the open standards created through the Open19 Project align with the broader industry drive toward efficiency, scalability, and sustainability for the infrastructure that powers our digital lives and communities. The Open Compute Project (OCP) is another effort to efficiently support the growing demands on compute infrastructure. It similarly fosters community-driven collaboration among industry partners to develop data center solutions, with a focus on the 21-inch server rack sizes typically used by large colocation providers and hyperscalers. OCP's scope also extends to the data center facility as well as the internal IT components of the servers.

Conclusion

New technologies are driving the demand for faster data transfer rates, lower latencies, and higher compute power at data centers, while communities, governments, and companies focus on resource management and the growing sustainability concerns around water usage, power management, and other carbon-intensive aspects of technology creation, deployment, and use. Adopting open source technologies developed in community-driven forums like the SSIA and the Linux Foundation can help data center operators maximize their current operations and prepare for a more sustainable future as they meet the demands of these exciting new applications.
