#CyberInfrastructure
Text
🌐 What is Network Engineering?
Network Engineering is the backbone of our connected world — from data centers to cloud computing, from Wi-Fi to global internet infrastructure. These engineers design, build, and manage networks that keep the world running.
Whether it’s securing data, optimizing speed, or ensuring 24/7 connectivity, Network Engineers are the unsung heroes of the digital age.
📡 Discover how they shape communication and innovation — read the full article now!
#NetworkEngineering #CyberInfrastructure #ITCareers #DataNetwork #InternetBackbone #EngineeringTheWeb #CloudNetworking #STEMCareers #DigitalWorld #ConnectivityExperts #EngineersHeaven #FutureOfIT
Text
Purdue forestry professor cultivates cyberinfrastructure for collaborative forestry research
Read the full story from Purdue University. While most scientific research fields maintain open-access data policies, access to forestry data remains limited. “The utmost hurdle for the global community to conduct forestry and forest ecology studies, at a global scale especially, is lack of data. This has been a prominent problem for decades,” said Jingjing Liang, associate professor of…
Text
Scientists Using Supercomputers to Scale Up Their Computation Level - Technology Org
What will the future bring for U.S. scientists using supercomputers to scale up their computations to the highest level? And what technologies should cyberinfrastructure providers deploy to match their ambitions?
Paul Woodward, University of Minnesota, describes star convection simulations using TACC’s Frontera supercomputer. Image Credit: TACC.
These questions and more were explored at the 3rd annual Frontera User Meeting, held August 3-4, 2023, at the Texas Advanced Computing Center (TACC).
“It’s a great opportunity to hear about how Frontera is performing and for users to hear from each other about how they’re maximizing the system,” said Dan Stanzione, executive director of TACC and the principal investigator of the National Science Foundation (NSF)-funded Frontera supercomputer.
Frontera is the most powerful supercomputer ever deployed by the NSF, and it’s the fastest U.S. academic system according to the latest (June 2023) Top500 rankings. Frontera serves as the leading capability system in the national cyberinfrastructure intended for large applications that require thousands of compute nodes.
Scientists tour the data center with TACC’s Frontera supercomputer. Image Credit: TACC.
Over the past 12 months, Frontera has provided rock-steady service with 99 percent uptime and an average continuous utilization of 95 percent of its cycles. It delivered more than 72 million node hours and completed over one million jobs, for a cumulative total of more than 5.8 million jobs over its four years of life.
As of September 2023, Frontera has progressed through more than 80 percent of its projected lifespan with new technology coming that will extend its operation through late 2025.
Approximately 30 scientists participated in the 2023 Frontera User Meeting. The event featured 13 invited speakers who shared their recent experiences and findings while utilizing Frontera.
The presentations included projects taking advantage of the many ways that users get allocations on the system. Some focused on smaller “startup” activities for groups beginning the transition to very large-scale computing. Others, such as the Large-Scale Community Partnership allocation, are long-term collaborations with major experimental facilities and require over a million node-hours of computing resources.
Other presentations focused on more extensive initiatives, such as the Leadership Resource Allocations, which received up to five million node-hours of computational support. Additionally, certain awardees, known as Texascale Days recipients, were granted access to Frontera’s full capacity, including its impressive 8,000+ nodes.
The presentations encompassed many domains of science ranging from cosmology to hurricanes, earthquakes to the memory center of the human brain, and more. All credited the unprecedented access to computing resources at the scale provided by Frontera as a cornerstone in allowing new understanding and discoveries in cutting-edge research.
Hurricane Storm Surge
Eirik Valseth, a research associate in the Computational Hydraulics Group, Oden Institute of UT Austin, described new work on Frontera to develop compound storm surge models that add river flooding effects with extreme resolution for the Texas coast.
His group is also using Frontera to generate five-day hindcasts and seven-day forecasts of global ocean storm surge, in collaboration with the University of Notre Dame and the U.S. National Oceanic and Atmospheric Administration, in an effort to enable better planning for hurricanes.
Simulation snapshot generated by Frontera of new model that combines storm surge and river flooding data along the Texas coast. Image Credit: Eirik Valseth, Oden Institute.
Big One Along the San Andreas Fault
Yifeng Cui, the director of the High Performance GeoComputing Laboratory at the San Diego Supercomputer Center (SDSC), described nonlinear earthquake simulations performed by his team on Frontera during TACC’s Texascale Days.
The simulations scaled up to 7,680 nodes and ran for 22.5 hours to simulate 83 seconds of shaking during a magnitude 7.8 quake on the southern San Andreas fault. More accurate simulations allow communities to plan better to withstand these large earthquakes, thus saving lives and property.
Snapshot of ShakeOut Scenario simulations on Frontera of M7.8 Earthquake on Southern San Andreas Fault. Image Credit: Yifeng Cui, San Diego Supercomputer Center.
Child Brain Development
Jessica Church-Lang, an associate professor in the Department of Psychology at UT Austin, is using Frontera to analyze anonymized fMRI data of brain activity in children to find connections among the brain's various systems, including control, visual, motor, and auditory systems.
Frontera has helped her to construct 3D brain models from the fMRI images. “It takes about five hours, per child, on Frontera to run the analysis. It used to take three days on older computers. And this is just one step of our processing pipeline.”
Brain Bubbles
Frontera is helping scientists probe the mysteries of how the brain forms thoughts in research led by Jose Rizo-Rey, a professor of biophysics at UT Southwestern Medical Center. His research, using all-atom molecular dynamics simulations on Frontera, investigates tiny bubbles called “vesicles” that shuttle neurotransmitters across the gap between neurons, carrying the signal the brain uses to communicate with itself and other parts of the body.
“The process of fusion can happen in just a few microseconds,” Rizo-Rey said. “That’s why we hope that we can simulate this with Frontera.”
Simulation of vesicle-flat bilayer interface of membrane fusion. Image Credit: Jose Rizo-Rey, University of Texas Southwestern Medical Center.
Memories, Models, and Optimizations
Research Engineer Ivan Raikov, Department of Neurosurgery at Stanford University, presented his progress on developing a large-scale model of the rodent hippocampus, a region of the brain associated with short-term memory and spatial navigation.
The project is creating a first-of-its-kind biophysically detailed, full-scale model of the hippocampal formation, with as close as possible to a one-to-one representation of every neuron. “We start with a full-scale hippocampal model with one million neurons,” Raikov said. “It takes about six hours to simulate 10 seconds of hippocampal activity on 1,024 nodes of Frontera.”
Turbulent Times
P.K. Yeung, professor of aerospace engineering at Georgia Tech, presented his work using Frontera to study turbulent dispersion, an example of which is the spread of a candle’s smoke or how far disease agents travel through the atmosphere.
Yeung’s simulations on Frontera track the motion of systems of more than a billion particles, calculating the trajectory and acceleration of each fluid element passing through a turbulent, high-rotation zone in what is known as Lagrangian intermittency in turbulence.
Sample trajectory and acceleration of fluid element passing through high-rotation zone illustrating Lagrangian intermittency in fluid turbulence. Image Credit: PK Yeung, Georgia Tech.
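To make the Lagrangian idea concrete, here is a minimal sketch that advects tracer particles through an analytic 2D vortex field and estimates their accelerations by finite differences. It is an illustration of the concept only, assuming a toy velocity field; it is not the billion-particle 3D turbulence code Yeung's group runs on Frontera.

```python
# Toy sketch of Lagrangian particle tracking: advect tracer particles
# through an analytic 2D vortex field and record their accelerations.
# Illustration only -- not the production DNS code used on Frontera.
import numpy as np

def velocity(pos, t):
    """Analytic swirling velocity field (stand-in for real turbulence)."""
    x, y = pos[:, 0], pos[:, 1]
    u = -np.sin(np.pi * x) * np.cos(np.pi * y)
    v = np.cos(np.pi * x) * np.sin(np.pi * y)
    return np.stack([u, v], axis=1)

def track(n_particles=1000, dt=1e-3, n_steps=500, seed=0):
    rng = np.random.default_rng(seed)
    pos = rng.uniform(-1.0, 1.0, size=(n_particles, 2))
    vel_prev = velocity(pos, 0.0)
    accel_history = []
    for step in range(n_steps):
        t = step * dt
        # Midpoint (RK2) integration of dx/dt = u(x, t)
        k1 = velocity(pos, t)
        k2 = velocity(pos + 0.5 * dt * k1, t + 0.5 * dt)
        pos = pos + dt * k2
        vel = velocity(pos, t + dt)
        # Finite-difference estimate of the Lagrangian acceleration
        accel_history.append((vel - vel_prev) / dt)
        vel_prev = vel
    return pos, np.array(accel_history)

pos, accel = track()
# Heavy tails in the acceleration distribution are the signature of
# Lagrangian intermittency that the real simulations quantify.
print("max |a| observed:", np.abs(accel).max())
```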
Star Turnover
Paul Woodward, the director of the Laboratory for Computational Science & Engineering and a professor in the School of Physics and Astronomy at the University of Minnesota, performed 3D hydrodynamic simulations of rotating, massive main-sequence stars on runs of up to 3,510 Frontera compute nodes to study convection in stellar interiors.
“Frontera is powerful enough to permit us to run our non-rotating simulation forward in time for about three years, which is an amazing thing to have done,” Woodward said.
Black Hole Cosmology
Simeon Bird, an assistant professor in the Department of Physics & Astronomy at UC Riverside, presented a new suite of cosmological simulations called PRIYA (Sanskrit for ‘beloved’). The PRIYA simulations performed on Frontera are among the largest cosmological simulations to date, requiring over 100,000 core-hours to simulate a system of 3072^3 (about 29 billion) particles in a ‘box’ 120 megaparsecs on a side, or about 391 million light-years across.
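As a quick check on those figures (assuming 1 Mpc is about 3.26 million light-years):

```latex
% Particle count and box size quoted for the PRIYA suite
\[
N_{\text{particles}} = 3072^{3} \approx 2.9 \times 10^{10} \quad \text{(about 29 billion)}
\]
\[
L_{\text{box}} = 120~\text{Mpc} \times 3.26 \times 10^{6}~\tfrac{\text{ly}}{\text{Mpc}} \approx 3.9 \times 10^{8}~\text{ly} \quad \text{(about 391 million light-years)}
\]
```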
“We run multiple models, interpolate them together and compare them to observational data of the real universe such as from the Sloan Digital Sky Survey and the Dark Energy Spectroscopic Instrument,” Bird said.
The PRIYA cosmological suite developed on Frontera incorporates multiple models with different parameters to form some of the largest cosmological simulations to date. Image Credit: Simeon Bird, UC Riverside.
Space Plasma
Half of the universe's ordinary matter, the protons and neutrons, resides in space as plasma. The solar wind from stars such as our sun shapes clouds of space plasma, and on a much larger scale, cosmic magnetic fields knead space plasma across galaxies.
“Some of our recently published work has made use of Frontera to study the turbulent dynamos in conducting plasma, which amplify cosmic magnetic fields and could help answer the question of the origin of magnetic fields in the universe,” said graduate student Michael Zhang, Princeton Program in Plasma Physics, Princeton University.
Tight Junctions
Tight junctions are multiprotein complexes that control the permeability of ions and small molecules between cells and support the transport of nutrients and water. Sarah McGuinness, a PhD candidate in biomedical engineering at the University of Illinois Chicago, presented progress using molecular dynamics simulations on Frontera to study Claudin-15, a protein that polymerizes into strands to form the backbone of tight junctions.
“Computational simulations allow investigators to observe protein dynamics at atomic resolution with resources like Frontera,” McGuinness said.
Protein Sequencing
Behzad Mehrafrooz, a PhD student at the Center for Biophysics and Quantitative Biology, University of Illinois at Urbana-Champaign, outlined his group’s latest work extending the reach of nanopores to sequence entire proteins, which are much larger and more complex than DNA.
“Thanks to Frontera, it was one of the longest, if not the longest, molecular dynamics simulations for nanopore sequencing yet made,” Mehrafrooz said. “And it confirmed the rapid, unidirectional translocation induced by guanidinium chloride and helped unravel the molecular mechanism behind it.”
Viral Packaging
Kush Coshic, a PhD student in the Aksimentiev Lab at the University of Illinois at Urbana-Champaign, described simulations that took more than four months to perform using Frontera’s GPU nodes to simulate the genomic packaging of a model herpes-like virus, applicable to developing new therapeutics. “Frontera enables us to perform unprecedented high throughput analysis of a 27 million atom system,” Coshic said.
Spectral Function
“We’ve developed a new algorithm for calculating spectral functions with continuous momentum resolution that complements existing many-body techniques,” said Edwin Huang, an assistant professor in the Department of Physics & Astronomy at the University of Notre Dame.
His team’s determinantal quantum Monte Carlo solver for computing the spectral function of fermionic models with local interactions required sampling over a billion state configurations on Frontera.
Path to Horizon
Planning is underway for a massive new system as part of the NSF-funded Leadership-Class Computing Facility (LCCF), projected to deliver 10 times the capabilities of Frontera. The new system, called Horizon, is expected to begin early user operations in the second half of 2025 and enter full production in 2026.
“There are still opportunities to talk about what goes into Horizon,” Stanzione said. “One of the points of this meeting is to continue requirement gathering.”
To unlock Horizon's potential, the future system will need to provide robust support for both CPU- and GPU-based codes. Among the software performance directions being explored are mixed-precision matrix operations on GPUs, which can offer a 30X performance advantage over single-precision vector units.
“Software enables science and it will drive our decisions about future systems. The most important thing for TACC is that we get to hear from users about what is working with Frontera, what can be improved, and what needs to change to meet their needs in future systems,” Stanzione concluded.
Source: TACC
Text
I've been taking a lot of classes on digital archaeology and cyberinfrastructures for archaeology, and while we do talk a little bit about challenges of sustainability (keeping those infrastructures running and the data accessible into the future), this has been nagging me for a while, especially given just how many case studies have come up of AWESOME resources and databases that are just completely defunct or unusable now because no one planned ahead for what happens when file formats change or the technologies they're based off cease to be supported. Digital storage is AWESOME and presents a lot of great opportunities, but we need to stop pretending it's invincible and doesn't face preservation and archival challenges just like analog media.
its so fucked up that optical discs straight up rot though right? something about digital media just feels like it shouldnt be susceptible like that to the forces that govern the physical world and yet discs rot as if theyre an organic thing
#digital preservation #digital archaeology #curation #cyberinfrastructure #archaeology #cybernetics #But like in the literal technical sense not the cool cyberpunk transhumanism sense
Text
6 impressive ways coding helps the environment
Clean water, clean air, and healthy ecosystems are the foundation of human society. Issues like air pollution, water contamination, and endangered wildlife undermine this foundation and jeopardize our comfortable lifestyle.
Since 1974, the world has celebrated World Environment Day every year on 5 June. The day reminds us of these environmental issues and allows us to improve our habits and behaviour and focus our efforts on making a positive change.
There are, however, lots of people who are striving to reverse these processes, and computer science is one of the solutions. In this list, you’ll see how coding can help preserve nature and biodiversity – this year’s theme for World Environment Day.
#1 The Internet
Let’s start with a very basic one. Technology allows us to work remotely, which can reduce air pollution in cities. It also saves office space. Video-conference programs like Zoom, Teams, and Hangouts reduce the need to commute for meetings, resulting in lower fuel consumption.
Using email and messenger services means printing less and reducing the production of paper.
Naturally, there are some drawbacks, like technological waste. That’s why the EU is creating laws to improve recycling processes, energy consumption, and the service life of electrical devices.
#2 Ocean pollution
Plastic pollution in the oceans has been a critical topic for years. The problem has become so severe that companies like LADBible have launched campaigns to have ‘Trash Isles’ recognised by the UN as a separate country.
Source: treehugger.com
Initiatives like Plastic Adrift have been able to create statistical models to track the paths of these isles of plastic trash and eventually identify their source.
“Since the late 1970s, ocean scientists have tracked drifting buoys, but it wasn't until 1982 the World Climate Research Programme put forward the idea of a standardised global array of drifting buoys. These buoys float with the currents just like plastics except - like Twitter from the sea - they send a short message to scientists every few hours about where they are and the conditions in that location.”
The data from this research is available to everyone on the website of Plastic Adrift.
Source: plasticadrift.org
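A minimal sketch of the kind of statistical drift model such initiatives rely on follows; it is a toy advection-plus-random-walk model with invented release point and parameters, not the actual Plastic Adrift code.

```python
# Toy statistical drift model: particles are carried by a mean current
# and jittered by a random walk standing in for unresolved eddies.
# All numbers here are invented for illustration.
import numpy as np

def drift(n_particles=500, n_days=365, dt_days=1.0, seed=42):
    rng = np.random.default_rng(seed)
    # Release all particles at one hypothetical point (lon, lat).
    pos = np.tile([(-140.0, 35.0)], (n_particles, 1)).astype(float)
    mean_current = np.array([-0.05, 0.01])   # assumed mean flow, degrees/day
    eddy_sigma = 0.1                          # assumed eddy "diffusion" scale
    for _ in range(int(n_days / dt_days)):
        pos += mean_current * dt_days
        pos += rng.normal(0.0, eddy_sigma * np.sqrt(dt_days), pos.shape)
    return pos

final = drift()
print("mean final position:", final.mean(axis=0))
print("spread (std):", final.std(axis=0))
```

Running the model forward (or backward) in time over many particles is what lets researchers estimate where floating debris accumulates and where it came from.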
Parents and educators can teach children about the impact of plastic pollution and instil a responsible lifestyle in them through coding. Vidcode offers a project, called ‘End Plastic Pollution’ which teaches kids about coding and raises awareness of the environmental issue. The project can be found here.
#3 Freshwater supply
Freshwater is an invaluable but scarce resource, especially in some parts of the world. UN Environment, Google and the European Commission have launched a data platform to track the world’s water bodies. The app enables all countries to monitor their freshwater supply.
The data and interactive map are available here. Explore!
#4 Forest health
There’s nothing quite like walking on a forest path and taking in the fresh air. However, forests are threatened by many factors like climate change, drought or changes in temperature. That’s why scientists use geographic information systems to collect and analyse relevant data to help preserve forests.
“Some people fear working with cyberinfrastructure because of the presumed complexities of learning to code,” says Tyson Swetnam, a science informatician who recently led a research project on forest biomass data analysis. As such, learning to code from an early age is a valuable skill for making a positive change in the environment. Read more about the project on sciencenode.org.
#5 Wildlife corridors
The human population grows every day, and so do the areas we occupy, leaving less space for wild animals. One way to reduce the impact of urbanisation is to create wildlife corridors – areas of protected land where animals can move safely. Scientists compile massive amounts of data to create models of the areas inhabited by wildlife. Applying a computational approach, they can predict where those areas are and determine how to design the corridors.
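One common way such a computational approach is implemented is least-cost-path analysis over a habitat "resistance" grid, where low values mark land that is easy for animals to traverse. The sketch below is a generic illustration under our own assumptions, not the specific models referenced here.

```python
# Hedged sketch of least-cost corridor analysis: Dijkstra's algorithm
# over a 2D resistance grid (low resistance = easy for wildlife to cross).
import heapq
import numpy as np

def least_cost_path(resistance, start, goal):
    """Dijkstra over a 2D grid; returns the total cost to reach goal."""
    rows, cols = resistance.shape
    dist = np.full((rows, cols), np.inf)
    dist[start] = resistance[start]
    pq = [(resistance[start], start)]
    while pq:
        d, (r, c) = heapq.heappop(pq)
        if (r, c) == goal:
            return d
        if d > dist[r, c]:
            continue
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols:
                nd = d + resistance[nr, nc]
                if nd < dist[nr, nc]:
                    dist[nr, nc] = nd
                    heapq.heappush(pq, (nd, (nr, nc)))
    return np.inf

# Toy landscape: mostly developed land (high resistance) crossed by a
# strip of intact habitat (low resistance) -- the candidate corridor.
grid = np.full((50, 50), 10.0)
grid[20:25, :] = 1.0
print("corridor cost:", least_cost_path(grid, (22, 0), (22, 49)))
```

Planners compare the cost of candidate routes like this to decide where a protected corridor would do the most good.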
Platforms like Scratch offer great resources for children to take their first steps into coding while learning about wildlife.
#6 Protecting habitats
The Jane Goodall Institute combats the same issue using coding and computer science to protect primates and their habitats. When large forest areas are cleared to develop human infrastructure, the forest patches which are left are often not enough to support larger populations. That’s where remote sensing technologies come into play. They enable the use of information, collected by satellites, to monitor chimpanzee habitats. This way, the institute can use the information to protect great ape habitats in numerous countries.
Computer science enables researchers and scientists to use large-scale data and investigate and analyse issues like climate change and water contamination. And through this research, people can positively impact the environment.
If you know of other examples of how to use coding for good or you yourself have ideas on how to use computer science to protect nature, share them with us. Connect with us on Twitter and Instagram.
Text
Supersized fruit eater database on climate change frontline
In this month’s open-access journal Global Ecology and Biogeography, the group introduces for the first time a hulking list of more than 45,000 traits for creatures that eat fruit. Frugivoria, named for the species called frugivores who survive mainly on fruit, supersizes existing databases by providing researchers and conservationists with one-stop listings of both mammals and birds in the forests of Central and South America. Frugivoria’s data and workflow are open and accessible to all, to help facilitate its use in addressing the biodiversity crisis.

In a time of rapid climate change, it’s crucial to understand in specific ways how the fruit eaters are doing. “With climate change, seed dispersion is really important,” said Beth Gerstner, a PhD candidate in the MSU Department of Fisheries and Wildlife who led the development. “Fruit eaters maintain forest composition and health by pooping — which spreads seeds. Frugivoria is an important contribution because researchers can use this to understand the diversity of their roles in the ecosystem.”

Knowing what is doing the fruit eating and pooping, as well as their distribution and life traits — their life expectancies, breeding habits, habitat preferences — is critical to tracking the changes that climate change may bring. Yet current databases were fragmented or incomplete. Starting in 2018 at MSU, 12 undergraduate students were tasked with sleuthing through mounds of scientific publications to flesh out existing records of fruit eaters, adding birds for a more holistic understanding of the forests.

Most exciting, Gerstner said, was entering 44 new species, like the olinguito. That’s a member of the same family as raccoons that lives in the cloud forests of the northern Andes, and one that Gerstner studies. The olinguito had been mistaken for the larger olingo, but upon its discovery in 2013 was found to be genetically distinct.

“Natural history is entering the age of big data,” said Phoebe Zarnetske, associate professor in integrative biology and director of the Institute for Biodiversity, Ecology, Evolution, and Macrosystems (IBEEM). “Through Frugivoria, we are contributing to increasing the accessibility of natural history information traditionally found in museums and collections. This project provided a unique opportunity to engage numerous undergraduates in research with data science and functional ecology.”

Zarnetske said Frugivoria can help with both basic and applied questions about species’ functions in their environment. It can be used by community scientists to learn more about species’ natural history, and it can aid in species conservation assessments. “As a result,” she said, “Frugivoria is part of something bigger — we can leverage the power of its big data to help solve the biodiversity crises.”

Getting Frugivoria out where it’s needed is Gerstner’s goal.
“My hope,” she said, “is for the database to be used by the International Union for the Conservation of Nature and people doing on-the-ground conservation.” Both Gerstner and Zarnetske are members of MSU’s Ecology, Evolution, and Behavior Program and the Spatial and Community Ecology (SpaCE) Lab. The work behind “Frugivoria: A trait database for birds and mammals exhibiting frugivory across contiguous Neotropical moist forests” was supported by a NASA Future Investigators in NASA Earth and Space Science and Technology award, a National Science Foundation Campus Cyberinfrastructure program award, and computational resources and services provided by the Institute for Cyber-Enabled Research, of which co-author Patrick Bills is a member. In addition to the open-access paper in Global Ecology and Biogeography, the database itself is published open access with the Environmental Data Initiative.
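As a purely hypothetical illustration, a researcher who downloaded such a trait table as a CSV might query it along these lines. The file name and column names below are invented; consult the published database for the real schema.

```python
# Hypothetical sketch of querying a Frugivoria-style trait table.
# Column names are invented for illustration, not the actual schema.
import pandas as pd

df = pd.read_csv("frugivoria_traits.csv")   # assumed local download

# Example: mammalian frugivores of cloud-forest habitat with known diet data
subset = df[
    (df["taxon_class"] == "Mammalia")
    & (df["habitat"].str.contains("cloud forest", case=False, na=False))
    & (df["diet_fruit_pct"].notna())
]
print(subset[["species", "diet_fruit_pct", "body_mass_g"]]
      .sort_values("diet_fruit_pct", ascending=False)
      .head(10))
```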
Text
Task force proposes new federal AI research outfit with $2.6B in funding
The final report from the government’s National AI Research Resource task force recommends a new, multi-billion-dollar research organization to improve the capabilities and accessibility of the field to U.S. scientists. The document presents “a roadmap and implementation plan for a national cyberinfrastructure aimed at overcoming the access divide, reaping the benefits of greater brainpower and more diverse […]”
By Devin Coldewey, originally published on TechCrunch.
Text
Smartsynchronize command line args
This study introduces the SmartSync monitoring system, which utilizes the building's existing network as "virtual" instrumentation cables. Given the reliability of modern networks and the comparatively low sampling rates required for structural monitoring, the issues of packet loss and synchronization often experienced in wireless systems are eliminated, as is the need for lengthy instrumentation cables that add to the cost and noise in wired systems. Instead, this system offers a self-contained, "plug-and-play" module for scalable and rapidly deployable monitoring. Within this framework, data streams from distributed sensors are pushed through network interfaces in real time and are seamlessly synchronized and integrated by a centralized server, which performs the functions of basic data acquisition, event triggering, and data management and processing, while at the same time providing a powerful interface for data visualization (a toy sketch of this synchronization step follows below). This study introduces the system and its installation during the construction of Burj Dubai (Burj Khalifa).

Despite many advances in the area of wind effects on structures in recent decades, research has traditionally been conducted with limited resources scattered physically. With the trend toward increasingly complex designs of civil structures and the escalating potential for losses from extreme wind events, a new culture of research needs to be established, based on innovative solutions for better management of the impact of extreme wind events, which also requires addressing the interdisciplinary nature of wind effects on structures. To address these needs, this paper proposes a new paradigm of a cyber-based framework for the analysis/design, modeling, and simulation of wind load effects based on a cyberinfrastructure, the Virtual Organization for Reducing the Toll of EXtreme Winds (VORTEX-Winds). This framework would enable a paradigm shift by offering advanced cyber-enabled modules (e-modules) to accelerate advances in research and education and to achieve improved understanding and better modeling of wind effects on structures. Accordingly, it will enhance the wind community's analysis and design capabilities to address next-generation challenges posed by wind.
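The toy sketch promised above illustrates the server-side synchronization idea from the first abstract: timestamped streams from distributed sensors are resampled onto a common clock. Names, rates, and data are invented; this is a sketch of the concept, not the published SmartSync system.

```python
# Toy sketch of SmartSync-style stream synchronization: samples arrive
# from distributed sensors with different clocks and rates, and the
# server resamples them onto one common time base. Data are invented.
import numpy as np

def synchronize(streams, rate_hz=50.0):
    """streams: dict of name -> (timestamps_s, values). Returns aligned data."""
    t0 = max(ts[0] for ts, _ in streams.values())
    t1 = min(ts[-1] for ts, _ in streams.values())
    common_t = np.arange(t0, t1, 1.0 / rate_hz)
    aligned = {
        name: np.interp(common_t, ts, vals)
        for name, (ts, vals) in streams.items()
    }
    return common_t, aligned

# Two hypothetical accelerometers with slightly different clocks and rates
rng = np.random.default_rng(1)
streams = {
    "accel_floor_10": (np.arange(0, 10, 0.021) + 0.003, rng.normal(size=477)),
    "accel_floor_80": (np.arange(0, 10, 0.019), rng.normal(size=527)),
}
t, aligned = synchronize(streams)
print(len(t), "synchronized samples per channel")
```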
Text
Application of the Virtual Information Fabric Infrastructure (VIFI) to Building Performance Simulations
Abstract
Although contemporary information and computer technologies enable researchers, designers, and engineers to collect large amounts of data, current approaches for sharing data remain limited by concerns such as interoperability, shareability, data size, transport costs, and privacy. These concerns often prevent or confound the development of reliable and accurate building performance simulations. In an effort to overcome these concerns, we envisioned a complementary approach, called the Virtual Information Fabric Infrastructure (VIFI) approach, in which data owners share distributed, fragmented data in a manner that does not require the movement of raw data, and data users analyze these data by transporting analytic computation to the data. The VIFI approach presents new opportunities to conduct building performance simulations. Through two case studies, we demonstrated a new computational framework based on VIFI to support an open and collaborative cyberinfrastructure for building performance simulations. Such a computational framework represents a system view toward building performance simulations, in which essential components of simulations are coherently integrated.
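A minimal sketch of the core idea, under assumptions of ours (the class and function names are invented): raw data never leaves its owner's site; an analysis function is shipped to each site, and only small aggregate results travel back.

```python
# Conceptual sketch of the VIFI idea: move the computation to the data.
# The DataSite objects below are local stand-ins for what would be
# remote execution services at each data owner's site.
from statistics import mean

class DataSite:
    """Simulates a data owner's site that runs vetted analysis locally."""
    def __init__(self, name, private_records):
        self.name = name
        self._records = private_records   # raw data: never exported

    def run(self, analysis_fn):
        # Only the function's (small) result leaves the site.
        return analysis_fn(self._records)

def monthly_mean_energy(records):
    return mean(r["kwh"] for r in records)

sites = [
    DataSite("building_A", [{"kwh": 310.0}, {"kwh": 295.5}, {"kwh": 330.2}]),
    DataSite("building_B", [{"kwh": 512.1}, {"kwh": 488.7}]),
]
# The data user federates aggregates without ever seeing raw records.
results = {s.name: s.run(monthly_mean_energy) for s in sites}
print(results)
```

The design choice this illustrates is the one the abstract emphasizes: because only aggregates cross site boundaries, concerns about data size, transport cost, and privacy are sidestepped rather than solved by moving raw data.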
Read More about this article: https://irispublishers.com/ctcse/fulltext/application-of-the-virtual-information-fabric-infrastructure-vifi-to-building-performance.ID.000585.php
Read More about Iris Publishers Google scholar Articles: https://scholar.google.com/citations?view_op=view_citation&hl=en&user=LoZ6uCQAAAAJ&citation_for_view=LoZ6uCQAAAAJ:fPk4N6BV_jEC
Text
Vista: A New AI-Focused Supercomputer for the Open Science Community - Technology Org
Vista, a new artificial intelligence (AI)-centric system, is arriving at the Texas Advanced Computing Center at The University of Texas at Austin in early 2024.
Vista will set the stage for TACC’s Horizon system, the forthcoming Leadership-Class Computing Facility (LCCF) funded by the National Science Foundation (NSF), planned for fiscal year 2025. Horizon is expected to provide 10 times the computing capability of Frontera, the top U.S. academic supercomputer and the largest supercomputer in the NSF research cyberinfrastructure.
“Vista will bridge the gap between Frontera and Horizon to ensure the broad science and engineering research and education community has access to the most advanced computing and AI technologies,” said Katie Antypas, director in the NSF Office of Advanced Cyberinfrastructure. “Vista will also be a critical new resource to support responsible and trustworthy AI research for the benefit of our national welfare.”
Vista will mark a departure from the x86-based architecture TACC uses in Frontera, the Stampede systems, and others, moving to central processing units (CPUs) based on the Advanced RISC Machines (Arm) architecture. The new Arm-based NVIDIA Grace CPU Superchip is specifically designed for the rapidly expanding needs of AI and scientific computing.
“We’re excited about Vista,” said TACC Executive Director Dan Stanzione. “It’s our first ever system with an Arm-based primary processor. It will add to our capacity, particularly for AI, and help our user base begin porting to future generations of these technologies. With Vista, alongside our new Stampede3 (Intel) system, and the Lonestar6 (AMD) system we added last year, our team and our users will gain experience with and insight into the three major architectural paths we might follow for future systems, including Horizon.”
The NVIDIA GH200 Grace Hopper Superchip will be the processor for a little more than half of Vista’s compute nodes. It combines the Grace CPU with an NVIDIA Hopper architecture-based GPU so that the GPU can seamlessly access CPU memory to enable bigger AI models. The NVIDIA Grace CPU Superchip, which contains two Grace processors in a single module, will fill out the remainder of Vista’s nodes for unaccelerated codes.
Memory is implemented in a new way with the superchips. Instead of traditional DDR DRAM, the Grace uses LPDDR5 technology—like the memory used in laptops but optimized for the needs of the data center. In addition to delivering higher bandwidth, this memory is more power-efficient than traditional DIMMs, offering savings as great as 200 watts per node.
In addition, the NVIDIA Quantum-2 InfiniBand networking platform will help advance Vista’s performance with its advanced acceleration engines and in-network computing, propelling it up to 400 Gb/s.
“AI has the potential to allow scientific computing to solve some of the most challenging problems facing humanity,” said NVIDIA Director of Accelerated Computing Dion Harris. “NVIDIA’s accelerated computing platform equips leading academic supercomputers, such as TACC’s Vista, with the extreme performance required to unlock this transformative potential.”
On the storage side, TACC has partnered with VAST Data to supply Vista’s file system with all-flash, high-performance storage linked to its Stampede3 supercomputer. The compute nodes will be manufactured by Gigabyte, and Dell Technologies will provide the integration.
Vista allocations will be available primarily through the NSF-funded Frontera project, and the system will also offer time through the Advanced Cyberinfrastructure Coordination Ecosystem: Services & Support (ACCESS) project to its broad user community.
Stampede3 to Enter Full Production in Early 2024
In addition to Vista, TACC announced the Stampede3 system in July 2023, a powerful new Dell Technologies and Intel-based supercomputer that will be the high-capability and high-capacity HPC system available to open science research projects in the U.S. when it enters full production in early 2024. Learn more about the system specifications.
Lonestar6: TACC’s Primary System for Texas Researchers
TACC’s Lonestar6 supercomputer went into full production in January 2022 with a boost of new servers and GPUs from Dell Technologies, AMD, and NVIDIA, in addition to the three petaflops of pre-existing performance from the system’s AMD CPUs. Lonestar6 allows Texas researchers to compute and compete at the forefront of science and engineering. It is designed to meet the growing demand for AI and other GPU-accelerated solutions and to take advantage of the power efficiency of heterogeneous computing. Learn more about the system specifications.
Source: TACC
Text
The Top Ten New Technologies for Smart Healthcare Use Cases in 2022 Including Remote Patient Monitoring and 5G-Enabled Devices
According to Andrew Semple Bradenton, while most of us are familiar with remote patient monitoring, this new technology has the potential to be a game changer. Remote patient monitoring technologies let medical professionals monitor patients in the comfort of the patients' own homes. Thanks to advances in wireless technology, remote patient monitoring enables healthcare facilities to deliver more effective treatment. Slow networks, on the other hand, can impair medical practitioners' capacity to monitor patients or collect important health care data. 5G-enabled devices can significantly reduce the likelihood of experiencing a slow network, and this technology can help health care providers deliver more seamless therapy to their patients.
While this technology may not be ready for widespread usage in the near future, it is already being tested. Remote patient monitoring and autonomous vehicle control are two of the first applications that potentially benefit from this technology. Meanwhile, other technology, such as 5G-enabled cellphones, may enable remote surgery, allowing physicians to control medical robots remotely. As we will witness in the coming years, 5G will fundamentally alter our way of life.
The health care sector creates enormous volumes of data daily – a single patient can generate hundreds of gigabytes. Medical instruments such as MRIs, CAT scans, and PET scans generate massive amounts of visual data, and the 5G network speeds up the transfer of such huge data sets compared with legacy wired networks. As a result, this new technology will enable hospitals to deliver remote health treatments to patients globally.
With these new advancements, healthcare providers can deliver high-quality care without physically visiting their patients, allowing physicians to monitor patients remotely. These new technologies have a number of advantages, including faster reaction times, less travel time, and improved patient outcomes. Additionally, this technology is cheaper than alternative methods, and it can even assist in the delivery of entire surgical operations.
Andrew Semple Bradenton pointed out that telehealth and remote patient monitoring pave the way for a bright future for healthcare facilities. Rapid technological advancements - augmented and virtual reality, artificial intelligence, and smart contracts - will have a substantial impact on the medical profession. By gathering this data, healthcare practitioners can gain a deeper understanding of their patients and provide them with better care. However, this technology is not without its risks and opportunities.
Telesurgery will become ultra-reliable with 5G connectivity. 5G offers carrier-grade dependability of seven nines (99.99999 percent availability, which works out to less than ten milliseconds of downtime per day). This reliability will aid in the performance of delicate operations such as spinal surgery. Due to its low latency, surgeons may be able to perform robotic-assisted surgery remotely, eliminating the need for multiple surgical teams.
Additionally, the future of connected gadgets is bright. The Internet of Things is a burgeoning phenomenon, and 5G is on the verge of bringing it to life. The use of 5G will enable remote control of entire factories, hence increasing efficiency. Connected gadgets will have a greater influence than ever before with a quicker network. Although these technologies are already having an impact on a variety of industries, their general acceptance will be greatest in the transportation sector.
Cyberinfrastructure is becoming increasingly resilient in the era of Industry 4.0 in order to support Internet of Medical Things (IoMT) applications. With 5G, remote monitoring and diagnosis of patients is merely a matter of time. And with their vast bandwidth, these new technologies have the potential to change healthcare. The development of connected medical infrastructure is advancing the globe toward a truly intelligent future.
In Andrew Semple Bradenton’s opinion, virtual and augmented reality headsets are already transforming how individuals use remote patient monitoring. These technologies can assist visually impaired individuals with daily duties; through a 5G headset, they can communicate with a live advisor. The additional bandwidth significantly improves the quality of streaming video and enhances the patient experience. 5G headsets can also give patients and healthcare professionals a high-quality video experience.
Photo
Weapons trained on the Enemy. According to an April 5 report in Bloomberg Government, DHS was searching for a contractor to help it monitor more than 290,000 global news sources in over 100 languages, including Arabic, Chinese and Russian, all of which will be translated to English in real time. These outlets would include newspapers and magazines, television and radio, podcasts and social media. “The DHS request says the selected vendor will set up an online ‘media influence database’ giving users the ability to browse based on location, beat, and type of influence,” Bloomberg’s Cary O’Reilly reveals. The database would include, “[f]or each influencer found, present contact details and any other information that could be relevant, including publications this influencer writes for, and an overview of the previous coverage published by the media influencer.” If the project sounds like a First Amendment violation waiting to happen, that’s because it is. While DHS insists that the database will “protect and enhance the resilience of the nation’s physical and cyberinfrastructure,” perhaps against foreign interference in future elections, the potential for censorship and other abuses of power is virtually limitless. The DHS’ latest venture reveals where disdain for journalists can lead, and the extent to which it can be weaponized.