#ibmquantum
govindhtech · 5 days ago
OrangeQS Secures Record-Breaking €12M Seed Funding
Delft-based Orange Quantum Systems (OrangeQS) has drawn attention after its €12 million seed funding round was oversubscribed, the largest seed round yet raised by a Dutch quantum computing company. OrangeQS aims to overcome a major but often overlooked issue in quantum computing: the laborious, expensive, and talent-intensive testing of quantum devices.
The Ongoing Quantum Chip Testing Challenge
Quantum computers have great potential, but verifying their quantum processors has slowed their development. Unlike semiconductor chips, quantum devices cannot be mass-tested at ambient temperature; they require special environments and methods. This complexity stems from quantum bits, or qubits, which, unlike classical bits, can exist in a superposition of 0 and 1 simultaneously.
Testing quantum chips presents challenges:
Extreme Conditions: Quantum chips must be isolated and shielded, which requires high vacuum, extremely low temperatures, and precise low-power microwave signals.
Resource Drain: Because testing is so difficult, manufacturers dedicate 30–50% of their R&D teams to developing, building, and maintaining in-house test setups. These costly and scarce specialists then spend less time building quantum chips, computer systems, and algorithms.
Slow Turnaround: Quantum chip testing has typically taken weeks, making the quick iterations that drive semiconductor development difficult. OrangeQS CEO Garrelt Alberts says this slow pace hampers iteration and raises testing costs; because many companies build their own test setups, effort is also duplicated across the industry.
This “quantum testing bottleneck” impedes the development and construction of quantum computers by limiting testing capacity and tying up scarce talent.
Product Suite and Innovative Solutions from OrangeQS
OrangeQS, a 2020 spin-off from QuTech and TNO, aims to eliminate this bottleneck and revolutionise quantum computing. Rather than simply adding testing capacity, OrangeQS builds tools that optimise “test-time per qubit”. This strategy cuts testing time from weeks to days, freeing up talent, money, and time for quantum development.
OrangeQS offers a full array of tools to accelerate quantum chip testing across the value chain:
OrangeQS MAX: This flagship machine aims to set the industry standard for high-volume, standardised quantum-chip testing, dramatically reducing per-qubit test time by assessing quantum processors faster. IQM, the top European quantum-computer maker, will deploy an OrangeQS MAX system in Espoo, Finland, to speed up quantum chip development.
OrangeQS Flex: Industrial and academic R&D teams can customise chip testing with this equipment. It is used by quantum research facilities like the University of Napoli and Karlsruhe Institute of Technology.
OrangeQS Juice: This open-source operating system simplifies quantum research apparatus control. This operating system is being tested by QuTech (Netherlands), Chalmers Next Labs (Sweden), and Berkeley Lab's Advanced Quantum Testbed (USA).
According to Garrelt Alberts, OrangeQS MAX aims to cut qubit test time in half every two years, an ongoing upgrade cycle intended to eventually reduce test time by several orders of magnitude. The company says it is currently the only vendor offering a reliable, fast, and affordable turnkey quantum testing solution.
Impact of the €12 Million Seed Funding Round
The heavily oversubscribed €12 million seed round signals strong investor confidence in OrangeQS's technology. Icecat Capital led the round, joined by QBeat Ventures and InnovationQuarter Capital, while pre-seed investors QDNL Participations and Cottonwood Technology Fund continued their support.
New funds will be used for strategic investments:
Accelerate the development of scalable quantum chip testing tools.
Build faster test equipment that analyses quantum chips in days rather than weeks.
Expand Orange Quantum Systems' global presence.
Towards a “Quantum Equivalent of Moore’s Law”. OrangeQS positions itself as vital to manufacturers racing to build the first practical quantum computers. Companies like IBM Quantum, which targets a first fault-tolerant quantum computer by 2029, must iterate and test quickly. OrangeQS wants to accelerate testing enough to enable usable quantum computing and possibly a “quantum equivalent of Moore’s Law”.
Moore's Law observes that a microchip's transistor count doubles roughly every two years, yielding exponential growth in processing power and falling costs. OrangeQS's objective, helping leading chip manufacturers double the number of reliable quantum bits (qubits) every few years, sketches a future of steady, predictable quantum computing progress. That trajectory depends on OrangeQS innovations that simplify and scale up quantum chip testing and validation.
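For intuition, here is a minimal Python sketch of what such a doubling law implies. The starting qubit count and the years are illustrative assumptions, not OrangeQS or IBM figures.

```python
# Hypothetical projection: reliable qubit counts under a fixed
# "double every two years" assumption (illustrative numbers only).
def projected_qubits(start_qubits, start_year, end_year, doubling_period=2.0):
    return {year: round(start_qubits * 2 ** ((year - start_year) / doubling_period))
            for year in range(start_year, end_year + 1)}

for year, n in projected_qubits(100, 2025, 2033).items():
    print(year, n)   # 100 qubits in 2025 -> ~1600 by 2033
```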
OrangeQS also supports the move of quantum chip research “from lab to fab”, from academia to industry. According to Garrelt Alberts, OrangeQS will offer high-throughput test solutions as the chip industry builds out its quantum facilities. By accelerating testing, the company helps quantum computers become practical.
fraoula1 · 4 months ago
Quantum Computing for Everyday Use: How It Will Change Your Life!
Quantum computing is no longer just a futuristic concept—it’s becoming a reality! 🌍🚀 From revolutionizing drug discovery and cryptography to optimizing supply chains and enhancing artificial intelligence, quantum computers are set to transform industries and impact our daily lives. In this video, we explore how companies like IBM and Google are bringing quantum technology closer to mainstream adoption.
🔬 Discover how quantum computing is solving real-world problems
🔑 Learn about the impact on cybersecurity and financial markets
🌎 See how quantum tech is helping in climate modeling and AI
Don’t forget to LIKE 👍, SUBSCRIBE 🔔, and SHARE 📢 to stay updated on the latest tech innovations!
summertryouts · 6 years ago
The future of computers is here! “The IBM quantum computer!”... And I still feel like music software plug-ins will bring this thing to its knees! #summertryouts #ibmquantum #quantum #quantumcomputing #quantumcomputer #electronic #undergroundmusic #dreamwave #housemusic #daw #vaporwaves #techno #aesthetic #cyberpunk #cyberwave #spotify #spotifyplaylist #art #neonlights #neon #electronicmusicartist #tokyo #photooftheday #lofi #aesthetictumblr #soundcloud #chill #chillmusic #chilloutmusic https://www.instagram.com/p/BsYF7jlHdMY/?utm_source=ig_tumblr_share&igshid=m6ryox3jq88d
govindhtech · 6 days ago
VarQEC With ML Improves Quantum Reliability On Noisy Qubits
Variational Quantum Error Correction (VarQEC) is a novel method that uses machine learning (ML) to optimise encoding circuits for a device's noise characteristics, producing resource-efficient codes and better quantum device performance. The approach advances error-handling techniques for near-term quantum computers.
Problems with Quantum Errors and QEC
Fragile quantum systems limit quantum computing, despite its revolutionary computational potential. Qubits, the building blocks of quantum information, are prone to decoherence, quantum noise, and gate errors. Without adequate corrective mechanisms, quantum computations quickly become unreliable.
Errors in quantum systems take several forms (a short numerical sketch follows this list):
Bit-flip errors occur when a qubit's value flips between 0 and 1.
Phase-flip errors occur when the phase of a qubit's quantum state changes unexpectedly.
Gate errors arise when the operations used to manipulate qubits (implemented with lasers, microwave pulses, or magnetic fields) are imperfect.
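As a concrete illustration, here is a minimal NumPy sketch of the first two error types acting on a single-qubit state; it is purely pedagogical and not from the paper.

```python
import numpy as np

X = np.array([[0, 1], [1, 0]])   # bit-flip: swaps the |0> and |1> amplitudes
Z = np.array([[1, 0], [0, -1]])  # phase-flip: negates the |1> amplitude

psi = np.array([3, 4]) / 5.0     # a normalized qubit state 0.6|0> + 0.8|1>

print("bit-flip:  ", X @ psi)    # [0.8, 0.6]: amplitudes swapped
print("phase-flip:", Z @ psi)    # [0.6, -0.8]: relative phase flipped
```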
To address these errors, typical QEC methods like Shor's code and surface codes encode each logical qubit across several physical qubits. These methods carry large resource requirements (surface codes can require thousands of physical qubits for a single logical qubit), complicated decoding techniques, and poor adaptation to real-world quantum noise. This high overhead hinders practical quantum computation.
VarQEC: Machine Learning-Based Approach
Because of these limits, scientists are exploring more flexible and resource-efficient methods. While most attention goes to how quantum computing can enhance AI, the inverse, AI supporting quantum computing, is becoming important for real-world use, and VarQEC is an example. The method was introduced in the paper “Learning Encodings by Maximising State Distinguishability: Variational Quantum Error Correction” by Andreas Maier of Friedrich-Alexander-Universität Erlangen-Nürnberg and Nico Meyer, Christopher Mutschler, and Daniel Scherer of Fraunhofer IIS.
Key VarQEC Features:
Distinguishability Loss: VarQEC trains against a new machine learning objective, the “distinguishability loss function”, which measures the error correction code's ability to differentiate the target quantum state from noise-tainted states. By maximising this distinguishability, VarQEC finds encoding circuits that are more resilient to device-specific errors (a toy sketch follows this feature list).
Encoding Circuit Optimisation: VarQEC optimises encoding circuits for device-specific errors and resource efficiency. Unlike static, pre-defined codes, the error correction can be tailored to each quantum device. This flexibility matters because quantum systems are dynamic: error rates and types vary with hardware and environmental changes.
Practical Application and Performance Gains: The study showed that VarQEC can preserve quantum information on both real and simulated quantum hardware. Experiments learnt error correcting codes adapted to the noise characteristics of IBM Quantum and IQM superconducting qubit systems, yielding persistent performance gains over uncorrected quantum states in specific “patches” of the error landscape. Successful hardware deployment demonstrates the viability of machine-learning-driven error correction strategies.
Hardware-Specific Adaptability: The study stressed the importance of matching error correcting code design to hardware architecture and noise profiles. In connectivity experiments on IQM devices, star and square ansatz topologies performed similarly, suggesting that topology may not always affect efficacy. Still, the discovery of a faulty qubit on an IQM device showed how sensitive codes are to individual qubit performance and how important calibration is.
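To make the idea concrete, here is a toy NumPy sketch of the distinguishability principle, not the paper's actual implementation: a parameterized two-qubit encoding is tuned so that the noisy code states for logical |0> and |1> stay as distinguishable (large trace distance) as possible under an assumed bit-flip channel.

```python
import numpy as np

I2 = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
CNOT = np.array([[1, 0, 0, 0], [0, 1, 0, 0],
                 [0, 0, 0, 1], [0, 0, 1, 0]], dtype=complex)

def ry(theta):
    """Single-qubit Y rotation."""
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]], dtype=complex)

def encode(logical, params):
    """Toy parameterized encoder: local RY rotations, then a CNOT."""
    state = np.kron(logical, np.array([1, 0], dtype=complex))  # append |0> ancilla
    return CNOT @ np.kron(ry(params[0]), ry(params[1])) @ state

def bit_flip_channel(rho, p):
    """Independent bit-flip with probability p on each of the two qubits."""
    kraus = [np.sqrt(1 - p) * I2, np.sqrt(p) * X]
    return sum(np.kron(a, b) @ rho @ np.kron(a, b).conj().T
               for a in kraus for b in kraus)

def distinguishability(params, p=0.1):
    """Trace distance between the noisy code states for logical |0> and |1>."""
    rhos = []
    for logical in ([1, 0], [0, 1]):
        psi = encode(np.array(logical, dtype=complex), params)
        rhos.append(bit_flip_channel(np.outer(psi, psi.conj()), p))
    return 0.5 * np.abs(np.linalg.eigvalsh(rhos[0] - rhos[1])).sum()

# Crude finite-difference gradient ascent on the distinguishability.
params, lr, eps = np.array([0.3, 0.8]), 0.5, 1e-4
for _ in range(200):
    grad = np.array([(distinguishability(params + eps * d)
                      - distinguishability(params - eps * d)) / (2 * eps)
                     for d in np.eye(2)])
    params += lr * grad

print("optimised distinguishability:", distinguishability(params))
```

A real VarQEC run would replace the toy channel with device-calibrated noise and optimise over far richer parameterized circuits.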
The Broader AI for QEC Landscape With VarQEC
VarQEC is one example of a broader trend of applying AI, specifically machine learning, to QEC:
Faster Decoding: To decode lattice-based codes like surface codes, convolutional neural networks (CNNs) can find error patterns faster and with less computing power. Google Quantum AI uses neural networks for surface code decoding to correct errors faster and more accurately.
Enhancing Robustness and Adaptability: Reinforcement learning (RL) approaches can adjust error correction strategies on the fly as error types and rates change. Supervised machine learning models such as recurrent neural networks (RNNs) can handle time-dependent error patterns like non-Markovian noise, and IBM researchers have used ML to find and fix failure patterns.
Error Modelling: Generative models like variational autoencoders (VAEs) and RNNs can capture complex error dynamics, including non-Pauli errors and non-Markovian noise, improving prediction accuracy and enabling proactive maintenance.
QEC must also be distinguished from quantum error mitigation (QEM). QEC encodes information across many qubits and mathematically restores corrupted states to detect and correct faults, whereas QEM reduces errors and their effects statistically, extracting the best result from noisy data or improving hardware stability. As its name implies, VarQEC sits on the correction side: it fixes undesirable results directly.
VarQEC's Future and Challenges
Despite promising results, VarQEC and AI in QEC confront many challenges:
Richer Noise Models: Future VarQEC work should focus on folding increasingly complex, device-specific noise models into the training process, accounting for correlated noise and qubit-specific fluctuations and moving beyond the assumption of uniform noise levels.
Scalability: Testing VarQEC on larger qubit systems and more complex quantum circuits is the next step in assessing its suitability for harder algorithms. This matches the broader challenge of scaling ML models to handle more qubits without a blow-up in processing load.
Alternative Designs: Exploring other ansatz designs and optimisation methodologies may further improve VarQEC's performance.
AI in QEC more broadly faces challenges such as data scarcity (quantum error datasets for ML training are rare, forcing reliance on data augmentation) and integration: physicists, computer scientists, and engineers must pursue hardware-software co-design and interdisciplinary collaboration to fold AI-driven QEC smoothly into quantum computing platforms.
Conclusion
VarQEC is a promising machine-learning-based answer to quantum computing's error problem. Customising error correction codes to the noise of specific quantum hardware brings fault-tolerant, useful quantum systems a step closer.
govindhtech · 2 months ago
Quantum Reservoir Computing (QRC) for Soft Robot Control
Keio University and Mitsubishi Chemical are exploring the frontiers of quantum machine learning, using quantum reservoir computing (QRC) to study and forecast the behaviour of flexible soft robots.
IBM Quantum Innovation Centres
In 2017, Keio University became one of the first IBM Quantum Hubs, now called Quantum Innovation Centres (QICs). There are about 40 QICs globally. QICs draw on IBM Quantum's expertise to advance quantum computing; these global hubs promote a worldwide quantum ecosystem, creative quantum research, and quantum learning communities by drawing participants into joint research initiatives.
As a QIC, Keio University works with leading Japanese companies to develop quantum applications and algorithms. Its partnership with Mitsubishi Chemical, a global leader in materials science research and development, is one example. In 2023, researchers from the two organisations, the University of Tokyo, the University of Arizona, and the University of New South Wales conducted a utility-scale experiment on an IBM Quantum device to execute a proposed quantum reservoir computing technique, establishing a thriving line of research.
Reservoir computing with utility-scale quantum computation
Reservoir computing (RC) reduces the training overhead of approaches like neural networks and generative adversarial networks. A reservoir is a fixed computational resource that applies mathematical transformations to incoming data, allowing large datasets to be manipulated while preserving the relationships between data points.
In a reservoir computing experiment, researchers feed input data into the reservoir, then post-process the reservoir's transformed output to extract answers. This post-processing often uses linear regression, a simple ML model of relationships between variables. After training a linear regression model on the reservoir's output data, researchers can construct a time series that predicts the input system's behaviour, as sketched below.
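Here is a minimal classical version of that workflow in Python, with a random recurrent network standing in for the (quantum) reservoir; the toy signal and all parameters are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy input series (in the experiment: soft-robot movement data).
t = np.linspace(0, 60, 3000)
u = np.sin(t) + 0.5 * np.sin(2.3 * t)

# Fixed random reservoir; only the linear readout is ever trained.
n_res = 200
W_in = rng.uniform(-0.5, 0.5, n_res)
W = rng.normal(0, 1, (n_res, n_res))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))   # keep spectral radius < 1

states = np.zeros((len(u), n_res))
x = np.zeros(n_res)
for i, ui in enumerate(u):
    x = np.tanh(W @ x + W_in * ui)   # reservoir transforms the input history
    states[i] = x

# Linear-regression readout: predict u[i+1] from the reservoir state at step i.
w_out, *_ = np.linalg.lstsq(states[200:2000], u[201:2001], rcond=None)

pred = states[2000:-1] @ w_out
print("test MSE:", np.mean((pred - u[2001:]) ** 2))
```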
Quantum reservoir computing (QRC) uses a quantum computer as the reservoir. Quantum computers, which can exceed classical systems in computational capacity, are well suited to processing such high-dimensional data.
Mitsubishi Chemical, Keio University, and others are studying how quantum reservoir computing might help comprehend complicated natural systems. Their 2023 experiment aimed to create a quantum reservoir computing model that could predict the noisy, non-linear motions of a "soft robot," a malleable device controlled by air pressure.
Developing Quantum Reservoir Computing Techniques
To begin the experiment, the researchers converted robot-movement data into quantum input states that the IBM quantum reservoir could read. The reservoir applies random gates to these input states and produces transformed signals, which the researchers then post-process with linear regression. The result is a time series predicting the robot's movement, which they evaluate against real data to determine its accuracy.
Most quantum reservoir computing systems measure only at the end of a quantum circuit, so the system must be set up and run anew for every qubit at every time step. This lengthens experiments and reduces the accuracy of the resulting time series. The Keio University and Mitsubishi Chemical researchers sought to overcome these limitations with “repeated measurements”.
Instead of setting up and executing the system at each time step, they add qubits and measure them repeatedly during the computation. This lets researchers collect the whole time series at once, yielding a more accurate series in less circuit time.
The researchers demonstrated their quantum reservoir computing system on IBM Quantum processors with up to 120 qubits, finding that repeated measurements yielded higher accuracy and faster execution than standard QRC methods. Their initial studies suggest the approach might accelerate computation.
Additional research is needed before RC and quantum reservoir computing can be applied broadly. Still, the researchers say their utility-scale investigations may outperform standard modelling methods, and they plan to study quantum reservoir computing for nonlinear problems like financial risk modelling.
How Quantum Innovation Centres Help Enterprise Research Organisations
Keio University and Mitsubishi Chemical's relationship shows how businesses can benefit from IBM Quantum Innovation Centre partnerships. Through these relationships, professors and students with strong quantum computing expertise, practised at guiding other researchers through difficult topics, can help enterprise researchers build advanced quantum skills.
Mitsubishi Chemical is not the only global firm benefiting. Keio University is also collaborating with corporate R&D teams from leading companies across many industries and quantum use cases to investigate promising quantum applications and algorithm development. These collaborations show how industry research trials with universities can lead to valuable real-world applications, and how QICs can help corporations explore compelling quantum use cases.
govindhtech · 11 months ago
IBM Quantum Information Science and Fermilab’s SQMS Centre
Fermilab SQMS
IBM Plans to Advance Critical Quantum Information Science Initiatives in Collaboration with Fermilab’s SQMS Centre.
In order to extend quantum workforce development programmes and significantly advance crucial technologies and applications of superconducting quantum systems, IBM intends to join Fermilab’s SQMS Centre.
SQMS Meaning
The U.S. Department of Energy Office of Science has authorized IBM's entry as a new partner in the Superconducting Quantum Materials and Systems (SQMS) Centre, a DOE National Quantum Information Science Research Centre hosted by Fermilab. As a prominent national and international research centre, the SQMS Centre is committed to developing key quantum technologies, with a focus on superconducting quantum systems.
IBM is pioneering superconducting quantum computing. These two organizations hope to tackle major challenges in quantum computing, communication, and superconducting quantum platform use by working together.
Large-scale cryogenics
IBM and SQMS plan to collaborate on the development of technologies that are essential for the large-scale data centre scaling up of quantum computing. At Fermilab, SQMS is already putting forth innovative ideas for larger-scale, more efficient milli-Kelvin cryogenics. The largest dilution refrigerator in the world, dubbed “Colossus,” will house 3D superconducting radiofrequency (SRF)-based quantum computing and sensor devices.
IBM will supply data and requirements to extend the reach of Colossus. One aspect is the creation of a large-scale cooling system based on LHe/N2 plants, suitable for IBM's upcoming large-scale commercial quantum computing systems.
High-density, high-quality quantum interconnects
High-density, high-quality quantum interconnects are being designed and prototyped by SQMS using 3D SRF platforms for quantum computing systems under development at Fermilab. These advancements can also be applied to the expansion of modular systems based on chips. With an emphasis on premium microwave cables, Fermilab and IBM want to investigate the viability and use of quantum links as a component of a commercial quantum system.
Noise reduction in processors and qubits
IBM and SQMS partners plan to collaborate as part of the SQMS Centre to advance scientific knowledge of the factors restricting the performance of superconducting qubits and to create workable solutions for “1/f flux noise” reduction.
Creation of quantum computing systems for scientific purposes
IBM and SQMS partners will also advance research into physics applications of quantum computing systems. In condensed matter physics, for instance, scientists want to investigate how IBM's utility-scale processors can support simulations of quantum many-body dynamics at a complexity approaching the quantum advantage regime. Partners will also investigate lattice quantum field theory simulations for high-energy physics.
Programmes for quantum workforce development
To recruit and train the next generation of a diverse quantum workforce, the SQMS Centre has launched numerous successful workforce development programmes, such as the U.S. Quantum Information Science School, shared with the other four DOE-sponsored National Quantum Information Science Research Centres (NQISRCs). IBM's extensive quantum education programme, meanwhile, has brought quantum education to millions of students worldwide.
By giving Fortune 500 firms, colleges, labs, and startups within the IBM Quantum Network the resources they need to develop their quantum workforce, the programme has also contributed to the provision of industry and domain expertise. IBM and SQMS intend to collaborate in order to enhance national quantum workforce development initiatives.
“IBM needs to solve and scale complex challenges, like efficient, large-scale refrigeration and high-density and low-loss quantum interconnects, and advance its understanding of noise sources and how to reduce them,” stated Jay Gambetta, IBM Fellow and Vice President, IBM Quantum, adding that the company is accelerating towards building a large-scale, fault-tolerant quantum computer.
The intended involvement in the SQMS Centre's research is a cornerstone of IBM's strategy for large-scale quantum computing. Beyond pushing the boundaries of quantum hardware, IBM and Fermilab hope to advance scientific uses of quantum computing and develop a workforce prepared for its future.
A legal agreement between IBM and Fermi Research Alliance, LLC must be approved before the collaboration can commence.
The Superconducting Quantum Materials and Systems (SQMS) Centre is one of the five National Quantum Information Science Research Centres of the U.S. Department of Energy. Fermi National Accelerator Laboratory leads SQMS, a coalition of over thirty national labs, universities, and corporations that aims to revolutionize quantum information science.
Utilising cutting-edge qubits and superconducting technology, the centre designs multiqubit quantum processor platforms by drawing on Fermilab’s experience in creating intricate particle accelerators. SQMS Centre will construct new quantum sensors and a quantum computer at Fermilab in close collaboration with embedded industry partners, creating previously unheard-of computing possibilities.
SQMS Centre
A National Quantum Information Science Research Centre of the Department of Energy
Towards the quantum future
As part of a nationwide effort to create and deploy the most powerful quantum computers and sensors on the planet, the U.S. Department of Energy funds five research centres, including the Superconducting Quantum Materials and Systems (SQMS) Centre led by Fermi National Accelerator Laboratory.
In a mission-driven, multidisciplinary collaboration, SQMS brings together over 500 experts from 34 partner institutions, national laboratories, academia, and industry. This collaboration integrates deep expertise in a variety of fields, including quantum information science, material science, computational science, particle and condensed matter physics, cryogenics, microwave devices and controls engineering, industry applications, and more.
Ultra-high-Q SRF cavities
With its world-record quality-factor superconducting radio-frequency (SRF) cavities, the SQMS Centre is investigating the use of these structures as building blocks for quantum computing platforms, which have the potential to scale and increase performance tenfold over the most advanced commercial platforms now available. The centre is also investigating SRF-based quantum transducers and memories.
Superconducting processors and qubits
SQMS's goal is to significantly increase these devices' performance. Collaborating closely with prominent figures in the quantum sector, SQMS has established a nationwide task force focused on nanofabrication, utilising multiple research and production foundries. Its newly designed fabrication procedures have yielded consistent performance enhancements.
Understanding quantum decoherence
SQMS researchers examine dissected cavities and qubits with different performance levels using a wide range of specialized material characterization techniques. By using these methods, researchers can improve the functionality of quantum devices by learning more about the atomic and nanoscale processes that restrict quantum coherence.
Benchmarking, simulations, and algorithms
Scholars are customizing algorithms to effectively handle data from the SQMS SRF QPUs, investigating the application of commercial quantum platforms, and comparing the computational capacities of various hardware. Applications include finance and MRI, as well as simulations of fundamental physics for high-energy and condensed-matter physics.
Fundamental physics through quantum sensing
The SQMS Centre's high-coherence devices, with their extreme sensitivity, open up new platforms that stretch into uncharted regimes. The main areas of research are tests of quantum mechanics, dark matter candidates, gravitational waves, particles beyond the Standard Model, measurements of fundamental properties at the precision frontier, and assessments of the benefits of quantum sensing systems in these experiments.
The quantum ecosystem
The next generation of researchers will be trained and educated by SQMS, which is building a community and a space to further the area of quantum information science. This is achieved by creating possibilities and utilising alliances across the diverse SQMS network in the Chicago region, Illinois as a whole, and beyond. The SQMS Centre wants everyone to have access to quantum information science.
Scaling up milli-kelvin cryogenics
SQMS Centre is utilising Fermilab’s unique resources and cryogenics expertise to construct the largest and highest cooling power dilution refrigerator in the world. Additionally, engineers are creating vital technologies that will enable the scalability of future massive quantum computing data centres.
Read more on govindhtech.com
govindhtech · 1 year ago
IBM & Pasqal: Quantum Centric Supercomputing Breakthrough
Quantum centric supercomputing
IBM and Pasqal, leading innovators in superconducting circuit technology and neutral atom-based quantum computing respectively, today announced their intention to collaborate on a shared strategy for quantum-centric supercomputing and to advance application research in materials science and chemistry. To lay the groundwork for quantum-centric supercomputing, the fusion of quantum and sophisticated classical computing to build the next generation of supercomputers, IBM and Pasqal will collaborate with top high-performance computing institutes.
Together, they hope to establish the software integration architecture for a quantum-centric supercomputer that coordinates computational processes between several quantum computing modalities and sophisticated classical compute clusters. The two businesses share the goal of driving their integration strategy through open-source software and community engagement. They are set to co-sponsor a regional HPC technical forum in Germany, with plans to expand this initiative into other regions.
A crucial component of this partnership is the joint goal of promoting utility-scale industry adoption in materials research and chemistry, a field where quantum-centric supercomputing shows immediate promise. Leveraging their respective full-stack quantum computing leadership and collaborating with IBM's Materials working group, founded last year, the two companies want to significantly improve the use of quantum computing for chemistry and materials science applications. The team will keep investigating the most effective ways to develop workflows that combine quantum and classical computing to enable utility-scale chemistry computation.
High-performance computing is heading towards quantum-centric supercomputing, which can be used to achieve near-term quantum advantage in chemistry, materials science, and other scientific applications. IBM's relationship with Pasqal helps ensure an open, hardware-agnostic future that better benefits IBM's clients and consumers. “I am excited that Pasqal will be working with us to introduce quantum-centric supercomputing to the global community,” stated Jay Gambetta, Vice President of IBM Quantum and IBM Fellow.
For Pasqal, the collaboration with IBM marks a significant turning point for the quantum computing industry. Pasqal is excited to pool resources with IBM in pursuit of an ambitious objective: establishing commercial best practices for quantum-centric supercomputing. By drawing on the strengths of both technologies, Pasqal is prepared to match the accelerating pace and growing demands of its customers.
About IBM
Globally, IBM is a leading provider of hybrid cloud technologies, AI, and consulting services. IBM supports customers in over 175 countries in taking advantage of data insights, optimising business operations, cutting expenses, and gaining a competitive advantage in their sectors. Red Hat OpenShift and IBM's hybrid cloud platform are used by over 4,000 government and corporate entities in key infrastructure domains, including financial services, telecommunications, and healthcare, to facilitate digital transformations that are swift, secure, and efficient. IBM's ground-breaking advances in AI, quantum computing, industry-specific cloud solutions, and consultancy provide open and flexible alternatives to its clients. All of this is supported by IBM's longstanding dedication to transparency, accountability, inclusion, trust, and service.
Pasqal
A leading provider of quantum computing, Pasqal constructs quantum processors from ordered neutral atoms in 2D and 3D arrays to give its clients a useful quantum edge and solve real-world problems. Pasqal was established in 2019 out of the Institut d'Optique by Georges-Olivier Reymond, Christophe Jurczak, Professor Dr. Alain Aspect (awarded the Nobel Prize in Physics in 2022), Dr. Antoine Browaeys, and Dr. Thierry Lahaye. To date, it has raised more than €140 million in funding.
Overview of IBM and Pasqal’s Collaboration:
Goal
The goal of IBM and Pasqal’s partnership is to investigate and specify the integration of classical and quantum computing in quantum-centric supercomputers. The advancement of quantum computing technologies and their increased applicability for a wide range of uses depend on this integration.
Classical-Quantum Integration
Classical computing still handles traditional data processing, while quantum computing is more effective at certain complex problems. Integration means creating hybrid systems that exploit the strengths of both, as the schematic example below illustrates.
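A minimal sketch of such a hybrid loop, with the quantum step mocked by a one-qubit classical simulation; the cost function and optimizer choice are illustrative assumptions, not part of the IBM-Pasqal announcement.

```python
import numpy as np
from scipy.optimize import minimize

def quantum_cost(params):
    """Stand-in for a quantum evaluation (e.g. an energy estimate from a QPU):
    here the classically simulated expectation <Z> of RY(theta)|0>, i.e. cos(theta)."""
    return float(np.cos(params[0]))

# Classical optimizer proposes parameters; the "quantum" routine scores them; repeat.
result = minimize(quantum_cost, x0=[0.3], method="COBYLA")
print("optimal angle:", result.x[0], "minimum cost:", result.fun)  # angle -> pi, cost -> -1
```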
Quantum-Centric Supercomputers:
Quantum-centric supercomputers put quantum computation at the centre while using classical processing to optimise and manage quantum operations. The objective is to harness the concepts of quantum mechanics within supercomputers to increase their performance and capabilities.
Possible Advantages
Innovations in fields like materials science, complex system simulations, cryptography, and medicine may result from this integration. These supercomputers can solve problems that are now unsolvable for classical systems alone by merging classical and quantum resources.
Research & Development
IBM and Pasqal will work together to develop technologies, exchange knowledge, and undertake research initiatives that will enable the smooth integration of classical and quantum computing. To support hybrid computing models, hardware, software, and algorithms must be developed.
Long-Term Vision
This collaboration’s long-term goal is to open the door for a new generation of supercomputers that can meet the ever-increasing computational demands of diverse industrial and research domains.
Read more on Govindhtech.com
govindhtech · 1 year ago
Quantum HPC Ushers in a New Era of Quantum Computing
Quantum HPC is a new discipline that combines quantum computing with high-performance computing (HPC).
Breakdown of the idea:
High-Performance Computing (HPC): HPC systems use parallel processing to solve complex problems that single computers cannot. They are used extensively in scientific modelling, medical research, and weather forecasting.
Quantum Computing: Quantum computers compute using quantum mechanics. Traditional computers use bits (0 or 1), but quantum computers employ qubits, which can exist in superposition, as sketched below. This makes them faster than classical computers at specific tasks.
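A quick numerical illustration of the bit-versus-qubit point (plain NumPy, purely pedagogical): a qubit state is a normalized vector of two amplitudes, and measuring it yields 0 or 1 with the squared-magnitude probabilities.

```python
import numpy as np

ket0, ket1 = np.array([1.0, 0.0]), np.array([0.0, 1.0])
psi = (ket0 + ket1) / np.sqrt(2)   # equal superposition of 0 and 1

probs = np.abs(psi) ** 2
print("P(0) =", probs[0], " P(1) =", probs[1])   # 0.5 each
```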
HPC Quantum Computing
Classical HPC systems, while powerful, are constrained on certain tasks, whereas quantum computers excel at problems that classical computers find exponentially hard. Quantum HPC addresses this gap:
Quantum Circuit Optimisation: HPC systems can build and optimise quantum circuits for quantum computers, enhancing efficiency and accuracy (see the sketch after this list).
Verifying Outcomes: HPC can verify the outcomes of small-scale quantum computations, and researchers are developing hybrid algorithms that split complex problems between classical and quantum computers.
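As a small example of the circuit-optimisation step, here is a sketch using the open-source Qiskit library (IBM's quantum SDK, not named in this post): the transpiler, running on classical hardware, removes redundant gates before execution.

```python
from qiskit import QuantumCircuit, transpile

qc = QuantumCircuit(2)
qc.h(0)
qc.cx(0, 1)
qc.cx(0, 1)   # two back-to-back CNOTs cancel
qc.h(0)       # ...and then the two H gates cancel too

optimized = transpile(qc, optimization_level=3)
print("original ops: ", qc.count_ops())
print("optimized ops:", optimized.count_ops())  # should be (near-)empty
```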
Quantum HPC benefits:
This connection could revolutionise fields like:
Materials Science: Simulating complicated materials for medicine, battery, and catalyst development.
Financial modelling: Portfolio optimisation and risk management.
Cryptography: Breaking encryption and creating post-quantum cryptography.
Current Status:
Quantum HPC is young: maintaining quantum coherence and scaling quantum computers remain open scientific problems. Japanese organisations like the RIKEN Centre for Computational Science are creating quantum-HPC hybrid platforms.
Future Outlook:
Quantum HPC could boost scientific discoveries and technological innovation. We may expect to see more progress in merging classical and quantum computing to solve the world’s hardest issues.
RIKEN has chosen an IBM Quantum system to be integrated with the supercomputer Fugaku.
RIKEN, a Japanese national research centre, has agreed to deploy IBM's best-performing quantum processor and next-generation quantum computer architecture at the RIKEN Centre for Computational Science in Kobe, Japan. This will be the only site where the supercomputer Fugaku and a quantum computer are located together.
NEDO, an agency under the Japanese Ministry of Economy, Trade, and Industry (METI), backs the “Development of Integrated Utilisation Technology for Quantum and Supercomputers” as a component of the “Project for Research and Development of Enhanced Infrastructures for Post-5G Information and Communications Systems”. This agreement was signed as part of RIKEN's ongoing project.
IBM Quantum System Two
RIKEN will use IBM's Quantum System Two architecture to put its project into action. RIKEN, SoftBank Corp., the University of Tokyo, and Osaka University are working together to show how useful such hybrid computing platforms will be for providing services in the post-5G era, with the goal of improving business and science in Japan.
As part of the project, IBM will also build the software stack needed to create and run integrated quantum-classical workflows in a heterogeneous quantum-HPC hybrid system. These new capabilities should improve both algorithm quality and processing times.
IBM plans to introduce its next-generation quantum computing architecture with IBM Quantum System Two, which will be installed at RIKEN and connected to Fugaku. This system will include expandable cryogenic infrastructure, modular quantum control electronics, and advanced system software to provide quantum computing services that work with traditional HPC services. These are the core parts of IBM’s vision for quantum-centric supercomputing.
Quantum Centric Supercomputing
Quantum-centric supercomputing combines quantum and classical computing resources to run computations faster than ever before. IBM sees quantum-centric supercomputing as the future of traditional HPC, and IBM Quantum System Two is one of the most important components of this design.
The system will run a 133-qubit “IBM Quantum Heron” processor, the first in a new line of quantum processors whose design delivers the best performance of any IBM quantum processor released so far. Experiments on IBM Heron showed the lowest error rates of any IBM Quantum processor, five times better than the previous best records set by IBM Eagle, which is now accessible through the cloud.
As qubit counts grow and accuracy improves, advanced NISQ-era quantum computers are becoming ready for real-world use. From the HPC community's perspective, quantum computers are machines that accelerate scientific programmes usually run on supercomputers and make possible calculations that supercomputers cannot yet perform.
The head of the Quantum HPC Collaborative Platform Division at the RIKEN Centre for Computational Science, Dr. Mitsuhisa Sato, said, “RIKEN is committed to developing system software for quantum-HPC hybrid computing by leveraging its extensive scientific research capabilities and experience in the development and operation of cutting-edge supercomputers such as Fugaku.”
“IBM's agreement with RIKEN is a huge step towards a future dominated by quantum-centric supercomputing. It is the first quantum system that will directly connect with the Fugaku classical supercomputer,” said Jay Gambetta, IBM Fellow and Vice President, IBM Quantum. “This work moves the field closer to a modular and adaptable structure that combines quantum computing and communication with traditional computing resources, so that both can be used together to solve problems that are otherwise increasingly hard to solve.”
IBM has advanced significantly in artificial intelligence, quantum computing, industry-specific cloud solutions, and consulting. These innovations give its customers many open and flexible choices, backed by IBM's long history of commitment to trust, openness, responsibility, inclusion, and service.
About RIKEN
RIKEN is Japan's largest research institute for both basic and applied research. RIKEN researchers publish around 2,500 articles annually in top scientific and technology journals, spanning biology, chemistry, physics, engineering, and medical research. With its focus on globalisation and multidisciplinary cooperation, RIKEN is renowned worldwide for outstanding research.
Read more on govindhtech.com
govindhtech · 2 years ago
Tokyo’s Tech Triumph: 127-Qubit IBM Quantum Eagle Processor
IBM Quantum Eagle Processor, 127-Qubit
UTokyo and IBM announced the deployment of a 127-qubit IBM Quantum Eagle processor in Japan’s first IBM Quantum System One today. Researchers from Quantum Innovation Initiative (QII) Consortium institutions want to use the system’s new processor for quantum research in bioinformatics, high-energy physics, materials science, and finance.
The 127-qubit IBM Quantum Eagle is the region's first utility-scale processor. IBM describes “utility-scale” as the point where quantum computers can be used to study new classes of problems. In June, IBM and UC Berkeley scientists published findings in Nature showing that quantum computers can yield useful results at a scale of more than 100 qubits, surpassing classical techniques.
“For the first time outside North America, a quantum computer with a 127-qubit processor is available for exclusive use by QII members,” said UTokyo Executive Vice President Hiroaki Aihara. “A supercomputer can simulate only around 50 qubits, so this system allows large-scale and complex calculations that would be impossible without a quantum computer.” The university aspires to contribute to a diverse, hopeful future society by fostering research in many sectors and implementing quantum-related technology in society.
Leading utility-scale research in Japan
Since joining the IBM Quantum Network in 2019, UTokyo has expanded Japanese quantum computing access. The 2020 Japan-IBM Quantum Partnership effort, which included the QII Consortium, aims to increase industry-academia-government collaboration to advance Japan’s quantum scientific, business, and education leadership.
UTokyo's utility-scale IBM Quantum System One brings more powerful quantum technology, including advanced hardware and tools for exploring how error mitigation can improve accuracy. With it, UTokyo joins other pioneering organizations and universities in IBM's recently established working groups to advance quantum computing, including Healthcare and Life Sciences, where UTokyo and QII scientists will conduct exploratory bioinformatics research.
“By equipping UTokyo with a utility-scale IBM Quantum System One, we are excited to collaborate with QII Consortium organizations on the problems we anticipate will push the limits of today's quantum systems and begin to extract scientific and business value,” said IBM Fellow and Vice President, IBM Quantum, Jay Gambetta.
About Tokyo University
The University of Tokyo is Japan's premier university and a prominent research university. About 6,000 scholars publish their work in the world's leading arts and sciences journals. Its vibrant community of 15,000 undergraduate and 15,000 graduate students includes almost 4,000 international students.
About IBM
IBM is a global leader in hybrid cloud, AI, and consulting. IBM enables clients in over 175 countries to maximise data insights, improve business processes, cut expenses, and compete in their industries. IBM's hybrid cloud platform and Red Hat OpenShift help over 4,000 government and corporate entities in critical infrastructure sectors, including financial services, telecommunications, and healthcare, modernize rapidly, securely, and efficiently. IBM's breakthrough AI, quantum computing, industry-specific cloud solutions, and consultancy give clients flexible options, supported by IBM's longstanding commitment to trust, openness, accountability, inclusion, and service.
Read more on Govindhtech.com