# Quantum Approximate Optimisation Algorithm
QAOA For Traffic Jams: A Hybrid Quantum Algorithm Approach

Recent research explores hybrid quantum algorithms, notably QAOA, as a way to ease traffic congestion through optimised route planning.
In an arXiv preprint, researchers from Ford Motor Company and the University of Melbourne showed that a hybrid quantum algorithm can help reduce traffic bottlenecks, and that the approach remains workable on today's noisy, depth-limited quantum hardware.
The paper examines the Quantum Approximate Optimisation Algorithm (QAOA), a promising quantum tool. QAOA is designed to find good solutions among a vast number of possibilities in optimisation problems, which makes it a natural fit for city traffic management aimed at reducing congestion.
The researchers formulated the traffic problem as a Quadratic Unconstrained Binary Optimisation (QUBO) model. They defined binary variables representing each vehicle's candidate routes and assigned a cost that increases as more vehicles use the same road segment. The model encodes real-world constraints, including the requirement that each car take exactly one route, and penalises congested segments to discourage overlap. The result is a cost function that can be mapped onto a quantum system and minimised with QAOA.
The team writes: “By defining decision variables that correspond to each car's route, the problem of reducing road congestion can be modelled as a binary combinatorial optimisation problem.” Each car is given a list of candidate routes from its starting point to its destination. A route is a path through the road network whose edges are road segments, and every route begins and ends at the origin and destination nodes, which for simplicity were always intersections.
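To make the encoding concrete, the sketch below assembles a toy QUBO of this kind with plain NumPy. The cars, routes, road segments, and penalty weight are invented for illustration and are not the paper's data: a binary variable marks "car i takes route r", shared road segments contribute a congestion cost, and a one-hot penalty enforces exactly one route per car.

```python
import numpy as np

# Illustrative data: each car has candidate routes; each route is a set of road segments.
routes = {
    "car0": [{"s1", "s2"}, {"s3", "s4"}],
    "car1": [{"s2", "s5"}, {"s3", "s6"}],
}
LAMBDA = 4.0  # penalty weight for the "exactly one route per car" constraint (assumed value)

# Flatten (car, route) pairs into QUBO variable indices.
variables = [(car, r) for car, rts in routes.items() for r in range(len(rts))]
index = {v: i for i, v in enumerate(variables)}
n = len(variables)
Q = np.zeros((n, n))

# Congestion cost: routes of different cars that share a road segment add a quadratic term.
for a, (car_a, ra) in enumerate(variables):
    for b, (car_b, rb) in enumerate(variables):
        if a < b and car_a != car_b:
            Q[a, b] += len(routes[car_a][ra] & routes[car_b][rb])

# Constraint: each car takes exactly one route, encoded as LAMBDA * (sum_r x_r - 1)^2.
for car, rts in routes.items():
    idx = [index[(car, r)] for r in range(len(rts))]
    for i in idx:
        Q[i, i] += -LAMBDA            # linear part: LAMBDA*x^2 - 2*LAMBDA*x = -LAMBDA*x for binary x
        for j in idx:
            if i < j:
                Q[i, j] += 2 * LAMBDA  # cross terms of the squared constraint

# Evaluate one candidate assignment: car0 takes route 0, car1 takes route 1.
x = np.zeros(n)
x[index[("car0", 0)]] = 1
x[index[("car1", 1)]] = 1
print(x @ Q @ x)  # -8.0; adding back the dropped constant (LAMBDA per car) gives 0: no congestion, constraints met
```

A real instance would simply have many more cars, routes, and segments; the resulting matrix is what gets translated into the Ising cost Hamiltonian that QAOA minimises.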
Using QAOA to Find Low-Cost Solutions
After encoding the problem, the group used QAOA to search for low-cost solutions. QAOA circuits consist of alternating layers of quantum gates and require careful tuning of their parameters. Optimising these parameters is difficult, especially in the presence of noise. A major contribution of the work was comparing strategies for initialising them: precomputed values, random guesses, and quantum-annealing-inspired schedules.
Trotterized Quantum Annealing (TQA) initialisation outperformed conventional approaches. Because it mimics the system's gradual evolution from a simple to a complex Hamiltonian, TQA often yields better results than random starting points. The researchers also found that precomputed parameters, taken from simulations of similar traffic instances, often gave results nearly as good as fully optimised runs at a far lower computational cost.
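For reference, a minimal sketch of the TQA-style (linear-ramp) starting point commonly used for depth-p QAOA is shown below; the time step `dt` and the depth are illustrative values, not those from the paper.

```python
import numpy as np

def tqa_initial_parameters(p: int, dt: float = 0.75):
    """Linear-ramp (Trotterized-annealing) initial angles for depth-p QAOA.

    gamma ramps up towards dt (cost layer) while beta ramps down towards 0
    (mixer layer), mimicking a slow anneal from the mixer to the cost Hamiltonian.
    """
    steps = np.arange(1, p + 1) / p
    gammas = steps * dt
    betas = (1.0 - steps) * dt
    return gammas, betas

gammas, betas = tqa_initial_parameters(p=5)
print(np.round(gammas, 3))  # [0.15 0.3  0.45 0.6  0.75]
print(np.round(betas, 3))   # [0.6  0.45 0.3  0.15 0.  ]
```

These angles then seed the classical optimiser instead of a random guess, which is where the reported savings come from.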
Dealing with Noise
After verifying their technique in simulations, the researchers ran their QAOA circuits on IBM quantum hardware. As expected, noise and hardware constraints limited performance. Standard QAOA circuits require two-qubit operations between qubits that may not be physically linked on the chip. To work around this, compilers insert “SWAP” gates to shuffle data among qubits, which adds overhead and error.
To avoid the SWAP operations those two-qubit gates would require, the researchers created Connectivity-Forced QAOA (CF-QAOA), which drops the gates that do not fit the hardware's connectivity. Despite changing the quantum circuit and presumably reducing its accuracy, eliminating these gates improved performance on noisy devices.
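The core idea can be sketched as a filter over the cost Hamiltonian, under the assumption that CF-QAOA simply omits two-qubit terms whose qubit pairs are not directly coupled on the device; the coupling map and ZZ terms below are made up for illustration.

```python
# Cost Hamiltonian terms as (qubit_i, qubit_j, weight) ZZ interactions (illustrative).
cost_terms = [(0, 1, 1.0), (1, 2, 0.5), (0, 3, 2.0), (2, 3, 1.5)]

# Hardware coupling map: pairs of qubits that share a physical connection (illustrative).
coupling_map = {(0, 1), (1, 2), (2, 3)}

def connectivity_forced_terms(terms, couplings):
    """Keep only ZZ terms that map onto physical couplings, so no SWAPs are needed."""
    ok = {tuple(sorted(pair)) for pair in couplings}
    return [(i, j, w) for (i, j, w) in terms if tuple(sorted((i, j))) in ok]

print(connectivity_forced_terms(cost_terms, coupling_map))
# -> [(0, 1, 1.0), (1, 2, 0.5), (2, 3, 1.5)]  (the (0, 3) term is dropped)
```

The price is that the circuit no longer implements the full cost function exactly, which is the accuracy trade-off the paragraph above describes.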
A second variant, CF-maQAOA, adds extra tunable parameters to compensate for the missing gates. Although more sophisticated, it outperformed the standard optimisation in practice.
The researchers also examined how their approach scales with problem size, comparing QAOA against Gurobi, a commercial classical solver known for its optimisation performance. As the number of cars and variables grew, standard QAOA scaled worse than Gurobi, while the CF-QAOA technique showed comparable scaling patterns once noise was accounted for.
Work to Come
Using quantum computers to minimise traffic congestion may lower the environmental impact of idle cars and help always-late people make their appointments.
The research acknowledges that, due to current hardware constraints, these developments will not improve real traffic anytime soon. Deep circuits generate too much noise, and even small changes in qubit connectivity can affect results. Further study is needed to establish whether better circuit-compression approaches can retain performance while reducing complexity, and how circuit simplifications influence quantum properties like entanglement.
The study concludes that hybrid quantum optimisation is a viable approach to traffic control. By adapting algorithms to hardware restrictions and accepting approximate answers, the researchers demonstrate that quantum computing can make progress even in the noisy, pre-error-corrected era.
The University of Melbourne established an IBM Quantum Network Hub to aid this effort.
History of SEO
SEO, as we all know, is a vast, constantly-evolving field of digital marketing that strengthens brand voice, and boosts website traffic and ROI (return on investment). Its inexpensive yet highly effectual nature has led to its increased use by search marketing professionals who are up to speed with the latest happenings in the field. However, seldom does one wonder about how it all began. Although there is no concrete literature pointing to a specific time when SEO first came into existence, most industry experts believe that it all started in 1991. We have come a long way since, with huge, swanky websites crowding the internet and clamouring for attention from users. Add to this the ever-changing ways of search engines and the result is a quantum shift in the way online search works. This in turn has led to the evolution of search engine optimisation (SEO) techniques over the years. If you have forever been curious about what search marketing experts did differently back in the day, read on for a quick trip down the SEO memory lane.
The Birth of the Internet
The number of internet users has skyrocketed in the past few years and is increasing every single day. Users are constantly searching for all the information they need through various devices, and search engines are pulling out all the stops to provide them with the most relevant, contextual results. But have you ever wondered when and how it all began? Remote computers could connect over basic telephone lines with the AT&T commercial modem that was launched in 1958. This was the most nascent form of the internet, with the term ‘internet’ being coined much later in December 1974.
The Very First Search Engines
With time came a barrage of online information that needed to be organised and indexed in an efficient way so that its retrieval was uncomplicated and useful. Although Dr. Vannevar Bush, the Office of Scientific Research and Development’s director in the USA, had conceptualised a directory or database for the world’s data in 1945, it was not until 1990 that the first search engine, Archie, was invented by Alan Emtage. It was born out of a school project and indexed FTP (File Transfer Protocol) files based on text.
A few other early search engines are described below:
Architext was created by a group of six Stanford University students in February 1993. It later evolved into the search engine Excite, which was officially introduced in 1995 and worked by sorting online information based on the keywords found in it.
The World Wide Web Wanderer, later called Wandex, debuted in June 1993 under the leadership of Matthew Gray.
Aliweb was launched in October 1993 by Martijn Koster, and permitted the submission of webpages by their owners.
December 1993 saw the unveiling of three search engines – World Wide Web Worm, RBSE spider and JumpStation– that used web robots to crawl various sites.
1994 was a big year that witnessed the launch of four popular search engines, namely Yahoo, Alta Vista, Lycos and Infoseek.
AskJeeves was introduced to the world in April 1997 and later came to be known as Ask.com.
The domain Google.com was registered in September 1997.
Google Reigns Supreme
In 1998, Sergey Brin and Lawrence Page, the creators of Google, published a paper titled ‘The Anatomy of a Large-Scale Hypertextual Web Search Engine’ as part of their research project while studying at Stanford University. In it, they wrote ‘the predominant business model for commercial search engines is advertising. The goals of the advertising business model do not always correspond to providing quality search to users.’ Many consider this a history-making moment, as they went on to elaborate on PageRank, the technology Google employed to judge how relevant a webpage was to a search query and rank it accordingly. Notably, it did so based on content quality and not just the search keyword.
Although Yahoo was the result of a campus trailer project by creators David Filo and Jerry Yang, it soon gained prominence as a directory of useful sites and internet bookmarks. Web publishers would submit their webpages for scrutiny so that Yahoo would index them and make them available to searchers everywhere. In 2000, Yahoo committed a strategic blunder by striking up a partnership with Google to manage its organic search. This led to every search result bearing the ‘Powered by Google’ tag, thereby catching the eye of many a Yahoo user. Thus, Yahoo helped Google script its success story by giving it an early launchpad at the turn of the century.
It is important to note here that there were many reasons for Google’s undeniable success. Until 2000, websites were ranked based on many now-archaic and inadequate practices of bread-crumbing (weblinks that indicate your site’s structure), and on-page content, among others. Google, however, analysed on-page and off-page activities side by side before deciding the rank of a webpage in its SERP. SEO professionals around the globe misinterpreted this and considered links to be the end-all of gaining a good spot in Google’s SERPs. Link-building became a rampant black-hat tactic that was addressed by Google in the coming years. The Google Toolbar was launched as an additional feature on the Internet Explorer and showed web publishers their PageRank score, a measure of how important a webpage was.
Besides the aforementioned differentiators that made it a legendary information retriever, Google has always had a reputation for introducing frequent and game-changing updates, the most major of which are the following:
2005’s Jagger update ensured that anchor text was no longer as important a factor in determining a webpage’s rank. It also helped Google make great headway in stemming the exchange of random, unsought links.
2011’s Panda update was introduced to quash the problem of content farms, i.e. websites that churned out poor-quality, unoriginal and auto-generated content and made money from ads. This update deserves special mention as, after many changes over an extended period of time, it was incorporated into Google’s main algorithm.
2012 saw the release of the Penguin update, aimed at tackling dubious, spammy measures that included suspicious linking patterns within websites coupled with anchor text that contained keywords that web publishers intended to rank highly for.
The Big Daddy update, rolled out in 2005-2006, shed more light on weblinks and the relationships between sites.
Google’s world dominance has irked many a search competitor throughout its long epoch. Yahoo, not one to stay beaten after the previous debacle of 2000, tried to undo its misstep by collaborating with Microsoft through a ten-year alliance in 2009. In a bid to loosen Google’s grip on approximately 70% of the American search market, Yahoo decided to have its paid and organic search fuelled by Bing, a derivative of Microsoft Live Search. This however did nothing to shake Google’s numero uno status and only strengthened Bing’s position as the second most popular search engine in the world. Additionally, the ten-year Microsoft and Yahoo partnership ended up being revisited midway, after just five years from its start.
A Few Important Dates
The evolution of Google gave rise to changes in the tactics used by SEO experts around the world to achieve high search rankings. The following are some of the landmark moments in the history of SEO:
In 1997, the manager of the Jefferson Starship, a popular rock music band, was not too thrilled that their official website was not ranking on the first SERP. This was when the term ‘search engine optimisation’ was supposedly coined. Alternatively, it is believed that John Audette, the founder of MMG (Multi Media Marketing Group), while meeting with Danny Sullivan to convince him to join his company, first used the term. The latter conceived the idea of Search Engine Watch, an SEO news-provider and trend-spotter. A decade later, he founded the very well-known publication, Search Engine Land, after walking away from Search Engine Watch.
1998 witnessed the birth of Goto.com, which allowed website owners to bid on the space above the organic search results that were generated by Inktomi. These paid search and sponsored links were also shown adjacent to and below the organic search results. Yahoo finally absorbed Goto.com. This was also the year that MSN entered the search arena with MSN Search.
In 1999, the first official search marketing event was conducted as part of the Search Engine Strategies (SES) conference.
Google acquired YouTube in 2006 for $1.65 billion. The platform is hugely popular, with more than a billion users overall. This led SEO experts to delve deep into video optimisation techniques and brands to use the video-streaming platform to build a strong online voice. Google Webmaster Tools (now called Google Search Console) and Google Analytics were also launched around the same time, introducing web publishers to a whole new world of possibilities. With them, they could view the search keywords for which their sites ranked highly, errors in crawling or indexing, user session length, bounce rate and much more.
In 2000, Google AdWords was launched. This was also a time when webmasters around the world grew accustomed to the ‘Google Dance’, a period in which the search engine would release major updates to its indexing algorithm, leading to many shakeups in search rankings.
In 2003, Google took over Blogger.com and started dishing out contextual Google AdWords ads on various publisher websites through its new service, Google AdSense. Although it shaped the blogger revolution of the early 2000s, this plan of action was not a fool-proof one as many websites with mediocre and sometimes even plagiarised content started mushrooming overnight just to garner AdSense revenue.
In 2004, local search and personalisation became major trends, with search results bearing a geographic intent. Users’ previous search patterns and history coupled with their interests allowed Google to custom-make the SERP for every individual, meaning that two users would not see the same search results even if they had entered the same search query.
In 2005, no-follow tags, meant to push back spammy linking, were invented as a means to better shape PageRank.
2007 saw the birth of Google’s universal search wherein, instead of just plain, blue search results on the SERP, other fascinating features like news, images and videos were also included.
Google Instant was introduced in 2010 – it offered users relevant search query suggestions whenever they would type into the search bar. This was also when Google made it known that the speed of a website was a crucial ranking determinant.
Google’s Knowledge Graph was unveiled in 2012, resulting in a major step forward in understanding search intent. With it, the internet’s billions of webpages serve as Google’s knowledge database, from which it draws the most relevant information in the form of knowledge carousels, boxes and similar features.
A new Google algorithm update, the Hummingbird, was introduced in 2013 with an intent to redefine natural language or conversational search for mobile devices. This has been hailed as the biggest update to Google’s search algorithm since 2001.
In 2015, the number of mobile-only searchers eclipsed the number of desktop-only users. This was mainly due to the steady climb in mobile phone users who used their smart devices to retrieve useful information while on the move. The availability of robust wireless service providers was another added plus for such people. Thus, Google found it crucial to introduce a search algorithm update that would be mobile-friendly. In the same year, Google acknowledged that RankBrain was a major component of its main search algorithm. It is interesting to note here that this was perhaps the beginning of the AI (Artificial Intelligence) phase for Google as RankBrain uses Machine Learning to understand how it can provide the most relevant search results to user queries.
In 2016, Google launched AMP (Accelerated Mobile Pages), which loads content almost instantaneously. AMP has been extensively adopted by media houses.
The Past, Present and Future of SEO
Many search engines have risen and bitten the dust over the years, unlike Google which seems to be going from strength to strength. Here is a comparison of SEO tactics through the ages and the impact they have had on the way search works across devices.
All Hail the King, Content!
While the 1990s saw the mindless stuffing of various important keywords into webpage content coupled with backend optimisation, these techniques soon came to be frowned upon. This was because Google was devising smarter ways to produce genuine, useful and relevant search results. Today, search intent is of top priority, with Google’s latent semantic indexing techniques producing contextually relevant search results based on user search history and preferences, device, location, etc. Images and videos, which were earlier thought to be unimportant, have now gained prominence, with high-quality images and video content helping webpages rank better on Google.
Social Media Marketing
The social media boom took over the internet only towards the early 2000s, gaining momentum as the years swung by. Today, a strategically-placed Facebook or Instagram post can get you an immediate jump in the number of website visitors. Twitter, LinkedIn and Quora have helped boost online sales and create a stronger, clearer brand voice. Even YouTube has now launched a social feature where one can connect with friends and acquaintances on the video-streaming platform. Having a large social media following or being acknowledged by/getting a shout out from those with an army of followers is everything. However, the emphasis on creating engaging, shareable content that is a value-add to various users remains as strong as ever.
Machine Learning
Machine Learning, a branch of Artificial Intelligence, has recently risen to prominence. It uses data from past experiences or events to recognise patterns and perform various tasks without the need for explicit programming. Sundar Pichai, Google’s current CEO, has laid great emphasis on AI, stating that search will be completely driven by it in the future. With many users opting for digital personal assistants like Amazon’s Alexa or Apple’s Siri, chatting through chatbots, and searching using images or even video, the need for intelligent, contextual search is greater than ever before. That can only happen with the deeper integration of AI and ML into Google’s core search algorithm.
Mobile Search
In 2015, comScore released a report showing that the number of mobile-only users had overtaken the number of desktop-only users. Mobile-first indexing is quickly changing how websites are crawled by Google, with plans to index all new websites this way in the near future. Local SEO can help users discover the businesses closest to them. This means that web publishers and developers alike must pay attention to the mobile responsiveness of their websites, a relatively recent development driven by the unprecedented increase in smartphone users over the past few years. In fact, websites without proper mobile optimisation may see their positions in SERPs (search engine results pages) decline.
Keywords and Link-building
New search features like Google’s Featured Snippet, Knowledge Graph, etc. have redefined the way information is discovered and used online. They have also given rise to more intense competition among websites to feature in the top spot on the SERP and grab the user’s attention. While SEO was all about focusing on the ranking of a webpage in the past, today, it is about so much more, including how users engage with a brand online. While SEO experts focused on singular keywords and spammy backlinks coupled with increased tagging and comments earlier, now, the onus lies on keyword intent, long tail searches (specific multi-word keyword/search phrases) and quality backlinks founded on relationship building. The birth of the first blogs in the late 1990s is of great significance as popular bloggers have since helped spread the word about new-fangled online businesses of their liking, convincing legions of followers to visit their website with well-placed backlinks inside appealing content.
The Future of SEO
In the past, a single, small search algorithm change would take long to implement, allowing many black-hat SEO techniques to help a webpage’s search rankings. Google is now vastly more vigilant and intelligent, constantly evolving to foster the steady growth of clean SEO tactics. Personalization is the name of the game, with users desiring more contextual, wholesome search results with the least effort from their side. Google might even use external platform data to accomplish this objective, something website owners can prepare for by optimising in-app content to make it more user-friendly and tailored to suit user intent. Google will recognise patterns by accumulating data from different digital platforms and social media channels, making staying consistent across all of them with a single brand voice a wise move. Featuring well-optimised content, especially high-resolution images and videos, across all devices and in all forms (we are talking local search, mobile sites and voice search) is prudent. Reaching out to your customer base for valuable feedback on your online presence and communicating clearly with content creators must be your prerogative.
Skoltech scientists break Google's quantum algorithm
Google is racing to develop quantum enhanced processors that utilize quantum mechanical effects to one day dramatically increase the speed at which data can be processed.
In the near term, Google has devised new quantum-enhanced algorithms that operate in the presence of realistic noise. The so-called quantum approximate optimisation algorithm, or QAOA for short, is the cornerstone of a modern…
Quantum computing explained: harnessing particle physics to work faster
In 2013, the D-Wave Two was reported in one test as performing 3,600 times faster than a standard PC. Once again, however, these results were dismissed by several prominent scientists in the field. In 2014 Matthias Troyer, a renowned professor of computational physics, published a paper reporting that he found “no evidence of quantum speedup”.
A longtime doubter of D-Wave’s claims is Scott Aaronson, a professor at MIT, who has referred to himself as “Chief D-Wave Skeptic”. After Troyer’s paper, he argued that even though quantum effects were probably taking place in D-Wave’s devices, there was no reason to believe they played a causal role or that the machines were faster than a standard computer.
Brownell is dismissive of these critics, claiming that the “question has to a significant extent been settled”. He cites Google’s comparative test last year, in which its D-Wave quantum computer solved certain problems 100 million times faster than a standard computer.
“If it isn’t quantum computing,” asks Brownell, “then how did we build something that’s a hundred million times faster than an Intel core? It either has to be quantum computing or some other law of nature that we haven’t discovered, which would be even more compelling than quantum mechanics. I challenge any scientist in the world to tell us: if it’s not quantum annealing, what is it?”
Even Aaronson acknowledged that the Google test was significant. “That is certainly the most impressive demonstration so far of the D-Wave machine’s capabilities. And yet,” he added, “it remains entirely unclear whether you can get to what I’d consider ‘true quantum speedup’ using D-Wave’s architecture.”
But Troyer was not convinced. “You have to read the fine print,” he said. “That is 10^8 times faster than some specific classical algorithm on problems designed to be very hard for that algorithm but easy for D-Wave… A claim of a 10^8 speedup is therefore very misleading.”
One side-benefit of all these claims and counter-claims, as Aaronson has most forcefully argued, is that they help us understand quantum mechanics a little better. Nic Harrigan works at the Centre for Quantum Photonics at the University of Bristol, a leading research institute in quantum mechanics.
“Although there are notable potential applications of quantum computing,” Harrigan says, “even if no one ever builds a useful quantum computer we still learn an enormous amount by trying to. This could sound like hedging, but quantum mechanics itself is a theory so essential to our understanding of the universe, and the seed of so many current and future technologies, that anything we can do to understand it better is valuable. It turns out that a good way to try to understand what is happening in quantum mechanics (and how it differs from conventional classical physics) is to consider which kinds of computational problems one can more easily solve using quantum mechanical systems.”
At Google, they have been cautiously positive about D-Wave’s usefulness. The head of engineering, Hartmut Neven, outlined the strengths and weaknesses of the tests and noted that, even though there are other algorithms that could outperform quantum annealing if deployed on traditional computers, he expects future developments to favour quantum annealing. “The design of next-generation annealers should facilitate the embedding of problems of practical relevance,” he said.
The types of problems that quantum annealing might help address are all concerned with what is known as optimisation – finding the most efficient configuration in complex systems.
“Optimisation sounds like a patently uninteresting problem,” says Brownell, “but it’s at the core of so many complicated application problems in every field. Probably one of the most exciting is the artificial intelligence world. Say you’re trying to recognise a water bottle. It still takes computers an enormous amount of time to do this, and they don’t do it as well as people. Computers are catching up, but quantum computing can help accelerate that process.”
He cites genomics, economics and medicine as other areas rich with optimisation problems. With conventional computers, creating complex models – including, for example, the Monte Carlo simulations used in the finance industry to examine different interest rate scenarios – requires a significant amount of computing power. And computing power calls for real power.
“You go to these big internet properties, and they have data centres set up next to hydroelectric plants because they consume so much electricity,” says Brownell. “They’re the second-largest consumers of electrical power on the planet.”
D-Wave’s vision, he says, is for a green revolution in computing, in which everybody could have access to far more energy-efficient quantum computers via the cloud. In a few years, he thinks, we’ll be able to access quantum computing from our phones.
“I think we have the opportunity to create one of the most valuable technology businesses in history,” says Brownell. “I recognise that sounds admittedly grand, but based on the capability we’ve built, we’re positioned to be the dominant player in quantum computing for decades to come.”
Well, any self-respecting CEO might say that. However, it does appear that D-Wave is currently leading the quantum computer race. Where that race is going, what it involves and how many universes it’s taking place in are, however, questions that we’ll probably need a working quantum computer to answer.
Around the world, groups of scientists are working on the next technological revolution: quantum computing. But what makes it so special? And why do we need it? We ask physicist Dr Ruth Oulton of the University of Bristol to explain.
In a regular PC, information is stored as bits. How is it different in a quantum computer? A typical computer has bits, and each bit [is either] 0 or 1. A quantum computer has quantum bits. These are made from quantum particles that can be 0, 1, or some state in between – [in other words they can have both values] at the same time.
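In the standard notation (not spelled out in the interview), a qubit's state is a superposition of the two basis states:

$$|\psi\rangle = \alpha\,|0\rangle + \beta\,|1\rangle, \qquad |\alpha|^2 + |\beta|^2 = 1.$$

Measuring the qubit returns 0 with probability $|\alpha|^2$ and 1 with probability $|\beta|^2$, and a register of $n$ such qubits can be in a superposition over all $2^n$ bit strings at once, which is the "both values at the same time" picture described above.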
So a quantum bit is made from a single particle?
It can pretty much be any fundamental particle, so it can be a photon or an electron, or it may be a nucleus, for example. It is a particle that can have two different properties [at once]. [For example], the particle may be in one place and another place at the same time.
How does this help with computing? In a regular PC, a particular calculation might have to run through all the different possibilities of zeros and ones. Because a quantum computer can be in all of the states at the same time, you just do one calculation [testing a vast number of possibilities simultaneously]. So it could be far quicker.
What's the biggest challenge?
You need excellent control over individual particles. You can't just shove [all the particles] together, because they would interact with each other [in an unpredictable way]. You need to be able to trap and direct them, but when the particles interact [with the trap itself] it makes them lose their information, so you need to make sure that you design the trap well.
What are the applications?
The biggest and most crucial one is the ability to factorise a very large number into two prime numbers. That's really important because it is what nearly all encryption for internet computing is based on. A quantum computer should be able to do this extremely fast to recover the prime numbers, which will mean that basically anything protected with [that] encryption can be decrypted. If you were to do it with the classical computers we have now, it would take longer than the age of the universe.
Are there other scientific uses?
Calculating the positions of individual atoms in large molecules like polymers and viruses. The way the particles interact with each other – there are so many different possibilities that it is commonly said you can't calculate anything properly [with] more than about ten atoms in the molecule. So if you have a quantum computer, you could use it to develop drugs and understand how molecules work a bit better.
Are there commercial quantum computers?
There is a commercial computer available, but it is very expensive ($10m), it has very limited computing power, and it hasn't yet been verified by anybody externally [as to] what it is really doing.
Will quantum computers look like our desktops and laptops do now?
We're redesigning the computer. The very first quantum computers will probably fill a room. It'll take us some time to get to desktop-sized machines. Most likely, what will happen [is] you're going to have a hybrid computer with a quantum chip and a classical chip.
The yearly Royal Society Summer Science Exhibition shows off the best of British science, highlighting the place of scientific innovation at the heart of our culture and of our economic well-being.
The exhibition dates back to the early nineteenth century, when the Royal Society's president invited guests to his home to look at collections of scientific instruments and other objects illustrating the latest research.
These days it is an exhibition with a huge range of events, and in this and next week's podcast we'll be looking at four of them.
This week we are going to explore the effect maths and logic have had on modern computing, and whether quantum computing is a realistic prospect.
Ian Sample is joined down the line by Vlatko Vedral, Professor of Physics at the University of Oxford. In the studio are Patrick Fitzpatrick, emeritus professor of mathematics at University College Cork, the Guardian's science correspondent Hannah Devlin, and Phil Oldfield, our British Science Association media fellow.
Patrick Fitzpatrick was speaking at the Royal Society alongside Emanuele Pelucchi, head of a Science Foundation Ireland Principal Investigator grant group at the Tyndall National Institute, University College Cork.
Craig Gidney Quantum Leap: Reduced Qubits And More Reliable
A Google researcher reduces the quantum resources needed to hack RSA-2048.
Google Quantum AI researcher Craig Gidney discovered a way to factor 2048-bit RSA numbers, a key component of modern digital security, with far less quantum computer power. His latest research shows that fewer than one million noisy qubits could finish such a task in less than a week, compared to the former estimate of 20 million.
The Quantum Factoring Revolution by Craig Gidney
In 2019, Gidney and Martin Ekerå found that factoring a 2048-bit RSA integer would require a quantum computer with 20 million noisy qubits running for eight hours. The new method allows a runtime of less than a week and reduces qubit demand by 95%. This development is due to several major innovations:
Approximate residue arithmetic: Techniques from Chevignard, Fouque, and Schrottenloher (2024) simplify modular arithmetic and reduce the computation required.
Yoked surface codes: Gidney's 2023 work with Newman, Brooks, and Jones showed that idle logical qubits can be stored more compactly, improving qubit utilisation.
Cheaper magic states: Building on Gidney, Shutty, and Jones (2024), the method minimises the resources needed for magic state distillation, a vital stage in quantum computation.
These advancements improve Gidney's algorithm's efficiency without sacrificing accuracy, reducing Toffoli gate count by almost 100 times.
Cybersecurity Effects
Secure communications, including private government conversations and internet banking, rely on RSA-2048 encryption. The fact that it could be compromised with fewer quantum resources than previously thought makes the switch to quantum-resistant cryptography more urgent.
No working quantum computer can yet run this technique, but research suggests such machines may arrive sooner than expected. This possibility highlights the need for proactive cybersecurity planning.
Expert Opinions
Quantum computing experts regard Craig Gidney's contribution as a turning point: it offers a method for factoring RSA-2048 with far more modest quantum resources, helping to bridge theory and practice.
Experts advise against immediate panic. Today's quantum technology cannot yet perform such complex tasks, and major engineering challenges remain. The report nevertheless reminds cryptographers to accelerate the development and adoption of quantum-secure methods.
Improved Fault Tolerance
Craig Gidney's technique is innovative in its tolerance for faults and noise. This new approach can function with more realistic noise levels, unlike earlier models that required extremely low error rates, which quantum technology often cannot provide. This brings theoretical needs closer to what quantum processors could really achieve soon.
Balancing Circuit Width and Depth
Gidney optimised quantum circuit width (qubits used simultaneously) and depth (quantum algorithm steps). The method balances hardware complexity and computing time, improving its scalability for future implementation.
Timeline for Security Transition
This discovery accelerates the inevitable transition to post-quantum cryptography (PQC) but does not pose an immediate threat to existing encryption. Governments and organisations should nonetheless begin adopting quantum-resistant PQC standards now.
Global Quantum Domination Competition
This development highlights the global quantum technological competition. The US, China, and EU, who invest heavily in quantum R&D, are under increased pressure to keep up with computing and cryptographic security.
In conclusion
Craig Gidney's invention challenges RSA-2048 encryption theory, advancing quantum computing. This study affects the cryptographic security landscape as the quantum era approaches and emphasises the need for quantum-resistant solutions immediately.
Quantum Annealing Correction Tackles Spin-Glass Problems
Scalable Spin-Glass Optimisation Benefits from Quantum Annealing Correction
Quantum annealing, a computational approach that uses quantum evolution to find low-energy states, has long shown promise for tackling optimisation problems. In practice, however, noise and decoherence have hindered solutions, limiting scalability and performance. Recent pioneering work on Quantum Annealing Correction (QAC) has shown that quantum annealers can scale better than the best conventional heuristic techniques for several hard problems, demonstrating the first algorithmic quantum speedup in approximate optimisation.
Quantum Annealing Correction is an error-suppression approach developed to improve the performance and robustness of quantum annealing. It works by directly incorporating a bit-flip error-correcting code, reinforced with energy penalties, into the annealing process. In the QAC encoding, three physical “data qubits” represent a single logical qubit, and each logical qubit is coupled to an additional “energy penalty qubit” with a defined coupling strength, $J_p$.
The logical qubit's state is recovered by a majority vote over its data qubits. The implementation uses the Pegasus graph of the D-Wave Advantage quantum annealer to create more than 1,300 error-suppressed logical qubits on a degree-5 interaction graph, allowing problems with 142 to 1,322 logical qubits to be solved.
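A minimal sketch of the majority-vote decoding step described above, assuming readouts are returned as ±1 spins and each logical qubit groups three data qubits (the sample values are invented):

```python
import numpy as np

def decode_logical_spins(data_spins: np.ndarray) -> np.ndarray:
    """Majority-vote decode QAC samples.

    data_spins has shape (num_logical, 3): three +/-1 data-qubit readouts per
    logical qubit. The logical value is the sign of their sum, which is never
    zero for three +/-1 spins.
    """
    return np.sign(data_spins.sum(axis=1)).astype(int)

# Example: two logical qubits; the second has one flipped data qubit that the vote corrects.
sample = np.array([[+1, +1, +1],
                   [-1, -1, +1]])
print(decode_logical_spins(sample))  # -> [ 1 -1]
```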
Quantum Annealing Correction demonstrated its benefits on 2D spin-glass problems with high-precision spin-spin interactions. Spin-glass instances are ideal for testing an algorithm's ability to navigate complex solution spaces because their rugged energy landscapes contain many local minima. The work focused on Sidon-28 (S28) disorder, which requires precise interaction values; such instances are especially susceptible to analogue coupling errors, or “J-chaos”, so QAC's error-suppression capabilities were expected to help.
Quantum Annealing Correction was compared to Parallel Tempering with Isoenergetic Cluster Moves (PT-ICM), the best classical heuristic solver for spin-glass problems. By simulating several replicas of the system and periodically swapping states between different temperatures, PT-ICM escapes local minima and improves optimisation efficiency. Performance was measured using the time-to-epsilon (TT$\epsilon$) metric, which records how quickly a solver reaches an approximate answer within a given error tolerance of the optimum.
Using this metric, the researchers tuned noise levels and annealing schedules to boost performance. They show that, with Quantum Annealing Correction, quantum annealing scales better than PT-ICM, especially when targeting low-energy states with an optimality gap of at least 1.0%.
Quantum Annealing Correction also outperformed PT-ICM at a smaller optimality gap of 0.85%, scaling best among the quantum approaches. All quantum approaches, including QAC, C3, and their fast-schedule counterparts, reduced algorithmic runtime by four orders of magnitude compared to PT-ICM, the study found. Because absolute runtimes depend on processor speeds, this raw speedup was not the basis for claiming a robust scaling advantage.
QAC works because it suppresses errors better than classical repetition coding (C3). The baseline C3 technique encodes problems on the logical Quantum Annealing Correction graph with the penalty coupling deactivated ($J_p = 0$), creating three parallel, uncoupled copies of the problem instance from which quantum annealing samples are drawn. Although C3 provides some basic parallelism, the findings confirm past studies on the effect of analogue coupling flaws (“J-chaos”) on quantum annealing performance: QAC consistently outscales C3.
These defects demonstrate the importance of improved error correction and suppression, especially in high-precision settings like the S28 spin-glass. A Kibble-Zurek (KZ) analysis of dynamical critical scaling also supported QAC's efficacy. The KZ exponent for Quantum Annealing Correction (with a penalty strength of 0.1) was $\mu_{QAC} = 5.7 \pm 0.10$, significantly lower than C3's $\mu_{C3} = 7.79 \pm 0.26$, indicating suppression of diabatic excitations. By promoting adiabatic dynamics at equal annealing durations, QAC reduces diabatic errors and J-chaos, improving TT$\epsilon$ and reducing optimal annealing times.
Using up to 1,322 logical qubits in an error-corrected setting, this demonstration marks a step towards quantum advantage. Tunnelling is a likely cause of the observed speedup, although more investigation is needed. The result suggests how quantum annealing could eventually tackle optimisation challenges across many industries that were previously intractable.
Beyond finite-range and two-dimensional problem families, quantum optimisation must apply this hardware-scalable advantage to densely coupled challenges and achieve efficacy at increasingly smaller optimality gaps. This groundbreaking study shows how crucial sophisticated error correction and suppression strategies are to maximise the potential of current and future quantum annealing technology.
What Is the NISQ Era? Its Characteristics and Applications
The term Noisy Intermediate-Scale Quantum (NISQ), coined by physicist John Preskill in 2018, describes the current era of quantum computing. Although NISQ devices can perform quantum operations, noise and errors limit their capabilities.
Defining the NISQ Era
NISQ devices typically have tens to several hundred qubits, although some have more. Atom Computing announced a 1,180-qubit quantum processor in October 2023, crossing the 1,000-qubit mark, and IBM's Condor also exceeds 1,000 qubits, although sub-1,000-qubit processors remain common in 2024.
Characteristics
Key features of NISQ systems include:
Short coherence times: Qubits' quantum states last only a short time.
Noisy operations: Hardware and environmental noise can introduce errors in quantum gates and measurements; these computers are affected by quantum decoherence and sensitivity to their environment.
Limited error correction: Lacking the resources for full quantum error correction, NISQ devices cannot continuously detect and repair errors during circuit execution.
Hybrid algorithms: NISQ methods often pair the quantum device with a classical computer that handles part of the computation and compensates for device constraints, as in the sketch below.
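A minimal sketch of that hybrid pattern follows. The "quantum" expectation value here is a classical toy function so the example runs anywhere; on a real NISQ device it would be replaced by building a parameterised circuit, executing shots, and averaging the measured cost.

```python
import numpy as np

def energy_expectation(params: np.ndarray) -> float:
    """Stand-in for running a parameterised circuit and measuring a cost observable.

    On hardware this step would build the circuit with `params`, execute many
    shots, and average the measured cost; here it is a toy landscape.
    """
    return float(np.cos(params[0]) + 0.5 * np.sin(2 * params[1]))

# Classical outer loop: finite-difference gradient descent over the circuit parameters.
params = np.array([0.1, 0.2])
lr, eps = 0.2, 1e-3
for _ in range(100):
    grad = np.array([
        (energy_expectation(params + eps * np.eye(2)[i]) -
         energy_expectation(params - eps * np.eye(2)[i])) / (2 * eps)
        for i in range(2)
    ])
    params -= lr * grad  # classical update between quantum evaluations
print("optimised parameters:", params, "energy:", energy_expectation(params))
```

The quantum device is only queried inside the loop; everything else, including the optimiser, runs classically, which is what makes these algorithms practical on noisy hardware.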
Situation and Challenges
Even though quantum computing has moved beyond the lab, commercially available quantum computers still have substantial error rates. Because of this intrinsic fallibility, some analysts expect a ‘quantum winter’ for the industry, while others believe technological issues will constrain the sector for decades. Despite advances, NISQ machines are typically no better than classical computers at broad classes of problems.
NISQ technology has several drawbacks:
Error Accumulation: Rapid error accumulation limits quantum circuit depth and complexity.
Limited Algorithmic Applications: NISQ devices cannot provide fully error-corrected qubits, which many quantum algorithms require.
Scalability Issues: Increasing qubits without compromising quality is tough.
Costly and Complex: NISQ device construction and maintenance require cryogenic systems and other infrastructure.
It is unclear if NISQ computers can provide a demonstrable quantum advantage over the finest conventional algorithms in real-world applications. In general, quantum supremacy experiments like Google's 53-qubit Sycamore processor in 2019 have focused on problems difficult for conventional computers but without immediate practical applicability.
Developments
Despite the challenges, progress is being made. Current research focuses on qubit performance, noise reduction, and error correction. Google has demonstrated that Quantum Error Correction (QEC) works in practice as well as in theory, and Microsoft researchers reported a dramatic decrease in error rates using four logical qubits in April 2024, suggesting large-scale quantum computing may be viable sooner than thought.
Chris Coleman, a condensed matter physicist and Quantum Insider consultant, says advancements are driven by dynamic compilation strategies that make quantum systems easier to use, together with innovations in supporting systems such as cryogenics, optics, and control and readout electronics.
Applications
NISQ devices enable helpful research and application exploration in several fields:
Quantum Chemistry and Materials Science: Simulating chemical processes and molecular structures to improve catalysis and drug development. Quandela innovates NISQ-era quantum computing by employing photonics to reduce noise and scale quantum systems.
Optimisation problems: Managing supply chains, logistics, and finance. The Quantum Approximate Optimisation Algorithm (QAOA) and the Variational Quantum Eigensolver (VQE), which use hybrid quantum-classical methods, are designed for NISQ devices to produce useful results despite noise.
Quantum Machine Learning: Using quantum technologies to process huge datasets and improve predictive analytics.
Simulation of quantum systems for basic research.
Although they cannot crack public-key encryption, NISQ devices are used to study post-quantum cryptography and quantum key distribution for secure communication.
Cloud platforms are making many quantum systems accessible, increasing basic research and helping early users find rapid benefits.
To Fault-Tolerant Quantum Computing
The NISQ period may bridge noisy systems and fault-tolerant quantum computers. The goal is to create error-corrected quantum computers that can solve increasingly complex problems. This change requires:
Improved qubit coherence and quality: Longer coherence times and lower quantum gate error rates for more stable qubits.
Improved quantum error correction: Development of effective and scalable QEC codes; fault-tolerant quantum computers may require millions of physical qubits to encode a much smaller number of logical qubits.
Greater scale: Far more qubits than the tens to hundreds found in NISQ devices.
New Qubit Technologies: Studying topological qubits, used in Microsoft's Majorana 1 device and designed to be more error-resistant.
As researchers develop fault-tolerant systems, observers expect the NISQ period to persist for years. Early fault-tolerant machines may exhibit scientific quantum advantage in the coming years, with comprehensive fault-tolerant quantum computing expected in the late 2020s to 2030s.
In conclusion, NISQ computing faces difficult technical challenges, but it is also a rapidly evolving field driven by a dedicated community of academic and commercial specialists. Its advances lay the groundwork for quantum technology's revolutionary potential in the future.
Types of qubits And Applications of Quantum Processing Units
This article covers quantum processing unit applications, structure, qubit types, and more.
The “brain” of a quantum computer is the Quantum Processing Unit (QPU). This cutting-edge device solves complex problems using qubits and quantum physics. QPUs use qubits, which can be 0, 1, or a mixture of both, whereas traditional computers employ binary bits. QPUs handle data differently from classical computers because of quantum principles like entanglement, decoherence, and interference.
QPU Structure and Function
Two key components make up a QPU:
Quantum Chip: This semiconductor base has numerous layers etched with superconducting components. These components make up physical qubits.
Control Electronics: These handle and amplify control signals, control and read qubits, and address decoherence-causing interference. They have standard CPU components for data exchange and instruction storing.
Dilution refrigerators that freeze the quantum chip to near absolute zero—colder than space—are needed for qubit coherence. Traditional computing equipment and control circuits can be stored in racks close to the refrigerator at normal temperature. The whole quantum computer system, including cryogenic systems and other classical components, may be the size of a four-door car.
Quantum logic gates in QPUs transform qubit states mathematically, unlike binary logic gates. Even though they can tackle problems that classical computing cannot, QPUs are much slower than CPUs in raw computation speed. For certain problem classes, however, they compute far more efficiently, which can reduce overall calculation time.
Types of Qubit
Quantum processors’ quantum technologies vary, showing the variety of quantum computers in development. Qubits are usually made by manipulating quantum particles or building systems that approximate them. Different modalities include:
Neutral atoms: Cold, laser-controlled neutral atoms held in vacuum chambers, noted for scalability and the ability to execute many operations.
Superconducting qubits: Low-temperature superconducting qubits are preferred for speed and precise control; IBM QPUs employ solid-state superconducting qubits.
Trapped ions: Trapped-ion qubits offer high-fidelity measurements and long coherence times.
Quantum dots: Trapping an electron in a quantum dot, a tiny semiconductor structure, creates a qubit; these are compatible with semiconductor manufacturing and scalable.
Photons: Light particles used in quantum communication and cryptography, notably long-distance quantum information transfer.
QPU manufacturer design direction and computing requirements often determine qubit modality. All known qubits require a lot of hardware and software for noise handling and calibration due to their extraordinary sensitivity.
Quantum Processing Unit Applications
QPUs promise advances in many vital industries and are ideally suited for unsolved problems. Important uses include:
Combinatorial optimisation challenges: These problems become much harder to tackle as they grow. Rydberg states of neutral atoms may help solve such optimisation problems.
Pharmaceuticals and quantum chemistry: Accelerating medication development and chemical byproduct studies by simulating molecular and biological reactions.
Artificial Intelligence (AI) and Machine Learning (ML): Quantum algorithms may speed up Machine Learning and help AI investigate new techniques by analysing enormous volumes of classical data.
Materials Science: Studying physical matter to solve problems in solar energy, energy storage, and lighter aviation materials.
Cryptanalysis: Integer factorisation could eventually undermine public-key cryptosystems.
Random number generation: Quantum random number generation (RNG) is being commercialised for AI and cybersecurity applications.
In quantum cryptography, new cryptographic algorithms are developed to improve data security.
Simulation of complex quantum particle systems to predict their behaviour before physical design.
Present and Future Availability
QPU development is accelerating in 2025 as workloads outgrow what traditional computing can handle. Tech giants and specialists including D-Wave Systems, Google, IBM, Intel, IQM, Nvidia, QuEra, Pasqal, and Rigetti Computing are developing QPUs. IBM has achieved “quantum utility” (reliable, accurate outputs beyond brute-force classical simulation) and is pursuing “quantum advantage” (outperforming classical supercomputing).
However, serious challenges remain. Early QPUs have low qubit coherence and significant error rates due to noise. Scalability constraints limit useful uses. Software tools for building, testing, and debugging quantum algorithms can also be improved.
Commercial QPUs are appearing, but they may take time to become generally available. Because of their environmental requirements, which include powerful refrigeration systems, vacuums, and electromagnetic shielding to chill qubits close to absolute zero, QPUs will likely be operated only by government labs and large public cloud companies offering quantum computing as a service. And because their specialised computing capabilities are not needed for everyday tasks, QPUs are unlikely to be integrated into phones or home PCs.
Qoro Quantum And CESGA For Distributed Quantum Simulation
Qoro Quantum
Qoro Quantum and CESGA have demonstrated distributed quantum circuit simulation on high-performance computing. Using Qoro Quantum's orchestration software and CESGA's CUNQA emulator, a pilot study showed scalable, distributed quantum circuit simulations across 10 HPC nodes. To assess distributed VQE and QAOA implementations, Qoro's Divi software built and scheduled thousands of quantum circuits for simulation on CESGA's infrastructure.
VQE and QAOA workloads finished in less than a second, demonstrating that high-throughput quantum algorithm simulations may be done with little code and efficient resources.
The pilot proved that distributed emulators like CUNQA can prepare HPC systems for large-scale quantum computing deployments by validating hybrid quantum-classical operations.
A pilot study from the Galician Supercomputing Centre (CESGA) and Qoro Quantum shows how high-performance computing platforms can support scalable, distributed quantum circuit simulations. According to a Qoro Quantum release, the two-week collaboration involved deploying Qoro's middleware orchestration platform to execute distributed versions of the variational quantum eigensolver and the quantum approximate optimisation algorithm across CESGA's QMIO infrastructure.
Quantum Workload Integration and HPC Systems
Qoro's Divi quantum application layer automates hybrid quantum-classical algorithm orchestration and parallelisation. Divi created and ran quantum workloads on 10 HPC nodes using CESGA's CUNQA distributed QPU simulation framework for the pilot.
The announcement states that CESGA's modular testbed CUNQA mimics distributed QPU settings with customisable topologies and noise models. Qoro's technology might simulate quantum workloads in a multi-node setup to meet the demands of emerging hybrid quantum-HPC systems.
Everything worked perfectly, communication went well, and end-to-end functionality worked as intended.
Comparing QAOA and VQE in Distributed HPC
VQE is a variational hybrid algorithm used to estimate the ground-state energy of quantum systems, a central problem in quantum chemistry. In this pilot, Qoro and CESGA modelled a hydrogen molecule using two ansätze, Hartree-Fock and Unitary Coupled Cluster Singles and Doubles. Divi generated 6,000 VQE circuits spanning 20 bond-length values.
Across 10 computational nodes, the CUNQA emulator explored the ansatz parameter space via Monte Carlo optimisation; Qoro says the full workload was simulated in 0.51 seconds. The data, collected automatically and returned for analysis, show that the platform can enable high-throughput experimentation with only 15 lines of Divi code.
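Conceptually, the batched sweep looks something like the sketch below. This is generic Python rather than Divi's actual API; the parameter count and sampling ranges are assumptions chosen only to mirror the circuit counts quoted above.

```python
import numpy as np

rng = np.random.default_rng(7)

bond_lengths = np.linspace(0.3, 2.2, 20)  # 20 H2 bond lengths (angstroms, illustrative range)
samples_per_length = 300                  # 20 * 300 = 6,000 circuits, as in the pilot
num_params = 3                            # number of ansatz angles (illustrative)

# Monte Carlo exploration: each "circuit" is one (bond length, parameter vector) pair.
workload = [
    {"bond_length": float(d), "params": rng.uniform(-np.pi, np.pi, num_params)}
    for d in bond_lengths
    for _ in range(samples_per_length)
]
print(len(workload), "circuits queued")   # -> 6000 circuits queued

# The orchestration layer then schedules each entry onto one of the distributed emulated
# QPUs; the lowest measured energy per bond length traces out the dissociation curve.
```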
The researchers also evaluated QAOA, a quantum-classical technique for Max-Cut and combinatorial optimisation. This data clustering, circuit design, and logistics challenge involves partitioning a graph to maximise edges between two subgroups.
For the simulation, a 150-node graph was partitioned into 15 clusters, and Qoro's Divi software built Monte Carlo parameterised circuits. Tests ran 21,375 circuits in 15.44 seconds and 2,850 circuits in 2.13 seconds, and the quantum-to-classical cut-size ratio improved from 0.51 to 0.65 as the sample size grew. The CUNQA emulator again ran all circuits in parallel on CESGA's infrastructure.
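The classical side of such a benchmark can be sketched as follows: split a 150-node random graph into 15 small clusters so each sub-problem stays simulable, then score a candidate cut by its cut-size ratio against a simple classical baseline. The partitioning scheme, the greedy baseline, and the random stand-in for QAOA samples are all assumptions for illustration, not the pilot's actual method.

```python
# Illustrative sketch (not the pilot's code): partition a 150-node random
# graph into 15 clusters and compute a cut-size ratio against a greedy baseline.
import networkx as nx
import random

random.seed(0)
G = nx.gnp_random_graph(150, 0.1, seed=0)
nodes = list(G.nodes())
clusters = [nodes[i::15] for i in range(15)]   # 15 clusters of 10 nodes each

def cut_size(graph, side):
    """Number of edges crossing between `side` and its complement."""
    return sum(1 for u, v in graph.edges() if (u in side) != (v in side))

def greedy_cut(graph):
    """Classical baseline: place each node on the side that cuts more edges
    to the nodes already placed."""
    side_a, placed = set(), set()
    for n in graph.nodes():
        a_nbrs = sum(1 for m in graph.neighbors(n) if m in placed and m in side_a)
        b_nbrs = sum(1 for m in graph.neighbors(n) if m in placed and m not in side_a)
        if b_nbrs >= a_nbrs:      # joining side A cuts the b_nbrs edges
            side_a.add(n)
        placed.add(n)
    return side_a

# A "quantum" cut would come from sampling QAOA circuits; a random cut stands
# in here so the ratio computation runs end to end.
quantum_cut = {n for n in nodes if random.random() < 0.5}
ratio = cut_size(G, quantum_cut) / max(1, cut_size(G, greedy_cut(G)))
print(round(ratio, 2))   # the pilot reports this ratio rising from 0.51 to 0.65
```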
Performance, Infrastructure, and Prospects
Several results from the pilot point to progress in scalable hybrid quantum computing. According to the Qoro Quantum release, Qoro's orchestration platform and CESGA's distributed quantum emulator communicated faultlessly between the simulated QPU infrastructure and the application layer. The collaboration also demonstrated how Qoro's Divi software can automatically generate and schedule large quantum workloads, simplifying complex quantum applications.
The experiment also showed that distributed execution of hybrid quantum-classical algorithms over several HPC nodes can improve performance without much manual setup. Finally, the pilot demonstrated the key technological building blocks for scaling quantum workloads in high-performance computing, insights that will inform future distributed quantum system design.
Simulating distributed quantum architectures shows how HPC infrastructure could manage future quantum workloads. Qoro Quantum and CESGA plan to refine this approach to bring quantum computing into large classical computing environments.
CUNQA is being developed as part of Quantum Spain with support from the EU and the Spanish Ministry for Digital Transformation. The QMIO infrastructure used in this project was funded by ERDF REACT-EU as part of the COVID-19 response.
#QoroQuantum#QuantumQoro#QAOA#CESGA#quantumcircuit#CUNQA#technology#TechNews#technologynews#news#govindhtech
0 notes
Text
Jones Polynomial: Quantum Hardware Performance Evaluation

A Tangled Benchmark: Jones Polynomial Quantum Hardware Scale Testing.
Quantum field theory and Jones polynomial
Quantinuum researchers devised an end-to-end quantum algorithm to estimate the Jones polynomial on the H2-2 quantum computer using hardware-aware optimisations and error-mitigation approaches.
The technique's Fibonacci braid representation and its focus on a DQC1-complete knot theory problem may give it a quantum edge over more generic BQP formulations.
The researchers generated topologically identical braids with known polynomial values to provide a verifiable baseline for error analysis across noise models and circuit sizes.
Their findings suggest that quantum techniques may outperform classical methods for cases with over 2,800 braid crossings and gate fidelities above 99.99%.
Where should we look for a quantum edge? Even as the technology changes, engineers keep returning to certain problems. Instead of trying to force a solution onto a quantum machine, we can save time and effort by seeking problems that are naturally connected to quantum physics, where an advantage is more likely to be found.
Topology is a good example. Quantinuum's approach emphasises quantum research on physics-native, demonstrable topics, opting to show progress step by step rather than signal an untestable future.
Over the last year, Quantinuum has led the field in gate fidelities (across all zones), logical qubits, the first topological qubit, simulation of the Ising model, validated RCS, and the first quantum processor beyond classical simulation. The new Jones polynomial work continues this trend, pairing theoretical complexity with hardware readiness.
In a new arXiv paper, Quantinuum researchers present an end-to-end quantum method for calculating the Jones polynomial of knots, a root problem in knot theory and a candidate site for discovering quantum advantage. The real-world implementation of this quantum-native problem on Quantinuum's H2-2 quantum computer showcases hardware-specific optimisations and algorithmic advances. The authors say this is more than a benchmark, since it provides a framework for carefully finding and assessing near-term quantum advantage.
Knot Invariants to Quantum Circuits
The Jones polynomial is a topological invariant: it assigns a polynomial to each knot or link that remains unchanged under deformation. Traditional approaches to computing it are expensive, especially for knots with hundreds or thousands of crossings.
The problem has deep theoretical roots. Over twenty years ago, approximating the Jones polynomial at specific roots of unity was shown to be complete for complexity classes such as BQP (bounded-error quantum polynomial time) and DQC1 (deterministic quantum computing with one clean qubit). In other words, quantum circuits are well suited to this task. The study team notes that the DQC1 version, based on Markov-closed braids, is attractive because it requires "less quantum" resources yet remains hard for classical algorithms.
The Quantinuum technique implements both the DQC1- and BQP-complete versions using the Fibonacci representation of braiding, a model capable of universal quantum computation, with the fifth root of unity as the evaluation point.
Fully Compiled, Hardware-Optimized Pipeline
Rather than relying on generic circuit templates, the authors use hardware-aware methods. Their implementation employs a control-free, echo-verified Hadamard test, an optimised variant that reduces shot noise and cuts down on two-qubit gates, the main source of error on most systems. The quantum circuit simulates a braid as a sequence of three-qubit gates acting on Fibonacci strings, which serve as the basis states.
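For orientation, the sketch below shows the textbook Hadamard test, which estimates Re⟨ψ|U|ψ⟩ from the ancilla measurement statistics. This is only the primitive that the paper's control-free, echo-verified variant streamlines, not the paper's circuit, and the single-qubit phase unitary is an illustrative stand-in.

```python
# Textbook Hadamard test via statevector simulation in NumPy (illustrative,
# not the paper's circuit). P(0) - P(1) on the ancilla equals Re<psi|U|psi>.
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)

def hadamard_test_real(U, psi):
    """Return Re<psi|U|psi> computed exactly from the test circuit's state."""
    n = psi.shape[0]
    state = np.kron(np.array([1.0, 0.0]), psi)        # ancilla |0> (x) |psi>
    state = np.kron(H, np.eye(n)) @ state             # Hadamard on the ancilla
    cU = np.block([[np.eye(n), np.zeros((n, n))],     # controlled-U: apply U
                   [np.zeros((n, n)), U]])            # only when ancilla is |1>
    state = cU @ state
    state = np.kron(H, np.eye(n)) @ state             # second Hadamard on ancilla
    probs = np.abs(state) ** 2
    p0, p1 = probs[:n].sum(), probs[n:].sum()         # ancilla marginals
    return p0 - p1

# Quick check with a phase gate at the fifth root of unity and the |+> state.
U = np.diag([1.0, np.exp(2j * np.pi / 5)])
psi = np.array([1.0, 1.0]) / np.sqrt(2)
print(hadamard_test_real(U, psi))                     # ~ (1 + cos(2*pi/5)) / 2 ≈ 0.65
```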
To handle coherence and phase errors, the team uses the "conjugate trick", pairing topologically related circuits so that systematic phase shifts cancel. They also exploit the structure of the Fibonacci subspace to detect errors, discarding samples that violate the expected measurement symmetries.
Together, these optimisations let the researchers scale problem instances on NISQ devices beyond what was previously thought possible. In one demonstration, they analysed a 16-qubit circuit with 340 two-qubit gates representing a knot with 104 crossings, using 4,000 shots per circuit and showing measurable gains from error mitigation.
Built-In Verification Benchmarking
A highlight of the effort was a benchmark that is easy to verify. Because the Jones polynomial is a link invariant, any two topologically equivalent braids must yield the same value. The researchers produced topologically equivalent braids of varying depths and widths and compared both the quantum and the classical output against a known reference value. This makes it possible to study how error scales with the noise model, gate depth, and circuit size.
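The error analysis this enables can be sketched simply: because equivalent braids share one exact invariant value, the spread of hardware estimates around that value can be aggregated by circuit size. The numbers below are placeholders standing in for hardware runs, not results from the paper.

```python
# Sketch of the verification idea (illustrative, not the paper's analysis code):
# topologically equivalent braids share an exact Jones polynomial value, so the
# deviation of hardware estimates from that value measures error per circuit size.
import numpy as np

def error_by_size(records):
    """records: list of (circuit_size, exact_value, estimated_value) tuples."""
    sizes = sorted({size for size, _, _ in records})
    return {
        size: float(np.mean([abs(est - exact)
                             for s, exact, est in records if s == size]))
        for size in sizes
    }

# Placeholder numbers standing in for hardware runs.
runs = [(100, 0.42, 0.44), (100, 0.42, 0.39), (340, 0.42, 0.35), (340, 0.42, 0.50)]
print(error_by_size(runs))   # mean absolute error per circuit size
```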
Less Quantum, More Benefit
The paper's intriguing title, "Less Quantum, More Advantage," signals a shift. Rather than chasing quantum advantage in its most powerful or generic forms, the team tackles a theoretically meaningful and classically hard problem that can be addressed with modest quantum resources. They argue that the DQC1 version of the Jones polynomial problem, although less expressive than the BQP version, offers a better route to advantage.
A recent Nature article on the study's "mind-blowing" relationship between knot theory and quantum physics supports this perspective. Konstantinos Meichanetzidis of Quantinuum, who worked on the new study, highlighted how knot invariants can serve both as computational targets and as built-in accuracy checks for quantum hardware: if two circuits representing the same knot return the same value, the algorithm is working.
As quantum computing moves beyond toy problems and hand-picked examples, verifiable and classically demanding benchmarks are needed. The paper argues that the Jones polynomial is unusual in combining theoretical complexity, real-world relevance, and compatibility with quantum architectures.
Instead of claiming supremacy today, the authors present a scholarly and transparent evaluation of when, how, and under what conditions a quantum algorithm may surpass traditional techniques. This is a substantial contribution that increases our knowledge of practical quantum advantage.
#technology#technews#govindhtech#news#technologynews#quantum computing#Jones Polynomial#jones#DQC1#Quantum Circuits#Less Quantum#Quantinuum#Topology
1 note
·
View note