
“Deep learning is going to be able to do everything. It’s going to be better than any human at any intellectual task.”

Geoffrey Hinton, interview with MIT Technology Review


In an era of exploding data volumes and ever-shorter business decision cycles, traditional methods of analyzing information are reaching their limits. Quantum Machine Learning (QML) is a promising technology that could change how organizations process and analyze data in the future. By combining the principles of quantum mechanics with machine learning algorithms, QML offers a theoretical basis for breakthroughs in computing speed, but it is important to remember that the technology is still in its early stages and practical business applications may be 5-10 years away.

This article provides a realistic look at the potential of Quantum Machine Learning in the context of upcoming next-generation networks (5G/6G) and its possible impact on enterprise competitiveness. For organizations planning long-term technology strategies, understanding both the opportunities and limitations of this technology is key to making informed IT investment decisions.

What is Quantum Machine Learning and how does it work?

Quantum Machine Learning is a field of research that combines quantum mechanics with machine learning algorithms. Unlike classical computers that operate on bits (0 or 1), quantum computers use qubits, which can exist in a superposition state - representing both 0 and 1 at the same time. This fundamental difference creates a theoretical basis for parallel data processing, which for specific problems could lead to significant acceleration of computation.
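
The superposition idea can be sketched in a few lines of linear algebra. This is a minimal classical simulation of a single qubit (using numpy), intended only to illustrate the concept, not to reflect how real quantum hardware behaves:

```python
import numpy as np

# A single qubit simulated classically as a 2-component vector of
# complex amplitudes: |0> = [1, 0], |1> = [0, 1].
ket0 = np.array([1.0, 0.0], dtype=complex)

# The Hadamard gate puts |0> into an equal superposition of |0> and |1>.
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)

state = H @ ket0

# Measurement probabilities are the squared magnitudes of the amplitudes.
probs = np.abs(state) ** 2
print(probs)  # -> [0.5 0.5]: equal chance of measuring 0 or 1
```

Until measured, the qubit carries both amplitudes at once; it is this property, scaled to many entangled qubits, that underlies the theoretical parallelism described above.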

However, it should be made clear that the current state of quantum technology is far from commercial maturity. State-of-the-art quantum computers today have only 100-200 qubits, are highly unstable and require extreme operating conditions (temperatures near absolute zero). QML algorithms, such as quantum SVMs and quantum neural networks, are currently mainly theoretical concepts whose full implementation on real quantum systems remains a challenge for years to come.

A key technical challenge is so-called quantum decoherence: the loss of quantum states through interaction with the environment. As a result, even state-of-the-art quantum computers can maintain stable quantum states for only milliseconds, drastically limiting the complexity of the calculations they can perform. The technology is currently primarily the domain of academic research and of experimental projects at major technology companies and government laboratories.

Key facts about the current state of QML

  • Most QML algorithms remain in the theoretical or laboratory stage

  • Current quantum computers have 100-200 qubits (IBM, Google) versus the thousands needed for practical applications

  • The coherence time is milliseconds, which limits possible calculations

  • Access to real quantum computers possible mainly through the cloud (IBM Quantum, AWS Braket)

  • Cost of building and maintaining a quantum system: $10-15 million per year

Why is Quantum Machine Learning revolutionizing data analysis?

The potential revolutionary nature of Quantum Machine Learning lies in its theoretical ability to solve problems previously considered computationally intractable. However, it is important to distinguish between theoretical possibilities and current technological realities. While quantum algorithms such as Shor's and Grover's come with mathematically proven speedups (exponential and quadratic, respectively) for specific problems, their practical implementation faces fundamental hardware difficulties.

It is also worth noting that not all data analysis problems will benefit from quantum computers. The gains will appear mainly in specific classes of problems, such as combinatorial optimization, quantum simulation, or factoring large numbers. Many standard machine learning tasks, such as simple classification or regression, may show no significant computational advantage on quantum computers compared with advanced classical accelerators (GPU, TPU, FPGA).

A realistic assessment of the business impact of QML must also include an analysis of cost and return on investment. Current estimates indicate that building and maintaining a dedicated quantum computer costs in the range of $10-15 million per year, not including software and specialized personnel. For most organizations, a more cost-effective approach over the next 5-7 years will be to use cloud services (Quantum as a Service), and a real return on such investments may not appear until a 10+ year horizon, for highly specialized applications.

What are the key differences between Quantum Machine Learning and traditional machine learning?

The most significant difference between Quantum Machine Learning and traditional machine learning lies in the fundamental mechanisms of information processing. While in theory QML offers exponential increases in computing power for specific problems, in practice there are a number of alternative classical technologies that already achieve impressive results without the need for quantum computers.

It is worth noting that we are currently seeing rapid development of specialized ASICs, neuromorphic processors, and AI accelerators (such as NVIDIA H100, Google TPU v4, and Intel Gaudi), which offer tremendous acceleration for machine learning algorithms. Likewise, techniques such as federated learning, edge computing, and neuromorphic computing allow efficient analysis of huge data sets with technology that is available today. Over the next 3-5 years, these classical approaches may prove much more cost-effective than experimental quantum solutions.

Failures and delays in quantum projects are also an important factor inhibiting QML adoption. IBM has revised its timetables for achieving "quantum supremacy" for practical applications several times. Similarly, Google, despite its high-profile announcement of "quantum supremacy" in 2019, has not turned this lab success into commercial applications. Many quantum startups (such as Rigetti Computing) have experienced significant valuation declines after failing to deliver the promised breakthroughs. These experiences indicate that the road to practical QML implementations may be longer and bumpier than enthusiastic predictions suggest.

Comparison: Traditional ML vs Quantum ML

Traditional ML (currently available):

  • Mature technology with a broad ecosystem of tools

  • Specialized accelerators (GPU/TPU) offering 10-100x acceleration

  • Proven effectiveness in real-world business applications

  • Predictable ROI over a 1-3 year horizon

Quantum ML (5-10 year outlook):

  • Early stage of development, mainly basic research

  • Theoretical acceleration for specific problems

  • Requires fundamental hardware breakthroughs

  • Uncertain ROI, high investment risk

How do quantum algorithms accelerate the processing of large data sets?

The theoretical acceleration offered by quantum algorithms is impressive, but requires critical analysis. Although mathematical models indicate the possibility of exponential acceleration for some problems, realizing these benefits faces significant practical obstacles. In real-world implementations, the total computation time must take into account not only the execution of the quantum algorithm itself, but also the preparation of the data, the transfer to the quantum system, and the conversion of the results back to the classical format.

An example of the discrepancy between theory and practice is Grover's famous algorithm, which theoretically offers a quadratic speedup in searching unsorted databases. Although in theory the algorithm can find an element in a database of one million records in about 1,000 steps (instead of the 500,000 expected classically), current implementations are limited to a few dozen elements due to quantum noise and decoherence. Similarly, the HHL (Harrow-Hassidim-Lloyd) algorithm for solving systems of linear equations offers a theoretical exponential speedup, but requires perfect qubits and perfect quantum control, conditions unattainable in current systems.
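
The step counts quoted for Grover's algorithm follow directly from its complexity class. A back-of-the-envelope sketch (the standard iteration count is roughly (π/4)·√N oracle calls; the ~1,000 figure above is the same order of magnitude):

```python
import math

def classical_expected_lookups(n):
    # Unstructured search: on average, half the entries must be checked.
    return n / 2

def grover_iterations(n):
    # Grover's algorithm needs on the order of (pi/4) * sqrt(N) oracle calls.
    return math.ceil(math.pi / 4 * math.sqrt(n))

n = 1_000_000
print(classical_expected_lookups(n))  # -> 500000.0
print(grover_iterations(n))           # -> 786 oracle calls
```

The quadratic gap is real on paper; the practical obstacle, as noted above, is running even a few hundred coherent quantum operations on today's noisy hardware.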

It is also important to point out that there are alternative classical approaches that can compete with quantum algorithms for many business applications. Heuristic methods, approximation algorithms and random sampling techniques often offer good enough solutions to optimization problems at a fraction of the cost. Distributed computing and task-specific architectures (such as neuromorphic computing) can provide significant acceleration for many data analysis applications without waiting for mature quantum technologies.

How will the synergy of Quantum Machine Learning with 5G/6G networks affect business?

Combining Quantum Machine Learning technology with next-generation 5G and future 6G networks offers theoretically powerful capabilities, but requires a realistic assessment of deployment schedules. 5G networks are already being deployed globally, reaching speeds of up to 20 Gbps and latency of less than 1 ms. However, full commercialization of practical QML applications may not occur until the 6G network time horizon (2028-2030), which implies the need for long-term strategic planning.

Cost and return on investment (ROI) analysis is key when evaluating potential deployments. Building infrastructure to connect QML to 5G/6G networks requires significant capital expenditures, estimated in the tens of millions of dollars for a single enterprise. For example, early pilot projects using QML at financial institutions are generating costs in the range of $5-10 million per year, with operating profits still uncertain. Given these costs, a realistic return on investment may not occur until 5-7 years after the project begins, requiring a long-term financial commitment from the organization.

It is worth noting that the implementation of QML solutions carries technological and business risks. Previous quantum projects have been characterized by significant delays and schedule revisions. For example, IBM’s Quantum program planned to reach 1,000 qubits by 2023, but the deadline was pushed back to 2025 and then revised in favor of a smaller number of higher-quality qubits. These experiences indicate that the business must be prepared to flexibly adjust its QML strategy in response to actual technological advances, rather than relying solely on optimistic forecasts from technology providers.

A realistic timeline of QML and 5G/6G synergies

  • Current (2025): Mainly basic research, simulation and pilot projects

  • Short term (2026-2028): Hybrid solutions combining classical HPC with quantum elements

  • Medium term (2029-2031): First practical applications of QML in highly specialized domains

  • Long term (2032+): Potential wider adoption assuming technological breakthroughs

  • Estimated cost of implementation: $5-20 million (depending on scale)

  • Expected ROI: 5-7 years for pioneering implementations

What specific business problems can Quantum Machine Learning solve in the era of ultrafast networks?

Quantum Machine Learning offers potential solutions to highly complex business problems, but it is necessary to be realistic about the implementation possibilities. In logistics and supply chain management, algorithms such as QAOA (Quantum Approximate Optimization Algorithm) can theoretically improve route and resource optimization. However, it should be noted that currently available specialized classical systems, such as Gurobi or CPLEX, already offer advanced optimization capabilities that meet current business needs. For example, Maersk has invested in solutions based on classical optimization algorithms, achieving a 15% reduction in logistics costs without the need for quantum technologies.

In assessing the application potential of QML, it is also worth considering the failures of pilot projects. In the financial sector, despite numerous announcements of breakthroughs in risk modeling, practical results remain limited. Goldman Sachs, despite several years of investment in research into quantum algorithms for pricing derivatives, has not implemented these solutions in daily operations due to the immaturity of the technology. Similarly, Volkswagen launched an experiment in 2019 to optimize city traffic in Lisbon using the D-Wave quantum computer, but the solution has not moved from the pilot phase to regular deployment.

A critical analysis of historical QML projects indicates that the most promising near-term applications may lie in the field of quantum simulations for the pharmaceutical and materials industries. Companies such as Merck and BASF are already using quantum algorithms to simulate molecular structures, which could translate into accelerated discovery of new chemical compounds. Even in these cases, however, the technology remains in the experimental phase, and realistic estimates suggest that the first commercial drugs discovered with significant input from quantum computers may not appear until 2030-2035.

Why will integrating Quantum Machine Learning with 6G enable real-time data analysis?

The theoretical synergy between Quantum Machine Learning and 6G networks presents a vision of breakthrough analytical capabilities, but it needs to be set in a realistic time and technology framework. 6G networks, with a projected bandwidth of 1 Tbps and latency of less than 0.1 ms, are still in the early research phase, with anticipated commercialization no sooner than 2028-2030. At the same time, the maturity of quantum technologies allowing practical business applications of QML may occur in a similar time horizon.

This temporal convergence in the development of the two technologies creates investment dilemmas for organizations. On the one hand, early experimentation can provide a competitive advantage. On the other hand, investing too early in immature technologies carries a high financial risk. An analysis of failed quantum projects shows that pioneers often encounter technical difficulties beyond initial estimates. An example is the startup Rigetti Computing, which modified its quantum technology development plan several times, leading to a significant loss of market value and disappointed investors.

The ethical and geopolitical aspects of integrating QML with 6G networks are also important. These technologies could exacerbate existing digital inequalities between organizations and countries. Access to advanced quantum systems is already concentrated in the hands of the largest technology corporations and a few of the wealthiest countries, raising questions about equality of opportunity in the global digital economy. Additionally, quantum technologies have become part of geopolitical rivalries, as evidenced by national programs such as the Chinese Quantum Initiatives ($16 billion), the EU Quantum Flagship (€1 billion) and the US National Quantum Initiative ($1.3 billion). This rivalry can lead to fragmentation of standards and restrictions on international technology transfer.

What technological challenges are holding back the commercialization of Quantum Machine Learning today?

There are a number of fundamental technological challenges facing the commercialization of Quantum Machine Learning, beyond the oft-cited problems of decoherence and a limited number of qubits. A serious but less frequently discussed challenge is the issue of quantum errors and their correction. Even the most advanced quantum systems have high error rates (about 0.5-1% per quantum operation), which drastically limits the complexity of feasible algorithms. Theoretical solutions in the form of quantum error correction codes require a significant overhead - up to 1,000 physical qubits may be needed for one logical error-tolerant qubit, pushing back the prospect of practical QML applications by years.
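
The impact of a 0.5-1% per-operation error rate is easy to quantify under a simple independence assumption (real error processes are more complicated, so this is only an illustration):

```python
# With a per-operation error rate p, and assuming errors occur
# independently, the probability that a circuit of n operations
# runs error-free decays as (1 - p) ** n.
def success_probability(p, n_ops):
    return (1 - p) ** n_ops

for n_ops in (10, 100, 1000):
    print(n_ops, round(success_probability(0.01, n_ops), 4))
```

At a 1% error rate, even a 100-operation circuit finishes error-free only about a third of the time, and a 1,000-operation circuit essentially never does, which is why the error-correction overhead described above is unavoidable.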

The history of quantum technology commercialization is replete with examples of failure and over-optimism. D-Wave Systems, the company that pioneered quantum computers based on quantum annealing, has for years promised breakthroughs in optimization, but independent studies have shown that its systems do not offer clear advantages over classical algorithms on real business problems. IBM has repeatedly revised its schedules for qubit counts and for achieving practical quantum advantage. These examples indicate that companies should treat quantum technology providers' optimistic predictions with caution.

The availability of qualified personnel is also a challenge. Specialists combining expertise in quantum physics and machine learning are extremely rare in the labor market. According to a 2023 McKinsey report, there are fewer than 5,000 people globally with the skills needed to run advanced QML projects, which translates into high personnel costs (the average annual salary for a QML specialist in the US is $250-350k). This skills gap is a significant barrier for organizations considering QML and must be factored into any realistic assessment of the technology's potential.

A realistic assessment of QML’s technological barriers

  • Quantum errors: error rate of 0.5-1% per operation, requiring advanced correction

  • Scaling of logical qubits: ~1,000 physical qubits needed per logical qubit

  • Engineering problems: Extreme environmental requirements (temperature, insulation)

  • Talent gap: fewer than 5,000 professionals worldwide combining quantum and ML expertise

  • Unfulfilled promises: Multiple revisions of schedules by industry leaders

  • Development costs: An average of $5-10 million per year for advanced research projects

How will Quantum Machine Learning affect cybersecurity in 5G infrastructure?

Quantum Machine Learning’s impact on the cybersecurity of 5G infrastructure has both defensive and offensive aspects, but a realistic view of the time scale of these developments is needed. Quantum algorithms theoretically offer new capabilities for detecting anomalies and advanced threats, although current quantum systems are too limited to realize these applications in practice. A more immediate and urgent threat is the potential of Shor's algorithm to crack modern cryptographic systems, although experts estimate that a practical implementation capable of breaking RSA-2048 would require a quantum computer with at least 4,000-8,000 stable logical qubits, which may not happen before 2030-2035.
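
Combining two figures quoted in this article (4,000-8,000 stable logical qubits to break RSA-2048, and roughly 1,000 physical qubits per error-corrected logical qubit) gives a rough sense of the hardware gap. This is illustrative arithmetic, not a rigorous resource estimate:

```python
# Illustrative arithmetic only, using figures quoted in this article:
# an assumed ~1,000 physical qubits per error-corrected logical qubit.
PHYSICAL_PER_LOGICAL = 1_000

for logical in (4_000, 8_000):
    physical = logical * PHYSICAL_PER_LOGICAL
    print(f"{logical:,} logical qubits -> ~{physical:,} physical qubits")
```

Millions of physical qubits would be required, against the 100-200 available on today's devices, which is why the 2030-2035 horizon cited above is plausible.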

The prospect of quantum cipher cracking has already triggered significant changes in security approaches. NIST (National Institute of Standards and Technology) in 2022 began the process of standardizing post-quantum algorithms that will be resistant to attacks using quantum computers. This standardization process will take a few more years, and then implementation of the new standards will take another few years. This creates a potential window of vulnerability - infrastructure secured by today’s methods could theoretically be at risk in the future, when quantum computers reach adequate computing power.

Geopolitical analysis indicates that quantum technologies are becoming part of the rivalry between superpowers, which may affect cyber security issues. China, the US and the EU are investing billions of dollars in the development of quantum technologies, motivated in part by potential national security applications. This rivalry could lead to fragmentation of standards and regulatory approaches, further complicating global efforts to secure digital infrastructure. Organizations must factor these geopolitical tensions into their long-term security strategies, especially if they operate in international markets.

Which industries will be the fastest to implement Quantum Machine Learning solutions using next-generation networks?

Assessing QML’s deployment potential in specific industries requires a realistic analysis of both benefits and costs. The financial sector is often identified as a leader in potential implementations due to its high technology budgets and clearly defined use cases, such as portfolio optimization. However, the results of pilot projects to date have been mixed. JP Morgan Chase has been investing in quantum research since 2017, but despite hiring leading experts and partnering with IBM, the company has yet to deploy any quantum algorithm into daily business operations. Similarly, Goldman Sachs, despite publishing promising research results on quantum portfolio optimization, still bases its actual investment decisions on classical algorithms.

The pharmaceutical industry represents a potentially more promising sector for early applications, but there, too, the timetable for deployments is more distant than enthusiastic predictions suggest. Merck has been working with D-Wave since 2015 on the use of quantum computers for molecular simulations, but so far no breakthrough drug discovery made with significant input from the technology has been announced. According to realistic industry estimates, the first drugs whose discovery is significantly aided by QML may not reach clinical trials before 2028-2032.

It is also important to consider the ethical implications of the rapid deployment of QML across industries. These technologies may exacerbate existing inequalities between organizations with the capital to invest in quantum innovation and smaller players. In the healthcare sector, unequal access to advanced diagnostic capabilities backed by QML can lead to widening health inequalities. These ethical considerations should be taken into account when planning regulations and public policies that support equitable access to the benefits of new technologies.

Realistic assessment of QML deployments by industry

  • Pharmaceuticals: molecular simulations (2028-2032) - high cost, long payback period

  • Finance: portfolio optimization and risk management (2027-2030) - integration challenges with existing systems

  • Logistics: supply chain optimization (2029-2032) - competition from advanced classical solutions

  • Energy: grid optimization (2030+) - high regulatory and safety requirements

  • Telecommunications: Network quality analysis (2026-2029) - pioneering hybrid applications

  • Automotive: Materials simulation (2028+) - long implementation cycles of new technologies

How to prepare a company’s IT infrastructure for the deployment of quantum technologies in the context of 5G/6G?

Preparing IT infrastructure for the quantum-5G/6G era requires a balanced approach that considers both potential benefits and realistic costs. Instead of a complete infrastructure transformation, a phased approach with clear steps and decision points is more practical. The first step should be a cryptographic vulnerability audit: identifying systems that rely on algorithms vulnerable to quantum attack (RSA, ECC) and developing a plan to migrate to post-quantum cryptography according to NIST guidelines. This action has a real business case today, due to the “harvest now, decrypt later” threat, where attackers collect encrypted data with the intention of decrypting it in the future.
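
The audit step can start as something very simple: an inventory of systems and the public-key algorithms they use, flagged against the NIST post-quantum selections. A minimal sketch (the system names and inventory format are hypothetical, chosen only for illustration):

```python
# Hypothetical inventory; system names are made up for illustration.
QUANTUM_VULNERABLE = {"RSA", "ECDSA", "ECDH", "DH"}  # breakable by Shor's algorithm
POST_QUANTUM = {"CRYSTALS-Kyber", "CRYSTALS-Dilithium", "SPHINCS+"}  # NIST selections

inventory = [
    {"system": "vpn-gateway",  "algorithm": "RSA"},
    {"system": "internal-api", "algorithm": "ECDSA"},
    {"system": "doc-signing",  "algorithm": "CRYSTALS-Dilithium"},
]

def audit(entries):
    # Return the systems that need a post-quantum migration plan.
    return [e["system"] for e in entries if e["algorithm"] in QUANTUM_VULNERABLE]

print(audit(inventory))  # -> ['vpn-gateway', 'internal-api']
```

In a real audit the inventory would be generated by scanning certificates, TLS configurations, and key stores, but the output is the same: a prioritized migration list.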

Cost-benefit analysis (ROI) is crucial when planning investments in quantum technologies. For example, building a dedicated quantum technology team (3-5 specialists) is a cost of $1-2 million per year, not including access to infrastructure. A more economical solution for most organizations will be a hybrid model - maintaining a small research team (1-2 people) working with external quantum technology providers and consultants. An example of an effective approach is the model adopted by Airbus, which has established a small in-house quantum team while establishing strategic partnerships with IBM Quantum and universities to explore opportunities without incurring the full cost of technology development.

A practical approach is also to create a “quantum-ready” architecture instead of making immediate, expensive investments. This means designing IT systems with future integration with quantum technologies in mind, but without prematurely deploying immature solutions. For example, implementing APIs and abstraction layers that can communicate with quantum processors in the future, while using classical algorithms as a base solution. Such an architecture makes it possible to gradually incorporate quantum components as they mature, without having to completely overhaul the systems.
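
A "quantum-ready" abstraction layer of the kind described above can be as simple as an interface that, for now, is always served by a classical solver. A minimal sketch (the names and the trivial placeholder solver are illustrative assumptions, not a real SDK):

```python
from abc import ABC, abstractmethod

class OptimizerBackend(ABC):
    """Abstraction layer: business code depends on this interface,
    never on a specific classical or quantum solver."""
    @abstractmethod
    def solve(self, problem: dict) -> dict: ...

class ClassicalBackend(OptimizerBackend):
    # Today's default; stands in for a real heuristic or exact solver.
    def solve(self, problem: dict) -> dict:
        return {"solution": sorted(problem["items"]), "backend": "classical"}

def get_backend(name: str = "classical") -> OptimizerBackend:
    # A QuantumBackend could be registered here later, as hardware
    # and SDKs mature, without touching any calling code.
    backends = {"classical": ClassicalBackend}
    return backends[name]()

result = get_backend().solve({"items": [3, 1, 2]})
print(result)  # -> {'solution': [1, 2, 3], 'backend': 'classical'}
```

The design choice is the point: calling code never imports a solver directly, so swapping in a quantum component later is a configuration change, not a system overhaul.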

What investments in team competencies are key to leveraging Quantum Machine Learning?

Building competencies in the area of Quantum Machine Learning requires a strategic approach, taking into account both current labor market constraints and the long-term prospects for the technology. It is worth noting that the global competency gap in this area is significant - according to the Quantum Economic Development Consortium’s 2023 report, there are fewer than 5,000 specialists combining advanced knowledge in quantum mechanics and machine learning available on the global labor market. This limited supply of talent translates into high recruiting costs (salary packages of $250-400k per year for experienced professionals) and difficulties in building teams.

Instead of trying to build extensive quantum teams from scratch, a more realistic and cost-effective approach is a hybrid model. In practice, it works well to create a small, in-house team (1-3 specialists) responsible for exploring quantum technologies and coordinating collaboration with external partners. This team should be supplemented by existing data science experts who will receive additional training in the basics of quantum computing. JP Morgan Chase has taken just such an approach, maintaining a small, dedicated quantum team working with a broader group of data scientists and researchers, allowing the exploration of QML capabilities without the full cost of an expanded team of specialists.

It is also worth noting that there have been a number of failed initiatives in building quantum competence. Some organizations have invested significant resources in expansive teams, which then could not implement practical projects due to the immaturity of the technology. An example is one European investment bank, which created a 15-person quantum computing team in 2019 and then reduced it to three people in 2022 after a series of projects failed to deliver the expected business results. These experiences suggest that a more balanced approach, with clearly defined milestones and decision points, can be more effective.

A practical strategy for developing quantum competence

  • Start with a small team: 1-3 specialists instead of large personnel investments

  • Invest in training: Build quantum competencies among existing data scientists

  • External partners: Work with technology providers and consultants

  • Establish realistic milestones: Define clear decision points for further investment

  • Build management awareness: Educate decision makers about real opportunities and constraints

  • Monitor the progress of technology: Regularly review market developments and adjust your strategy

  • Expected cost: $300-500 thousand per year (small initiatives) to $1-2 million (advanced projects)

What regulations may shape the development of Quantum Machine Learning in the context of 6G networks?

Regulations for quantum technologies and 6G networks are currently in their formative stages, creating a complex regulatory environment for organizations planning long-term strategies. Internationally, we are seeing a progressive fragmentation of regulatory approaches. The United States, through the CHIPS and Science Act of 2022 and the National Quantum Initiative, is taking an approach that combines significant support for research with export restrictions. The European Union, through its Quantum Flagship program, is placing greater emphasis on ethical and social aspects, integrating quantum technologies into existing regulatory frameworks such as the GDPR. China has taken the most centralized approach, treating quantum technologies as a strategic national priority under direct state oversight.

This geopolitical fragmentation creates significant challenges for organizations operating globally. Multinational corporations may encounter conflicting legal requirements in different jurisdictions, particularly in areas such as data protection, security standards, or export restrictions. Quantum technologies, for example, are increasingly classified as “dual-use” technologies subject to export restrictions. In 2023, the U.S. extended restrictions on the export of advanced quantum technologies to China, affecting international research and corporate projects.

A particularly important regulatory area is data protection in the context of QML’s potential capabilities. Quantum algorithms could theoretically break through some data anonymization methods, raising questions about the adequacy of current regulations, such as the GDPR in Europe and the CCPA in California. In response to these challenges, the European Data Protection Board (EDPB) began developing new guidelines for the use of quantum technologies in personal data processing in 2023. Organizations should follow these regulatory developments and participate in public consultations to ensure that future regulations both protect citizens’ rights and enable responsible innovation.

How are technology companies such as ARDURA supporting the adaptation of Quantum Machine Learning in business?

The role of technology companies in adopting Quantum Machine Learning requires a balanced approach that takes into account both the technology’s potential and its current limitations. ARDURA Consulting and similar consulting firms are moving away from promoting QML as an immediate solution, focusing instead on responsible consulting and building long-term transformation strategies. A key element of this approach is an honest assessment of the Technology Readiness Level (TRL) of various aspects of QML and transparent communication of both opportunities and limitations.

Analysis of quantum project failures provides valuable lessons for consulting firms. Many early quantum initiatives failed due to overly optimistic assumptions about the maturity of the technology or insufficient integration with existing business processes. For example, a pilot project to use QML for supply chain optimization at one European manufacturing company was suspended after two years, when it turned out that classical heuristic algorithms provided comparable results at significantly lower implementation costs. Responsible consulting firms, such as ARDURA Consulting, are taking this experience into account by proposing a phased approach with clear checkpoints and success criteria.

The ethical aspects of QML consulting are also worth emphasizing. Technology companies have a responsibility to promote an honest and responsible approach to new technologies. This includes transparently communicating potential investment risks, the uncertainty of commercialization timelines, and the geopolitical and social implications of quantum technologies. ARDURA and similar consulting firms should also consider issues of accessibility and equal opportunity - that is, how to ensure that the benefits of quantum technologies are not limited to only the largest and wealthiest organizations, but contribute to broader socio-economic development.