Connecting Quantum Computing to Web3: A Technical Approach

No-code quantum algorithm execution with the QCentroid Platform

In this article we walk through the requirements, the challenges, and the solutions we adopted to connect the Ethereum blockchain to the QCentroid Quantum Solutions Platform.

Due to the nature of blockchain ecosystems, smart contracts cannot natively access off-chain data. Oracles such as Chainlink bridge this gap, enabling smart contracts to retrieve data from the outside world.


The QCentroid Platform is accessible through an authenticated REST API, just like almost any other platform out there.

To access our platform from the blockchain, we decided to use the Chainlink oracle platform.

Due to the nature of quantum computing, the QCentroid Platform works in an asynchronous mode, which means that the result of a computation request is not returned right away in the response. Instead, a job ID is returned, which is used later to check the job status and to obtain the result.
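To illustrate the submit-then-poll pattern this implies, here is a minimal sketch. The `getJobStatus` callback and the `state`/`result` field names are illustrative assumptions, not the actual QCentroid API:

```javascript
// Sketch of the asynchronous submit-then-poll pattern described above.
// `getJobStatus` stands in for an authenticated call to the platform's
// REST API; the `state` and `result` field names are hypothetical.
async function waitForResult(jobId, getJobStatus, { delayMs = 1000, maxAttempts = 30 } = {}) {
  for (let attempt = 0; attempt < maxAttempts; attempt++) {
    const status = await getJobStatus(jobId);
    // Return the computation result as soon as the job reaches a terminal state.
    if (status.state === "completed") return status.result;
    if (status.state === "failed") throw new Error(`Job ${jobId} failed`);
    // Otherwise wait a bit before polling again.
    await new Promise((resolve) => setTimeout(resolve, delayMs));
  }
  throw new Error(`Job ${jobId} did not finish in time`);
}
```

In the architecture below, this polling loop does not live in client code: it is the role Chainlink Keepers play on-chain.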

This asynchronous behavior adds a layer of complexity to the whole workflow from the smart contract, through the oracle, to the Quantum Platform and back.

But before we go into this workflow in detail, let's look at the architecture we deployed and all the components we need.

This diagram shows all the components involved:

  • The user’s smart contract
  • The QCentroid Quantum provider smart contract
  • The Chainlink oracle
  • A Chainlink node
  • The External Adapters
  • The QCentroid Platform
  • Chainlink Keepers

Depending on your application, you will not always need all these components. In our case, we would like to highlight four of them and explain why we decided to deploy, run, and operate them ourselves.

The QCentroid provider smart contract simplifies access to the oracle for end users. An end user's smart contract only needs to call this provider contract and implement a callback function, instead of extending the Chainlink client contract and handling LINK tokens, the oracle address, and job IDs. The provider smart contract manages all of this.

If you need authentication to access an API, you are going to need an External Adapter. One may already exist if you're accessing a well-known API such as GitHub or Twitter, but if you are trying to connect to your own API or to a less common one, you'll have to build your own adapter.
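The shape of such an adapter is roughly the following. The URL, header, and response fields here are placeholder assumptions for illustration, not QCentroid's actual interface:

```javascript
// Sketch of an External Adapter handler: it receives the oracle request,
// forwards it to the protected API with credentials, and maps the response
// back into the shape a Chainlink node expects.
// The endpoint URL and response fields are illustrative placeholders.
async function createRequest(input, fetcher) {
  const jobRunID = input.id || "1";
  const response = await fetcher("https://api.example.com/v1/jobs", {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      // Credentials live inside the adapter, never on-chain.
      Authorization: `Bearer ${process.env.API_KEY}`,
    },
    body: JSON.stringify(input.data),
  });
  // Return the platform's job ID so the oracle can report it on-chain.
  return { jobRunID, data: response, result: response.job_id, statusCode: 200 };
}
```

Taking `fetcher` as a parameter keeps the handler testable without network access; in production it would simply be an HTTP client.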

Becoming a Chainlink node operator is not a trivial task; it's a job in itself. In our case, our own Chainlink node is needed to control access to the Platform API. Since the credentials for the API are managed in the External Adapter, we need a mechanism to control access to that adapter. The way to do this on the Chainlink platform is through address whitelisting at the node level, which is why we decided to run and operate our own Chainlink node: it lets us manage the list of approved addresses.

Chainlink Keepers provide decentralized, highly reliable smart contract automation. Because the QCentroid Quantum Platform is asynchronous, we use Chainlink Keepers to poll the status of ongoing jobs and fetch the result once it is ready.

These four components may or may not be needed by your application, but now you know the role they play in the ecosystem and you can decide whether you need them or not. 

For most of these components you’ll find tons of information and tutorials on how to build and deploy them. Here are the resources that we used at QCentroid to build our ecosystem.

To start building our own External Adapter, we used the Chainlink NodeJS External Adapter Template by Thomas as a starting point. This adapter is written in NodeJS and can run locally, as a Docker service, or serverless (on AWS Lambda, for example).
The API-specific values are configured as environment variables, so they are not hardcoded and can be easily filled in by your CI/CD workflow.

Building and running your own Chainlink node for development purposes is a relatively simple task thanks to the resources shared by Chainlink, like the Chainlink SmartContract Kit. There you'll find step-by-step instructions on how to build and run a Chainlink node. Setting up the node for operational purposes is where things start to get complicated, and that would be the subject of another article.


Now that we know what all these elements are, what they are for and how to build and run them, let’s have a look at a full workflow, from the initial request from a user’s smart contract to the reception of the solution from the Platform.

This workflow shows the components described earlier, how and when they are involved in the process, and their role.

It’s important to mention that our main objective when designing this architecture was to simplify as much as possible the work needed by our smart contract end users. The diagram clearly shows how few “arrows” are needed on the left and how our users only need to focus on their business. As the workflow moves to the right, we can see how the components involved in the architecture handle the tedious tasks such as authentication or polling for the results.

For our end users, using Quantum computing from a smart contract is as easy as: “here is my data, here is your result”.

With this article, we wanted to show our specific use case of a hybrid on-chain/off-chain architecture and how we've approached the connection between these two worlds using Chainlink's breakthrough oracle technology.

Effortlessly integrate quantum hardware, execute quantum algorithms, and compare results with our advanced tools.

  • Easily upload and execute quantum algorithms on a wide range of quantum hardware.
  • Seamless integration with existing IT systems through APIs, SDKs, and smart contracts.
  • Comprehensive monitoring and analysis tools to optimize algorithm performance and cost.
  • Pay-as-you-go pricing model for flexibility and cost-effectiveness.