Introducing Juno — The Cosmos parser

At the beginning of 2019, the FissionLabs team first published Juno, a tool whose purpose was to parse any Cosmos SDK-based chain and store its data inside a PostgreSQL database that could later be accessed through a GraphQL endpoint.

Since then, we at Forbole have been following that project very closely. We have always thought that accessing chain data directly from a full node would not always be feasible, particularly when the data needs to be returned quickly and reliably.

This problem became even clearer when we started working on mobile clients for Desmos, our own Cosmos-based chain. To make sure such clients were able to perform custom queries on the chain data, we needed a middle layer between the clients and the chain itself that allowed such operations. That was when we first decided to fork the original Juno and start expanding it.

Today we are proud to announce the first stable version of our own Juno fork, which brings a number of improvements to the original codebase.

Support for Cosmos Stargate

The major change is the support for the Cosmos “Stargate” release series. As this series is expected to be among the most widely used versions of the Cosmos SDK going forward, we wanted to make sure Juno works properly with it. We have updated all the dependencies so that Juno now uses Protobuf instead of Amino, and the new gRPC client instead of the old REST APIs.
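To give an idea of what this change means in practice, here is a minimal sketch of querying a Stargate node over gRPC using the Protobuf-generated query clients that ship with the Cosmos SDK. The endpoint address and the account address are placeholders, and this is an illustration of the new query path rather than Juno's own code:

package main

import (
	"context"
	"fmt"
	"log"

	banktypes "github.com/cosmos/cosmos-sdk/x/bank/types"
	"google.golang.org/grpc"
)

func main() {
	// Connect to the node's gRPC endpoint (9090 is the default port; adjust for your node).
	conn, err := grpc.Dial("localhost:9090", grpc.WithInsecure())
	if err != nil {
		log.Fatal(err)
	}
	defer conn.Close()

	// The bank query client is generated from Protobuf definitions,
	// replacing the Amino-encoded REST responses used before Stargate.
	bankClient := banktypes.NewQueryClient(conn)
	res, err := bankClient.AllBalances(context.Background(), &banktypes.QueryAllBalancesRequest{
		Address: "cosmos1...", // placeholder address
	})
	if err != nil {
		log.Fatal(err)
	}
	fmt.Println(res.Balances)
}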

Custom module support

Another thing that we achieved with Juno is the ability to use the project as a library. This means that if you want to build your own custom parser, you can do so by simply including Juno as a dependency in your project. We have made it possible to customize each and every part of it, so that you can easily support your custom module and perform the desired operations when parsing its data, from the genesis state up to its messages.
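As a rough idea of what such a custom module could look like, here is an illustrative sketch. The interface name and method signatures below are assumptions made for the example, not Juno's actual API; the real module interfaces live in the Juno repository:

package example

import (
	sdk "github.com/cosmos/cosmos-sdk/types"
)

// MsgHandler is a hypothetical interface sketching how a pluggable module
// might hook into the parsing flow; Juno's actual interfaces may differ
// in names and signatures.
type MsgHandler interface {
	// Name returns the module identifier.
	Name() string
	// HandleMsg is called for every message found inside a parsed transaction.
	HandleMsg(index int, msg sdk.Msg, txHash string) error
}

// PostsModule is an example implementation that would store the data of a
// hypothetical custom "posts" module of an application-specific chain.
type PostsModule struct {
	// Your own database layer would go here.
}

func (m *PostsModule) Name() string { return "posts" }

func (m *PostsModule) HandleMsg(index int, msg sdk.Msg, txHash string) error {
	// Inspect the concrete message type and persist the relevant data, e.g.:
	// switch msg := msg.(type) { case *poststypes.MsgCreatePost: ... }
	return nil
}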

You can see a couple of examples of using Juno as a library to build your own custom parser inside our DJuno and BDJuno repositories.

Pruning

Since Juno is meant to run continuously, the data it stores can become quite large very quickly. For this reason, we have also added the ability to enable pruning of such data. When enabled, this feature periodically cleans the data stored inside your database to make sure it does not grow out of control.
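Conceptually, height-based pruning boils down to periodically deleting rows older than the most recent blocks you want to keep. The sketch below shows that idea against a PostgreSQL database; the table and column names are assumptions for illustration and not necessarily Juno's actual schema:

package pruning

import (
	"database/sql"
	"fmt"
)

// PruneBefore deletes transaction and block rows below the given height,
// keeping only the most recent data. Table and column names are illustrative;
// adapt them to your own schema.
func PruneBefore(db *sql.DB, keepFromHeight int64) error {
	for _, table := range []string{"transaction", "block"} {
		query := fmt.Sprintf("DELETE FROM %s WHERE height < $1", table)
		if _, err := db.Exec(query, keepFromHeight); err != nil {
			return fmt.Errorf("error while pruning %s: %w", table, err)
		}
	}
	return nil
}

In Juno, a routine like this would be triggered at regular intervals based on how many recent blocks you choose to keep.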

Performance improvements

Finally, we have done our best to improve the overall performance of the original Juno code. Whenever possible, the code now runs on multiple goroutines so that it can parse data faster and more memory-efficiently. This has made it possible for us to create a parser that can also handle networks with high transaction loads.
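The pattern behind this is the classic Go worker pool: block heights are pushed onto a channel and a fixed number of goroutines consume them concurrently. The snippet below is a simplified illustration of that pattern, not Juno's actual code:

package main

import (
	"fmt"
	"sync"
)

// parseBlock stands in for the work done per block (fetching it from the
// node, decoding its transactions, and writing them to the database).
func parseBlock(height int64) {
	fmt.Printf("parsed block %d\n", height)
}

func main() {
	const workers = 10
	heights := make(chan int64, workers)

	var wg sync.WaitGroup
	for i := 0; i < workers; i++ {
		wg.Add(1)
		go func() {
			defer wg.Done()
			// Each worker pulls heights from the shared channel until it is
			// closed, so blocks are parsed concurrently instead of one at a time.
			for h := range heights {
				parseBlock(h)
			}
		}()
	}

	// Enqueue a range of heights to be parsed.
	for h := int64(1); h <= 100; h++ {
		heights <- h
	}
	close(heights)
	wg.Wait()
}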

Conclusions

We are very happy with what we have achieved with Juno. However, we will not stop here. We will keep improving the code to make sure it is stable enough to work with any kind of network and supports new Cosmos features as soon as they become available.

We would like to thank the ICF for the grant that allowed us to carry on the development of this project and bring it to this point. We really hope to see this project being used to create a series of services that make it easier to query on-chain data without having to rely solely on full nodes.
