Eth2 Crawler is an Ethereum blockchain project that extracts eth2 node information from the network and saves it to a datastore. It also exposes a GraphQL interface for accessing the stored information.
There are three main components in the project:
- Crawler: crawls the network for eth2 nodes, extracts additional information about each node, and saves it to the datastore
- MongoDB: datastore for the eth2 node information
- GraphQL Interface: provides access to the stored information
To build and run the project locally you need:
- docker
- docker-compose
Before building, please make sure the environment variable RESOLVER_API_KEY (used to fetch information about a node from its IP) is set up properly. You can get your key from the IP data dashboard. To set up the variable, create a .env file in the same folder as docker-compose.yaml.
Example .env file:

RESOLVER_API_KEY=your_ip_data_key

Eth2 crawler supports configuration through yaml files. A default yaml config is provided at cmd/config/config.dev.yaml. You can use your own config file by providing its path with the -p flag.
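A minimal sketch of how a -p config-path flag can be parsed with Go's standard flag package, defaulting to the dev config shipped in the repo. The function and flag-set names are illustrative, not the project's actual code:

```go
package main

import (
	"flag"
	"fmt"
	"os"
)

// parseConfigPath reads the -p flag from args, falling back to the
// default dev config when the flag is not supplied.
func parseConfigPath(args []string) (string, error) {
	fs := flag.NewFlagSet("crawler", flag.ContinueOnError)
	path := fs.String("p", "cmd/config/config.dev.yaml", "path to the yaml config file")
	if err := fs.Parse(args); err != nil {
		return "", err
	}
	return *path, nil
}

func main() {
	path, err := parseConfigPath(os.Args[1:])
	if err != nil {
		fmt.Println("flag error:", err)
		return
	}
	fmt.Println("loading config from:", path)
}
```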
We use docker-compose for testing locally. Once you have defined the environment variable in the .env file, you can start the server using:
make run

Available make targets:
- make run - run the crawler service
- make lint - run the linter
- make test - run the test cases
- make license - add the license header to files missing it
- make license-check - check for missing license headers
See the LICENSE file for license rights and limitations (LGPL-3.0).