Keyword Management System (KMS) is an application for maintaining keywords (science keywords, platforms, instruments, data centers, locations, projects, services, resolution, etc.) in the Earthdata/IDN system.
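Keywords are served over an HTTP API. As a rough illustration only (the host and paths below are assumptions based on the publicly hosted GCMD KMS service and may differ for your deployment), schemes and concepts can be queried like this:

```
# List the available keyword schemes (host and paths are assumptions based on
# the public GCMD KMS service; adjust for your deployment).
curl "https://gcmd.earthdata.nasa.gov/kms/concept_schemes"

# Fetch all concepts in one scheme as CSV.
curl "https://gcmd.earthdata.nasa.gov/kms/concepts/concept_scheme/sciencekeywords?format=csv"
```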
To install the necessary dependencies, run:

```
npm install
```
Prerequisites:
- Docker
- aws-sam-cli (`brew install aws-sam-cli`)
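As a quick sanity check (not part of the project's scripts), you can confirm both tools are installed and on your PATH:

```
docker --version
sam --version
```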
To start the local server (including the rdf4j database server, cdk synth, and sam), run:

```
npm run start-local
```
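Once the local stack is up, you can check that the API responds. The port below assumes SAM's default of 3000, and the path is only a placeholder; substitute a route the KMS API actually exposes:

```
# sam local start-api listens on port 3000 by default; the path is a placeholder.
curl -i "http://localhost:3000/concept_schemes"
```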
To run the test suite, run:

```
npm run test
```
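To run a single test file, arguments after `--` are passed through npm to the underlying test runner (this assumes the `test` script wraps a runner such as Jest or Vitest that accepts a file path; the path below is a hypothetical example):

```
# The file path is a hypothetical example; point this at a real test file.
npm run test -- path/to/some.test.js
```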
In order to run KMS locally, you first need to set up an RDF database:

```
export RDF4J_USER_NAME=[user name]
export RDF4J_PASSWORD=[password]
npm run rdf4j:build
npm run create-network
npm run rdf4j:start
npm run rdf4j:setup
npm run rdf4j:stop
```
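While the container is running, you can confirm the RDF4J server is reachable. The port and context path below assume the stock RDF4J server defaults; adjust them if this project maps the container differently:

```
# Returns the RDF4J REST protocol version if the server is up.
# Port 8080 and the /rdf4j-server path are assumptions based on RDF4J defaults.
curl -u "$RDF4J_USER_NAME:$RDF4J_PASSWORD" "http://localhost:8080/rdf4j-server/protocol"
```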
To deploy the rdf4j Docker image to ECR, set the following environment variables and run the deploy script:

```
export AWS_ACCESS_KEY_ID=${your access key id}
export AWS_SECRET_ACCESS_KEY=${your secret access key}
export AWS_SESSION_TOKEN=${your session token}
export VPC_ID={your vpc id}
export RDF4J_USER_NAME=[your rdfdb user name]
export RDF4J_PASSWORD=[your rdfdb password]
export RDF4J_CONTAINER_MEMORY_LIMIT=[7168 for sit|uat, 14336 for prod]
cd cdk/rdfdb/bin
./deploy_to_ecr.sh
```
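Before running the script, it can help to confirm the exported credentials resolve to the intended AWS account (a standard AWS CLI call, not part of the project's scripts):

```
# Verify the exported credentials are valid and point at the expected account.
aws sts get-caller-identity
```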
To deploy the rdf4j and KMS stacks individually, set the following environment variables and deploy each stack with CDK:

```
export RDF4J_USER_NAME=[your rdfdb user name]
export RDF4J_PASSWORD=[your rdfdb password]
export RDF4J_CONTAINER_MEMORY_LIMIT=[7168 for sit|uat, 14336 for prod]
export RDF4J_INSTANCE_TYPE=["M5.LARGE" for sit|uat, "R5.LARGE" for prod]
cd cdk
cdk deploy rdf4jIamStack
cdk deploy rdf4jEbsStack
cdk deploy rdf4jLbStack
cdk deploy rdf4jEcsStack
cdk deploy rdf4jSnapshotStack
cdk deploy KmsStack
```
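Before running any of the deploys above, you can list the stacks the CDK app defines and preview what a deploy would change (standard CDK commands, run from the `cdk` directory):

```
# List all stacks defined by the CDK app.
cdk list

# Show the changes a deploy would make, without applying them.
cdk diff
```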
Alternatively, you can deploy all stacks at once:

```
cd cdk
cdk deploy --all
```
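After deploying (individually or with `--all`), you can confirm a stack reached a healthy state with the AWS CLI. The stack name below is one of the stacks listed above; the deployed CloudFormation name may carry a prefix or suffix in your environment:

```
# Check the CloudFormation status of a deployed stack.
aws cloudformation describe-stacks \
  --stack-name rdf4jEcsStack \
  --query 'Stacks[0].StackStatus'
```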
One thing to note: if you destroy the rdf4jEbsStack and redeploy it, a new EBS file system is created. You will need to copy the data from the old EBS file system to the new one, which can be done by mounting both file systems on an EC2 instance and copying the data across.
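A rough sketch of that copy, assuming both volumes are attached to the same EC2 instance; the device names, mount points, and data paths below are placeholders to adapt to your instance:

```
# Mount the old and new volumes (device names and mount points are placeholders).
sudo mkdir -p /mnt/old /mnt/new
sudo mount /dev/sdf /mnt/old
sudo mount /dev/sdg /mnt/new

# Copy the RDF4J data across, preserving permissions, ownership, and attributes.
sudo rsync -aHAX /mnt/old/ /mnt/new/

sudo umount /mnt/old /mnt/new
```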
To deploy via the Bamboo deployment script, set the following environment variables and run:

```
export bamboo_STAGE_NAME=[sit|uat|prod]
export bamboo_AWS_ACCESS_KEY_ID=${AWS_ACCESS_KEY_ID}
export bamboo_AWS_SECRET_ACCESS_KEY=${AWS_SECRET_ACCESS_KEY}
export bamboo_AWS_SESSION_TOKEN=${AWS_SESSION_TOKEN}
export bamboo_LAMBDA_TIMEOUT=30
export bamboo_SUBNET_ID_A={subnet #1}
export bamboo_SUBNET_ID_B={subnet #2}
export bamboo_SUBNET_ID_C={subnet #3}
export bamboo_VPC_ID={your vpc id}
export bamboo_RDF4J_USER_NAME=[your rdfdb user name]
export bamboo_RDF4J_PASSWORD=[your rdfdb password]
export bamboo_EDL_HOST=[edl host name]
export bamboo_EDL_UID=[edl user id]
export bamboo_EDL_PASSWORD=[edl password]
export bamboo_CMR_BASE_URL=[cmr base url]
export bamboo_CORS_ORIGIN=[comma separated list of cors origins]
export bamboo_RDF4J_CONTAINER_MEMORY_LIMIT=[7168 for sit|uat, 14336 for prod]
export bamboo_RDF4J_INSTANCE_TYPE=["M5.LARGE" for sit|uat, "R5.LARGE" for prod]
export bamboo_RDF_BUCKET_NAME=[name of bucket for storing archived versions]
export bamboo_EXISTING_API_ID=[api id if deploying this into an existing api gateway]
export bamboo_ROOT_RESOURCE_ID=[see CDK_MIGRATION.md for how to determine]
./bin/deploy-bamboo.sh
```
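Because the script relies on many bamboo_* variables, a small preflight check can catch missing values before a deploy. This is a hypothetical bash helper, not part of the repository; extend the list with whichever variables your stage requires:

```
# Fail fast if any required bamboo_* variable is unset or empty (bash only).
for var in bamboo_STAGE_NAME bamboo_AWS_ACCESS_KEY_ID bamboo_AWS_SECRET_ACCESS_KEY \
           bamboo_VPC_ID bamboo_RDF4J_USER_NAME bamboo_RDF4J_PASSWORD; do
  if [ -z "${!var}" ]; then
    echo "Missing required environment variable: $var" >&2
    exit 1
  fi
done
```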