A thin wrapper around a vector-store-enriched conversation with an LLM chatbot.
See the infrastructure README for details.
- Populate `.env` with local environment variables.
- Run `npm install` in `frontend/` (or use `make setup`), then `npm run dev` to start a development server.
- Run `docker-compose up` in the repository root.
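Taken together, the setup steps above can be run as a command sequence like the following (a sketch: the `.env.example` template name is hypothetical, and the actual variable values are project-specific):

```shell
# 1. Local environment variables (create .env by hand if no template exists)
cp .env.example .env        # hypothetical template name; edit values as needed

# 2. Frontend dependencies and dev server
cd frontend
npm install                 # equivalently, from the repo root: make setup
npm run dev &               # starts the development server in the background

# 3. Backend services via Docker Compose, from the repository root
cd ..
docker-compose up
```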
We use Cypress for end-to-end testing.
- Populate `frontend/.env` with Cognito credentials.
- Start the frontend and backend servers.
- Run `npm run cypress:open` in `frontend/`.
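As one possible command sequence for the steps above (a sketch; it assumes the servers are started as described in the setup section, and that Cognito credential values are supplied separately):

```shell
cd frontend
# frontend/.env must already contain the Cognito credentials
npm run dev &               # frontend dev server, in the background
# start the backend in another terminal, e.g. docker-compose up from the root
npm run cypress:open        # opens the interactive Cypress test runner
```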
- Create a local virtualenv from `backend/requirements.txt` and `backend/requirements-dev.txt`.
- Run `pytest` (or `ptw backend/` for continuous test runs).
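A possible command sequence for the backend test steps, assuming Python 3, a POSIX shell, and the `.venv` directory name (any virtualenv location works):

```shell
# Create and activate a virtualenv (the .venv name is a convention, not required)
python3 -m venv .venv
source .venv/bin/activate

# Install runtime and development dependencies
pip install -r backend/requirements.txt -r backend/requirements-dev.txt

pytest                      # single test run
ptw backend/                # or: watch mode, re-runs tests on file changes
```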