Welcome to your Basic Node API Repository. Use this to start your own greenfield project using Node.js, Express, and common industry standards.
This repository assumes a handful of industry practices and standards. We strive to keep you on the bleeding edge of the industry, and as a result we have made some decisions for you so that you don't have to; you're welcome.
Read more at https://docs.labs.lambdaschool.com/labs-api-strarter/
Labs teams must follow all Labs Engineering Standards.
- `PORT` - API port (optional, but helpful with FE running as well). The following ports are whitelisted for use with Okta: 3000, 8000, 8080
- `DS_API_URL` - URL to a data science API (e.g. https://ds-bw-test.herokuapp.com/)
- `DS_API_TOKEN` - authorization header token for the data science API (e.g. SUPERSECRET)
- `DATABASE_URL` - connection string for the postgres database
- `OKTA_URL_ISSUER` - the complete issuer URL for verifying Okta access tokens, e.g. https://auth.lambdalabs.dev/oauth2/default
- `OKTA_CLIENT_ID` - the Okta client ID
- `TEST_DATABASE_URL` - the URL of the testing database
- `OKTA_ORG_URL` - the URL of your Okta organization
- `NODE_ENV` - the environment to use for knex scripts; should be development locally and production on the live server
- `AWS_ACCESS_KEY_ID` - a special access key from AWS
- `AWS_SECRET_ACCESS_KEY` - a secret AWS access key that should not be shared with anyone
- `CI_DATABASE_URL` - the URL of the web-hosted database (I used ElephantSQL) that your CI tests are run on. This must be set in order to use the `knex:ci` script in `package.json`
- `S3_BUCKET` - the name of the S3 bucket used for file storage
See `.env.sample` for example values.
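As a sketch of how these variables might be gathered in one place (the helper name, default port, and test/dev switch below are assumptions for illustration, not part of this template):

```javascript
// Hypothetical config helper that collects the environment variables described
// above. The default port and the test/dev database switch are assumptions.
function buildConfig(env = process.env) {
  return {
    // PORT should be one of the Okta-whitelisted ports: 3000, 8000, or 8080
    port: Number(env.PORT) || 8000,
    dsApiUrl: env.DS_API_URL,
    dsApiToken: env.DS_API_TOKEN,
    // Jest runs against TEST_DATABASE_URL; everything else uses DATABASE_URL
    databaseUrl: env.NODE_ENV === 'test' ? env.TEST_DATABASE_URL : env.DATABASE_URL,
    oktaIssuer: env.OKTA_URL_ISSUER,
    oktaClientId: env.OKTA_CLIENT_ID,
    s3Bucket: env.S3_BUCKET,
  };
}

module.exports = { buildConfig };
```

Centralizing the reads this way keeps `process.env` lookups out of the rest of the codebase.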
There are 3 options to get postgresql installed locally [choose one]:

- Use Docker. Install it for your platform, then:
  - run `docker-compose up -d` to start up the postgresql database and pgadmin.
  - Open a browser to pgadmin and you should see the Dev server already defined.
  - If you need to start over, delete the folder with `rm -rf ./data/pg`, as this is where all of the server data is stored.
  - If the database `api-dev` was not created, start over.
  - When using Docker, you will need to manually create your test database, called `api-test`.
- Download and install postgresql directly from the main site.
  - Make note of the port, username, and password you use to set up the database.
  - Connect your client to the server manually using the values previously mentioned.
  - You will need to create a database manually using a client.
  - Make sure to update the `DATABASE_URL` connection string with the values for username/password, database name, and server port (if not 5432).
  - Make sure you create `api-dev` to run any queries in Postman or through the swagger documentation, as well as a database called `api-test` to run the Jest tests.
- Set up a free account at ElephantSQL.
  - Sign up for a free Tiny Turtle plan.
  - Copy the URL to the `DATABASE_URL` .env variable.
  - Make sure to add `?ssl=true` to the end of this URL.
  - You'll need separate databases for Jest testing (`api-test`) and Postman/Swagger endpoint testing (`api-dev`).
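Whichever option you choose, the connection string has the same shape. The sketch below shows how the pieces mentioned above (username/password, database name, server port, and ElephantSQL's `?ssl=true`) fit together; the helper and credentials are illustrative, not part of the template:

```javascript
// Hypothetical helper assembling a postgres connection string from the values
// gathered during setup. Credentials and host here are placeholders.
function postgresUrl({ user, password, host = 'localhost', port = 5432, database, ssl = false }) {
  const query = ssl ? '?ssl=true' : ''; // ElephantSQL requires ?ssl=true
  return `postgres://${user}:${password}@${host}:${port}/${database}${query}`;
}

// e.g. a local DATABASE_URL value:
// postgresUrl({ user: 'postgres', password: 'postgres', database: 'api-dev' })

module.exports = { postgresUrl };
```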
- Create your project repo by forking or using this as a template.
- Run `npm install` to download all dependencies.
- Run `cp .env.sample .env` and update the environment variables to match your local setup.
- Run `npm run knex migrate:latest` to create the starting schema.
- Run `npm run knex seed:run` to populate your db with some data.
- Run `npm run tests` to confirm all is set up and tests pass.
- Run `npm run watch:dev` to start nodemon in the local dev environment.
Make sure to update the details of the app name, description, and version in the `package.json` and `config/jsdoc.js` files.
Be sure to read this: As of 3/22/21, this project is set up to have 3 databases. When working locally, we have our local DB instance, which only affects our local environment. Second, we have our "developer-mode" database (this DB updates according to the 'staging' branch). Altering this DB via queries etc. changes it for all other developers as well, but will not influence the production database. This means we can utilize it for testing bugs, simulating games, etc. Third, we have our production database.

The workflow for this project differs from some other projects: for the back-end repository we will be making PRs and merging to the 'staging' branch before we merge to main. This does two things: first, it allows us to keep our developer database up-to-date, since it auto-deploys on changes to 'staging'; second, it gives us the opportunity to test changes in the staging DB before sending the changes to our main back-end. It is crucial to keep this workflow, since the developer-mode database will need to function identically to our production database, allowing us to test bugs, simulate game stages, etc.
For information about developer mode, view the front end repository's documentation.
- A walkthrough video explaining some of the features can be seen here.
- View a detailed explanation of the S3 upload middleware here.
- The S3 upload middleware was designed to play nicely with the frontend UploadDocs component.
- Many of the database queries use `knex.transaction()`, something we didn't see in our time at Lambda.
- Endpoint request/response data is thoroughly documented using swagger, so once the server is running make sure to check out `localhost:8000/api-docs` to see all server capabilities.
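As an illustration of the `knex.transaction()` pattern mentioned above (the tables and helper here are hypothetical sketches of the pattern, not code from this repo):

```javascript
// Sketch of the knex.transaction() pattern: both inserts commit together or
// roll back together if either fails. `db` is assumed to be a configured knex
// instance; the profiles/documents tables are hypothetical.
async function createProfileWithDocument(db, profile, document) {
  return db.transaction(async (trx) => {
    const [id] = await trx('profiles').insert(profile).returning('id');
    await trx('documents').insert({ ...document, profile_id: id });
    return id; // only reached if both inserts succeed
  });
}

module.exports = { createProfileWithDocument };
```

If the callback throws, knex rolls the transaction back, so the two tables can never get out of sync.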