#04 - Building a MERN stack app
Part 1 - Setup
2023-01-15
Hello reader,
Today I will be helping you along with what should be a fairly straightforward tutorial to get a basic MERN stack setup, front and back.
You can think of this as a template you can reuse while still getting familiar with key concepts. I personally use this template to quickly spin up projects without the setup grind, and having repeated the process so many times, I have become fairly familiar with all the key concepts and syntax along the way.
So let’s begin coding our Book Store using a MERN Stack!
SETUP
Open up your terminal and navigate to the directory that you want to save your app.
```bash
mkdir NAME-OF-YOUR-APP
cd NAME-OF-YOUR-APP
```
Inside the root folder, we are going to create the client and the server.
There are many possible configurations, but this one lets me see the server and client as two separate places where their related files are consolidated. It also helps if you expect to use different web services to host the frontend and backend.
FRONTEND
So let’s create the frontend using React
```bash
npx create-react-app client
```
This will install a react app titled “client” within your root directory
Simple.
When it is completed, you can check if your installation of react is working as expected
```bash
cd client
npm start
```
npm start will launch a local server on your machine as well as open up a new tab where you should be able to see the React landing page.
BACKEND
The backend will be built using Node.js (to create the backend server and initialise connections), Express.js (to handle routing) and MongoDB (our cloud database, which we will set up later).
So let’s create our server directory by navigating back into our root folder.
```bash
cd ..
mkdir server
cd server
```
Inside server, we are going to create this directory structure
```
Root
├── client
└── server
    ├── models
    │   ├── users.js
    │   └── books.js
    ├── routes
    │   ├── routes.js
    │   └── auth.js
    ├── utils
    │   └── cloudinary.js
    ├── index.js
    ├── package.json
    ├── package-lock.json
    └── .env
```
Before you go and start creating this directory structure in your editor of choice, let us first initialise our node app.
```bash
npm init
```
This will create a package.json file. This file lists all the dependencies and runtime scripts that build the application.
But we need to install a few things to our backend to get started.
```bash
npm i bcrypt body-parser cloudinary cors dotenv express file-type jsonwebtoken mongoose next nodemon
```
First off, the “i” stands for install. Let’s take a quick look at what all these dependencies are:
bcrypt: A library used for password hashing and encryption. It provides a secure way to store and compare passwords by applying one-way hash functions.
body-parser: Middleware for parsing incoming request bodies in Node.js. It extracts data from request payloads and makes it accessible in the req.body object.
cloudinary: A cloud-based service for managing and manipulating media assets (images, videos, etc.). It provides features like image uploading, storage, transformations, and delivery through a simple API.
cors: Cross-Origin Resource Sharing (CORS) is a mechanism that allows web browsers to make cross-origin HTTP requests. The cors library provides middleware to enable CORS in a Node.js application, allowing controlled access to resources from different origins.
dotenv: A library for loading environment variables from a .env file into Node.js applications. It simplifies the process of managing configuration settings by providing a convenient way to store sensitive information separate from the codebase.
file-type: A library for detecting the file type based on its content or extension. It helps in identifying the MIME type of a file, which can be useful for validation, processing, or handling different file formats.
jsonwebtoken: A library for creating and verifying JSON Web Tokens (JWT). It enables secure communication and authentication by generating tokens that carry claims and are digitally signed, so the server can detect tampering.
mongoose: An Object Data Modeling (ODM) library for MongoDB and Node.js. It simplifies database interactions by providing a schema-based approach for modeling data and a rich set of features for querying, validating, and manipulating MongoDB documents.
next: A framework for building server-rendered React applications. It provides features like server-side rendering, routing, and code splitting, making it easier to develop high-performance and SEO-friendly React applications.
nodemon: A tool for automatically restarting a Node.js application whenever changes are detected in the project files. It improves the development workflow by eliminating the need to manually restart the server after every code modification.
By understanding these dependencies, we now have a decent idea of what our backend is meant to do. Now that these dependencies have been installed, you will additionally have a package-lock.json file.
You can now create the rest of the directory shown above for the server. Either through terminal or directly in your IDE, create the various folders and js files.
Once you’re done, let’s get a quick server running by starting with our index.js file
index.js
```javascript
const express = require('express')
const app = express()
const PORT = process.env.PORT || 4000

app.listen(PORT, () => console.log(`server is running on ${PORT}`))
```
Before we can start our server, we need to add a start script to our package.json file.
Make sure your package.json file looks like this. Notice the start script, which we will use in our terminal to start the server.
package.json
```json
{
  "name": "server",
  "version": "1.0.0",
  "description": "",
  "main": "index.js",
  "scripts": {
    "test": "echo \"Error: no test specified\" && exit 1",
    "start": "nodemon index.js"
  },
  "author": "",
  "license": "ISC",
  "dependencies": {
    "bcrypt": "^5.1.0",
    "body-parser": "^1.20.2",
    "cloudinary": "^1.37.0",
    "cors": "^2.8.5",
    "dotenv": "^16.1.4",
    "express": "^4.18.2",
    "file-type": "^18.5.0",
    "jsonwebtoken": "^9.0.0",
    "mongoose": "^7.2.2",
    "next": "^13.4.4",
    "nodemon": "^2.0.22"
  }
}
```
Then
```bash
npm start
# ---> server is running on 4000
```
If you saw that message, then great, everything is working. Next we will update index.js with everything else the backend section of the project needs.
index.js
```javascript
const express = require('express')
const mongoose = require('mongoose')
const dotenv = require('dotenv')
const cors = require('cors')
const bodyParser = require('body-parser')
const routesUrls = require('./routes/routes')

const app = express()
const PORT = process.env.PORT || 4000

dotenv.config()

mongoose.connect(process.env.DATABASE_ACCESS)
  .then(() => {
    console.log("DB connected")
  })
  .catch((error) => {
    console.error("Error connecting to MongoDB:", error)
  })

const corsOptions = {
  origin: ['https://name-of-your.app', 'http://localhost:3000'],
  credentials: true,
}

app.use(bodyParser.urlencoded({ extended: false }))
app.use(bodyParser.json())
app.use(cors(corsOptions))
app.use('/', routesUrls)

app.listen(PORT, () => console.log(`server is running on ${PORT}`))
```
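If the `app.use` chain is new to you: each middleware runs in the order it was registered and passes control to the next one. The plain-JavaScript sketch below (no Express involved; the function and handler names are made up for illustration) shows the idea:

```javascript
// Minimal sketch of how Express-style middleware chains work:
// each handler receives the request object and a next() callback,
// mutates the request, and passes control along the chain.
function runMiddleware(middlewares, req) {
  let index = 0;
  function next() {
    const mw = middlewares[index++];
    if (mw) mw(req, next);
  }
  next();
  return req;
}

// Analogous to bodyParser.json(): turn the raw body into req.body
const parseBody = (req, next) => {
  req.body = JSON.parse(req.rawBody);
  next();
};
// A toy logging middleware
const logRequest = (req, next) => {
  req.log = `handling ${req.url}`;
  next();
};

const req = runMiddleware(
  [parseBody, logRequest],
  { url: '/books', rawBody: '{"title":"Dune"}' }
);
console.log(req.body.title); // 'Dune'
console.log(req.log);        // 'handling /books'
```

This is why the order of `app.use` calls matters: body parsing must be registered before the routes that read `req.body`.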
Now let’s create our basic routes.
We will be creating User routes, Book routes and Auth routes.
routes.js
```javascript
// For simplicity we will put book and user routes in the same file,
// though it is normal to separate these routes into their own
// book-routes.js and user-routes.js files
const express = require('express')
const routes = express.Router()
const newUserTemplateCopy = require('../models/users')
const newBookTemplateCopy = require('../models/books')
const Books = require('../models/books')
const Users = require('../models/users')
const cloudinary = require('cloudinary')
const bcrypt = require("bcrypt")
const jwt = require("jsonwebtoken")
const auth = require("./auth")

// Index Routes
routes.get('/', (req, res) => {
  res.send('Hello world')
})

// User Routes
routes.post('/signup', (req, res) => {
  bcrypt
    .hash(req.body.password, 10)
    .then((hashedPassword) => {
      const user = new newUserTemplateCopy({
        name: req.body.name,
        surname: req.body.surname,
        email: req.body.email,
        password: hashedPassword,
        imageUrl: req.body.imageUrl,
        public_id: req.body.publicId
      })
      user
        .save()
        .then((result) => {
          res.status(201).send({
            message: "User Created Successfully",
            result,
          })
        })
        .catch((error) => {
          res.status(500).send({
            message: "Error creating user",
            error,
          })
        })
    })
    .catch((e) => {
      res.status(500).send({
        message: "Password was not hashed successfully",
        e,
      })
    })
})

routes.post('/login', (req, res) => {
  console.log("login route triggered")
  Users.findOne({ email: req.body.email })
    .then((user) => {
      if (!user) {
        return res.status(404).send({ message: "Email not found" })
      }
      bcrypt
        .compare(req.body.password, user.password)
        .then((passwordCheck) => {
          console.log("password check object:", passwordCheck)
          if (!passwordCheck) {
            return res.status(400).send({ message: "Passwords do not match" })
          }
          // Note: this secret should ideally live in your .env file;
          // it must match the one used in auth.js
          const token = jwt.sign(
            {
              userId: user._id,
              userEmail: user.email,
            },
            "RANDOM-TOKEN",
            { expiresIn: "24h" }
          )
          res.status(200).send({
            message: "Login Successful",
            email: user.email,
            userId: user._id,
            token,
          })
        })
        .catch((error) => {
          res.status(400).send({
            message: "Passwords do not match",
            error,
          })
        })
    })
    .catch((e) => {
      res.status(404).send({
        message: "Email not found",
        e,
      })
    })
})

routes.get('/user/show/:id', (req, res) => {
  const userId = req.params.id
  console.log("GET SINGLE USER RECORD:", userId)
  Users.findOne({ _id: userId })
    .then(data => res.json(data))
})

routes.put('/user/update/:id', auth, (req, res) => {
  const userId = req.params.id
  console.log("update user id route", userId)
  Users.updateOne({ _id: userId }, {
    name: req.body.name,
    surname: req.body.surname,
    email: req.body.email,
    imageUrl: req.body.imageUrl,
    public_id: req.body.publicId
  })
    .then(data => res.json(data))
})

// The public_id of the user's image is passed in the URL so we can
// also remove the image from Cloudinary
routes.delete('/user/delete/:id/:public_id', (req, res) => {
  const userId = req.params.id
  console.log(userId, ":delete route")
  Users.deleteOne({ _id: userId })
    .then(() => {
      console.log(`${userId} document deleted`)
      res.json({ msg: 'User deleted' })
    })
    .catch(() => {
      res.status(400).send(`Error deleting listing with id ${userId}!`)
    })

  cloudinary.config({
    cloud_name: process.env.CLOUD_NAME,
    api_key: process.env.CLOUD_API_KEY,
    api_secret: process.env.CLOUD_API_SECRET
  })
  const publicId = req.params.public_id
  console.log("cloudinary check public_id for delete:", publicId)
  cloudinary.v2.uploader
    .destroy(publicId)
    .then(result => console.log("cloudinary delete", result))
    .catch(_err => console.log("Something went wrong, please try again later."))
})

// Book Routes
// Cloudinary
routes.post('/book/upload', (req, res) => {
})

routes.post('/book/add', (req, res) => {
  const newBook = new newBookTemplateCopy({
    title: req.body.title,
    description: req.body.description,
    imageUrl: req.body.imageUrl,
    public_id: req.body.publicId,
    user: req.body.user
  })
  newBook.save()
    .then(data => {
      res.json(data)
      console.log("Send request successful:", data)
    })
    .catch(error => {
      res.json(error)
      console.log("Send request failed", error)
    })
})

routes.get('/books/show/:id', (req, res) => {
  const bookId = req.params.id
  console.log("GET SINGLE RECORD:", bookId)
  Books.findOne({ _id: bookId })
    .then(data => res.json(data))
})

routes.get('/books', (req, res) => {
  Books.find()
    .then(data => res.json(data))
})

routes.put('/book/update/:id', auth, (req, res) => {
  const bookId = req.params.id
  console.log(bookId, "update book id route")
  Books.updateOne({ _id: bookId }, {
    title: req.body.title,
    description: req.body.description,
    imageUrl: req.body.imageUrl,
    public_id: req.body.publicId
  })
    .then(data => res.json(data))
})

routes.delete('/book/delete/:id/:public_id/user/:user_id', auth, async (req, res) => {
  try {
    const bookId = req.params.id
    const book = await Books.findById(bookId)
    if (!book) {
      return res.status(404).json({ msg: 'Book not found' })
    }
    const bookUser = book.user.toString()
    const loggedInUser = req.params.user_id
    console.log("do these numbers match?:", bookUser, ":", loggedInUser)
    // Check if the user is allowed to delete the book
    if (bookUser !== loggedInUser) {
      return res.status(401).json({ msg: 'Not authorized to delete this book' })
    }
    await Books.deleteOne({ _id: bookId })

    cloudinary.config({
      cloud_name: process.env.CLOUD_NAME,
      api_key: process.env.CLOUD_API_KEY,
      api_secret: process.env.CLOUD_API_SECRET
    })
    const publicId = req.params.public_id
    console.log("cloudinary check public_id for delete:", publicId)
    cloudinary.v2.uploader
      .destroy(publicId)
      .then(result => console.log("cloudinary delete", result))
      .catch(err => console.log("Something went wrong, please try again later.", err))

    res.json({ msg: 'Book deleted' })
  } catch (err) {
    console.error(err.message)
    res.status(500).send('Server Error')
  }
})

module.exports = routes
```
You’ve probably figured out what these routes are doing. They are handling your basic CRUD methods. This file will also depend on your .env file to deliver sensitive access details to your app without exposing them to the web. We will work on that a little bit later.
You may be wondering how authorisation on these routes is being handled. That brings us to our next routes file, which we will call auth.js, in our routes folder.
auth.js
```javascript
const jwt = require("jsonwebtoken");

module.exports = async (req, res, next) => {
  try {
    const token = req.headers.authorization;
    // This must be the same secret used to sign the token in routes.js
    const decodedToken = jwt.verify(token, "RANDOM-TOKEN");
    req.user = decodedToken;
    next();
  } catch (error) {
    res.status(401).json({
      error: "Invalid request!",
    });
  }
};
```
This middleware verifies that a user is logged in: it checks that the token sent in the request's Authorization header was signed with our secret ("RANDOM-TOKEN" here, though in a real app this should live in your .env file). This way, we can ensure that only logged-in users can create, delete or update their own posts.
Great, let’s move onto our User and Book models. Your files should look like this
users.js
```javascript
const mongoose = require('mongoose')

const newUserTemplate = new mongoose.Schema({
  name: {
    type: String,
    required: true,
  },
  surname: {
    type: String,
    required: true
  },
  email: {
    type: String,
    required: false
  },
  password: {
    type: String,
    required: false
  },
  imageUrl: {
    type: String,
    required: true
  },
  public_id: {
    type: String,
    required: true
  }
})

module.exports = mongoose.model('usertable', newUserTemplate)
```
books.js
```javascript
const mongoose = require('mongoose')

const newBookTemplate = new mongoose.Schema({
  title: {
    type: String,
    required: true,
  },
  description: {
    type: String,
    required: true
  },
  id: {
    type: mongoose.Types.ObjectId,
    required: false
  },
  imageUrl: {
    type: String,
    required: true
  },
  public_id: {
    type: String,
    required: true
  },
  user: {
    type: mongoose.Schema.Types.ObjectId,
    // This must match the name the user model was registered under
    ref: 'usertable',
  }
})

module.exports = mongoose.model('booktable', newBookTemplate)
```
Now let’s complete the utils folder. Cloudinary will be our CDN for images and other media.
cloudinary.js
```javascript
const cloudinary = require('cloudinary')

// These variable names must match the ones in your .env file
cloudinary.config({
  cloud_name: process.env.CLOUD_NAME,
  api_key: process.env.CLOUD_API_KEY,
  api_secret: process.env.CLOUD_API_SECRET
});

module.exports = cloudinary;
```
At this point, most of your backend app (server) is set up, except for two things: we need a MongoDB account and a Cloudinary account.
From these we will get the API keys needed to use their services with our app.
MongoDB & Cloudinary
We have already installed the mongoose dependency, a JS library that will handle our connection to MongoDB, and done the same for Cloudinary. But both of these libraries require us to supply API keys in order for them to work successfully.
— video tutorial here: https://youtu.be/7CqJlxBYj-M?t=294
Once you have completed the setup and retrieved your API keys, go to your .env file (which you should create in the root of your server directory) and populate it in the format below.
.env
```bash
DATABASE_ACCESS=put-your-mongodb-access-url-here
CLOUDINARY_URL=put-your-cloudinary-access-url-here
CLOUD_NAME=put-your-cloudinary-name-here
CLOUD_API_KEY=put-your-cloudinary-api-key-here
CLOUD_API_SECRET=put-your-cloudinary-api-secret-here
```
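For reference, the DATABASE_ACCESS value is a MongoDB Atlas connection string, which generally follows this shape (every angle-bracketed part below is a placeholder you copy from your own Atlas dashboard, not a real value):

```
mongodb+srv://<username>:<password>@<cluster-host>.mongodb.net/<database>?retryWrites=true&w=majority
```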
If you entered the correct details, your backend app is just about set up and ready to receive requests from the frontend.
Let's see if everything is up and running.
```bash
npm start
```
That's it for part 1!
Expect a little update to this post soon as well as part 2 coming up next.
See you soon,
Ilia