Sometimes you have multiple code repositories that you always end up checking out together and deploying together. Or maybe separating your code by functionality seemed like a good organizational idea, but in practice it has become cumbersome.
With a few commands, you can merge the multiple repositories into one repository while keeping their history. You can also keep each former repository in its own subdirectory, thus maintaining the organization of your code, but use one repository to facilitate development, branching, and deployments.
Example list of current separate repositories:
ls
/local_git/old_project_1
/local_git/old_project_2
Update your local repositories with the latest code
Pull and Commit/Push any changes
Create a new repository for the combined project and push
cd /local_git
mkdir new_combined_project
cd new_combined_project
git init .
touch README.md
git add README.md
git commit -m "add readme"
git remote add origin https://github.com/your_repo_url
git push -u origin master
Add the first separate repository to the new combined repository
cd /local_git/new_combined_project
git remote add old_project_1 ../old_project_1
List the configured remotes
git remote -v
old_project_1 ../old_project_1 (fetch)
old_project_1 ../old_project_1 (push)
origin https://github.com/your_repo_url (fetch)
origin https://github.com/your_repo_url (push)
Note: If you added the wrong path to your local repository, you can remove the remote entry
git remote remove old_project_1
Fetch the branches and tags of the first separate repository
git fetch old_project_1 --tags
Merge the files and histories for the first separate repository
git merge --allow-unrelated-histories old_project_1/master
You should have a list of files and directories from the first separate repository
ls
your files from old_project_1
Optionally, create a subdirectory to move the files into.
cd /local_git/new_combined_project
mkdir old_project_1
Move the files and folders into the new nested directory
git mv !(old_project_1|old_project_2) old_project_1
Note: If you just mv or cut/paste the files into the new directory, git may not persist the history for those files.
Note: !(old_project_1|old_project_2) is a bash extended glob that matches everything except the listed directories/files
Note: If you get a syntax error near the !, enable the extglob shell option first
shopt -s extglob
Your directories/files from the first separate repository should now be in
/local_git/new_combined_project/old_project_1
and you should have the git history for old_project_1
Consolidated commands, repeated for the second separate repository
cd /local_git/new_combined_project
ls
git remote add old_project_2 ../old_project_2
git remote -v
git fetch old_project_2 --tags
git merge --allow-unrelated-histories old_project_2/master
mkdir old_project_2
git mv !(old_project_1|old_project_2) old_project_2
ls
ls old_project_2
Your multiple separate repositories are now merged into one repository, with their history. After verifying, by cloning into a new directory and viewing its history, you can remove the prior separate repositories.
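As a rough end-to-end check, the whole flow above can be scripted against two throwaway repositories. Everything below (the /tmp paths, file names, and commit messages) is made up for the sketch, and it assumes bash and git are available:

```shell
#!/bin/bash
set -e
shopt -s extglob
# identity for the throwaway commits
export GIT_AUTHOR_NAME=demo GIT_AUTHOR_EMAIL=demo@example.com
export GIT_COMMITTER_NAME=demo GIT_COMMITTER_EMAIL=demo@example.com

rm -rf /tmp/merge_demo && mkdir -p /tmp/merge_demo && cd /tmp/merge_demo

# two "separate" repositories with one commit each
for p in old_project_1 old_project_2; do
  git init -q "$p"
  ( cd "$p"
    git symbolic-ref HEAD refs/heads/master  # force a master branch on any git version
    echo "hello from $p" > "$p.txt"
    git add . && git commit -qm "initial commit in $p" )
done

# the new combined repository
git init -q new_combined_project
cd new_combined_project
git commit -q --allow-empty -m "add readme placeholder"

for p in old_project_1 old_project_2; do
  git remote add "$p" "../$p"
  git fetch -q "$p" --tags
  git merge -q --allow-unrelated-histories -m "merge $p" "$p/master"
  mkdir "$p"
  git mv !(old_project_1|old_project_2) "$p"  # move everything except the target dirs
  git commit -qm "move $p files into $p/"
done

# --follow traces the file's history back across the git mv rename
git log --oneline --follow old_project_1/old_project_1.txt
```

The final git log --follow should list both the "move" commit and the original commit from the separate repository, confirming the history survived the merge and the git mv.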
Reference: https://stackoverflow.com/a/10548919
-End of Document-
Thanks for reading
Dotenv is a zero-dependency module that loads environment variables from a .env file into process.env. Storing configuration in the environment separate from code is based on The Twelve-Factor App methodology.
Reference: https://github.com/motdotla/dotenv
While you can use NodeJS Dotenv to manage configuration per environment,
another approach is to create a configuration file per environment, and use an environment marker file to distinguish which configuration to use.
Why?
It is useful to actually version control your configuration, so every developer and every instance of your application has the same configuration keys and values.
An example of how to setup a production and staging environment follows.
Create the environment file as root to minimize the odds of the environment being removed or changed
> cd /home/yourapp/
> sudo touch env-prod
> cd /home/yourappstg/
> sudo touch env-stg
Helper scripts to start and restart your NodeJS Forever service
Note: [ -f "env-stg" ] returns true if the file exists
> start-yourapp.sh
#!/bin/bash
if [ -f "env-stg" ]; then
forever start -a --minUptime 1000 --spinSleepTime 2000 --uid yourapp-stg yourapp.js
else
forever start -a --minUptime 1000 --spinSleepTime 2000 --uid yourapp yourapp.js
fi
> restart-yourapp.sh
#!/bin/bash
if [ -f "env-stg" ]; then
forever restart yourapp-stg
else
forever restart yourapp
fi
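If you want to sanity check the [ -f "env-stg" ] test outside of the Forever scripts, here is a quick throwaway-directory sketch (the directory and echoed strings are placeholders):

```shell
# create a scratch directory and simulate the staging marker file
mkdir -p /tmp/env_demo && cd /tmp/env_demo
touch env-stg
# with the marker present the staging branch is taken
if [ -f "env-stg" ]; then echo "staging"; else echo "production"; fi   # prints "staging"
rm env-stg
# without the marker the production branch is taken
if [ -f "env-stg" ]; then echo "staging"; else echo "production"; fi   # prints "production"
```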
Create a configuration file per environment, ensuring that each configuration has the same keys, and varying the values as appropriate.
> config/config-stg.js
module.exports = {
port : 9011,
log : {
console : { level : 'silly' }
}
};
> config/config-prod.js
module.exports = {
port : 9001,
log : {
console : { level : 'error' }
}
};
Create a base configuration script to read in the appropriate configuration file
An example for a NodeJS process
> config/config.js
const path = require('path');
const fs = require('fs');
let env;
// check for env-stg or env-prod file
if (fs.existsSync('env-stg')) {
env = 'stg';
} else {
env = 'prod';
}
const configPath = path.resolve(process.cwd(), `config/config-${env}`);
const config = require(configPath);
// visual validation of correct env
console.log('Using config ' + configPath);
module.exports = config;
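A rough way to see config.js pick the right file is to rebuild a minimal copy of the layout in a scratch directory. This assumes node is on the PATH; the config files are trimmed down to just the port from the examples above:

```shell
# scratch copy of the config layout
mkdir -p /tmp/config_demo/config && cd /tmp/config_demo
cat > config/config-stg.js <<'EOF'
module.exports = { port: 9011 };
EOF
cat > config/config-prod.js <<'EOF'
module.exports = { port: 9001 };
EOF
# condensed version of the base config.js shown above
cat > config/config.js <<'EOF'
const path = require('path');
const fs = require('fs');
const env = fs.existsSync('env-stg') ? 'stg' : 'prod';
module.exports = require(path.resolve(process.cwd(), `config/config-${env}`));
EOF
touch env-stg
node -e "console.log(require('./config/config').port)"   # staging -> 9011
rm env-stg
node -e "console.log(require('./config/config').port)"   # production -> 9001
```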
An example for a VueJS/Nuxt process
Because VueJS/Nuxt code is also bundled for the browser,
create another env file to be required, which avoids build warnings and errors
> sudo vi env
module.exports = {
env: 'stg'
}
Add configuration as needed
> nuxt.config-stg.js
module.exports = {
server: { port: 9013 },
};
> nuxt.config-prod.js
module.exports = {
server: { port: 9003 },
};
Create a base configuration script to read in the appropriate configuration file.
> nuxt.config.js
const path = require('path');
// check for env file
let env;
try {
// (base)/
// require ('./env-stg');
// using exports, as when require from .vue, build causes warning 'Module not found: Error: Can't resolve './env-stg''
env = require ('./env');
env = env.env;
} catch (e) {
// default prod
env = 'prod';
}
// check for env based nuxt config when called from different relative paths
let configPath;
let config;
try {
// (base)/
configPath = `./nuxt.config-${env}.js`;
// config = require(configPath); // on build, results in warning 'Critical dependency: the request of a dependency is an expression'
config = require(`./nuxt.config-${env}.js`);
} catch (e) {
try {
// (base)/server/
configPath = `../nuxt.config-${env}.js`;
config = require(`../nuxt.config-${env}.js`);
} catch (e) {
// (base)/pages/dir/
configPath = `../../nuxt.config-${env}.js`;
config = require(`../../nuxt.config-${env}.js`);
}
}
// visual validation of correct env
console.log('Building nuxt using ' + configPath);
module.exports = config;
Now you can check in the configuration files, but do not check in the .env, env, env-stg, or env-prod files (add them to .gitignore), as those should vary based on the deployed environment.
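One possible .gitignore addition covering the marker files named in the examples above (written into a scratch directory here; adjust the names to your setup):

```shell
# scratch directory so we do not touch a real repo
mkdir -p /tmp/gitignore_demo && cd /tmp/gitignore_demo
# append the environment marker files to .gitignore
cat >> .gitignore <<'EOF'
.env
env
env-stg
env-prod
EOF
cat .gitignore
```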
-End of Document-
Thanks for reading
The purpose of NodeJS Forever is to keep a child process (such as your NodeJS web server) running continuously and to restart it automatically when it exits unexpectedly. Forever essentially lets you run your NodeJS application as a managed background process.
Reference: https://stackoverflow.com/a/32944853
A simple CLI tool for ensuring that a given script runs continuously (i.e. forever)
https://github.com/foreversd/forever#readme
A simple example to start and manage Forever
> forever start -a --minUptime 1000 --spinSleepTime 2000 --uid yourapp-stg yourapp.js
-a append to the existing logs
--minUptime 1000 minimum uptime (in ms) for the script to not be considered "spinning"
--spinSleepTime 2000 time to wait (in ms) between launches of a spinning script
--uid yourapp-stg name the Forever process
List all running Forever processes
> forever list
info: Forever processes running
data: uid command script forever pid id logfile uptime
data: [0] yourapp-stg /usr/bin/node start.js 1668 23197 /home/yourapp/.forever/yourapp.log 0:1:20:14.94
You can restart and stop by uid (the name) or by index.
Note: the index is assigned incrementally, so it may not always be the same number.
> forever restart yourapp-stg
> forever restart 0
> forever stop yourapp-stg
> forever stop 0
And since you may not want to type or remember all these options, create some helper shell scripts
> start-yourapp.sh
#!/bin/bash
forever start -a --minUptime 1000 --spinSleepTime 2000 --uid yourapp-stg yourapp.js
> restart-yourapp.sh
#!/bin/bash
forever restart yourapp-stg
While Forever will keep your NodeJS process running, it will not start it again after a reboot.
One simple method to ensure your NodeJS app runs after a reboot is to add a crontab entry that starts your Forever process.
Create a crontab entry as the user your app runs as
> crontab -e
@reboot /bin/sh /home/yourapp/crontab-reboot.sh
And create the reboot script
> crontab-reboot.sh
#!/bin/bash
# export path to NodeJS, Forever
export PATH=/usr/local/bin:$PATH
# cd to location of script
cd /home/yourapp || exit
# run script, in this case Forever
forever start -a --minUptime 1000 --spinSleepTime 2000 --uid yourapp start.js
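To dry-run the reboot script on a machine where Forever is not installed, you can put a stub forever script first on the PATH; all the paths below are sandbox placeholders:

```shell
# stub `forever` that just echoes the arguments it receives
mkdir -p /tmp/reboot_demo/bin /tmp/reboot_demo/home
cat > /tmp/reboot_demo/bin/forever <<'EOF'
#!/bin/bash
echo "forever $*"
EOF
chmod +x /tmp/reboot_demo/bin/forever

# same shape as crontab-reboot.sh, pointing at the sandbox paths
cat > /tmp/reboot_demo/home/crontab-reboot.sh <<'EOF'
#!/bin/bash
export PATH=/tmp/reboot_demo/bin:$PATH
cd /tmp/reboot_demo/home || exit
forever start -a --minUptime 1000 --spinSleepTime 2000 --uid yourapp start.js
EOF
chmod +x /tmp/reboot_demo/home/crontab-reboot.sh

# invoke it the way cron would
/bin/sh /tmp/reboot_demo/home/crontab-reboot.sh
```

The stub prints back the exact forever command line, so you can confirm the PATH export and the cd both work before relying on @reboot.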
So now your application will run .. Forever .. yup.
-End of Document-
Thanks for reading
MongoDB is a cross-platform document-oriented database program.
Classified as a NoSQL database program, MongoDB uses JSON-like documents with optional schemas.
Source: Wikipedia
MongoDB 4.0 was released on 2018-08-06
New 'wow' features of MongoDB 4.0:
- multi-document ACID transactions
- data type conversions
- 40% faster shard migrations
- non-blocking secondary replica reads
And some other niceties:
- native visualizations with MongoDB Charts
- Compass aggregation pipeline builder
- Stitch serverless platform
- SHA-2 authentication
- Mobile database
- HIPAA compliance for the MongoDB Atlas database service
- free community monitoring service
- Kubernetes integration
While the new features in MongoDB 4.0 are great,
the official repository of the latest Ubuntu release (18.04) still installs MongoDB version 3.6
To get the current version of MongoDB
> mongo --version
To install MongoDB version 4.0, you need to install from MongoDB's repository.
Instructions to install MongoDB 4.0 and some hurdles I encountered follow:
1) Add the MongoDB repo
> sudo vi /etc/apt/sources.list.d/mongodb-org-4.0.list
deb [arch=amd64] https://repo.mongodb.org/apt/ubuntu bionic/mongodb-org/4.0 multiverse
2) Add the MongoDB repo key
> sudo apt-key adv --keyserver hkp://keyserver.ubuntu.com:80 --recv 9DA31620334BD75D9DCB49F368818C72E52529D4
3) Update your system
> sudo apt-get update
4) Install MongoDB 4.0
> sudo apt-get install mongodb-org
5) Status and restart MongoDB
> sudo systemctl status mongod
> sudo systemctl restart mongod
Hurdles:
If MongoDB does not start, there may be some issues with removing the prior MongoDB version.
Errors I encountered:
error processing archive /var/cache/apt/archives/mongodb-org-server_4.0.10_amd64.deb (--unpack):
error trying to overwrite '/usr/bin/mongod', which is also in package mongodb-server-core 1:3.6.3-0ubuntu1.1
error trying to overwrite '/usr/bin/mongos', which is also in package mongodb-server-core 1:3.6.3-0ubuntu1.1
error trying to overwrite '/usr/bin/bsondump', which is also in package mongo-tools 3.6.3-0ubuntu1
Some potential fixes
> sudo apt --fix-broken install
This by itself did not help
Remove prior MongoDB and other unused packages
> sudo apt autoremove
This did fix the issue and allow me to run MongoDB 4.0
To get version of MongoDB
> mongo --version
Also, note that the daemon binary is mongod, not mongodb. If you accidentally run the daemon directly
> sudo mongod
MongoDB will start as that user, often root when run with sudo,
which may cause some MongoDB files to be created as root.
You may have to reset user/group permissions
> sudo chown -R mongodb:mongodb /data/mongodb/
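A no-sudo sketch of how you might verify ownership afterwards: against the real data directory the check would be find /data/mongodb/ ! -user mongodb, which should print nothing once the chown is done. The /tmp path and file name below are placeholders:

```shell
# scratch data directory standing in for /data/mongodb/
mkdir -p /tmp/mongo_demo/data && touch /tmp/mongo_demo/data/WiredTiger.wt
# list any files NOT owned by the expected user; empty output means ownership is correct
find /tmp/mongo_demo/data ! -user "$(id -un)"
```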
-End of Document-
Thanks for reading