Monday, July 11, 2016

Useful Linux Commands


Tarring:
 tar -pczf tar_file_name.tar.gz folder_name
Untarring:
 tar -zxvf tar_file_name.tar.gz
Zipping:
 zip -r zip_file_name.zip folder_name
Unzipping:
 unzip zip_file_name.zip -d folder_name
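As a quick sanity check, the tar commands above can be exercised end to end in a scratch directory (the file names below are just for illustration):

```shell
# Round-trip: create a folder, tar it, then untar it into another directory.
set -e
work=$(mktemp -d)
mkdir -p "$work/folder_name"
echo "hello" > "$work/folder_name/file.txt"

cd "$work"
tar -pczf tar_file_name.tar.gz folder_name    # create the archive
mkdir extracted
tar -zxvf tar_file_name.tar.gz -C extracted   # extract into ./extracted

cat extracted/folder_name/file.txt            # prints "hello"
```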

Create a symlink:
 ln -s target_file symlink_file_name
Remove a symlink:
 unlink link_name
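A minimal round trip with made-up file names, showing that unlink removes only the link, never the target:

```shell
# Create a symlink to a file, read through it, then remove the link.
set -e
dir=$(mktemp -d)
cd "$dir"
echo "data" > original.txt

ln -s original.txt link.txt   # link.txt now points at original.txt
readlink link.txt             # prints the link target: original.txt
cat link.txt                  # reads the target through the link

unlink link.txt               # removes the link; original.txt is untouched
```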

SCP

 scp rondovu_1_0_24_sep_2011.tar.gz root@dev.myserver.com:/var/www/html/
Then, on the remote host:
 tar -zxvf tar_file_name.tar.gz

Rsync
 rsync --partial --progress --rsh=ssh /home/user/tar_file_name.tar.gz root@dev.myserver.com:/var/www/tar_file_name.tar.gz
List the size of files/folders in a directory
 du --max-depth=1 -h ./ | sort -hr
(sort -h understands human-readable sizes like K/M/G; plain -n would mis-order them, e.g. putting 9.0K above 1.0M)
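A small demonstration against a throwaway directory (the `big`/`small` names are invented), showing that sort -h orders the larger entry first:

```shell
# du -h sizes sort correctly with sort -h (human-numeric), not plain -n.
set -e
dir=$(mktemp -d)
mkdir "$dir/big" "$dir/small"
dd if=/dev/zero of="$dir/big/file" bs=1024 count=2048 2>/dev/null   # ~2 MB
dd if=/dev/zero of="$dir/small/file" bs=1024 count=4 2>/dev/null    # ~4 KB

du --max-depth=1 -h "$dir" | sort -hr   # big appears above small
```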

Count the lines of code in a project
 find . -name '*.php' | xargs wc -l
 find . -name '*.js' | xargs wc -l
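One caveat: the plain find | xargs pipeline above breaks on filenames containing spaces. A more robust variant (shown here against a throwaway directory) uses null-delimited output:

```shell
# Count lines across all .php files, safely handling spaces in names.
set -e
dir=$(mktemp -d)
printf 'line1\nline2\n' > "$dir/a.php"
printf 'line1\n'        > "$dir/b copy.php"   # filename with a space

find "$dir" -name '*.php' -print0 | xargs -0 wc -l
```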

Search for a text in files with regex
 egrep "User|Group|SuexecUserGroup" /etc/apache2/apache2.conf /etc/apache2/sites-available/*.conf
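A self-contained illustration of the same alternation pattern against a scratch file (the Apache config paths above are the real use case; egrep is equivalent to grep -E):

```shell
# grep -E with an alternation pattern: match any of three directives.
set -e
conf=$(mktemp)
cat > "$conf" <<'EOF'
User www-data
Group www-data
ServerName example.com
EOF

grep -E "User|Group|SuexecUserGroup" "$conf"   # matches the first two lines
```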

Check system info (whether 32 or 64 bit)
 file /usr/bin/file

Mysql dump and restore
​​mysqldump -u root -p db_name > db_name.sql
mysqldump -u root -p --datbases db_name1 db_name2 db_name3 > db_name.sql
mysqldump -u root -p --all-databases > db_name.sql
mysql -u root -p db_name < db_name.sql
mysql --force -u root -p db_name < db_name.sql // To import without stopping when there are no errors

Analyse Apache Access Log to view traffic


To view requests per day
awk '{print $4}' access_log | cut -d: -f1 | uniq -c

To view requests per hour
grep "12/Jul/2016" access_log | cut -d[ -f2 | cut -d] -f1 | awk -F: '{print $2":00"}' | sort -n | uniq -c


To view requests per minute
grep "12/Jul/2016:11" access_log | cut -d[ -f2 | cut -d] -f1 | awk -F: '{print $2":"$3}' | sort -nk1 -nk2 | uniq -c | awk '{ if ($1 > 10) print $0}'
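The pipelines above are easiest to verify against a tiny hand-made log; the lines below mimic Apache's common log format with invented IPs and paths:

```shell
# Count requests per hour from a minimal Apache-style access log.
set -e
log=$(mktemp)
cat > "$log" <<'EOF'
1.2.3.4 - - [12/Jul/2016:11:03:21 +0000] "GET / HTTP/1.1" 200 512
1.2.3.4 - - [12/Jul/2016:11:47:02 +0000] "GET /a HTTP/1.1" 200 128
1.2.3.4 - - [12/Jul/2016:12:01:10 +0000] "GET /b HTTP/1.1" 404 64
EOF

# Same per-hour pipeline as above: two requests in hour 11, one in hour 12.
grep "12/Jul/2016" "$log" | cut -d[ -f2 | cut -d] -f1 \
  | awk -F: '{print $2":00"}' | sort -n | uniq -c
```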

If your logs are gzipped by logrotate, pipe them through zcat first:
zcat /site/mysite.com/logs/access.log.gz | awk '{print $4}' | cut -d: -f1 | uniq -c
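The zcat variant can be checked the same way, by gzipping a tiny hand-made log first:

```shell
# Same per-day count, but reading a gzipped log via zcat.
set -e
log=$(mktemp)
cat > "$log" <<'EOF'
1.2.3.4 - - [12/Jul/2016:11:03:21 +0000] "GET / HTTP/1.1" 200 512
1.2.3.4 - - [13/Jul/2016:09:15:00 +0000] "GET /a HTTP/1.1" 200 128
EOF
gzip "$log"   # produces $log.gz and removes the original

zcat "$log.gz" | awk '{print $4}' | cut -d: -f1 | uniq -c
```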