Friday, April 19, 2019

killall command and signals.

killall sends a signal (SIGTERM, unless a signal name is specified explicitly) to all processes running any of the specified commands.

[Syntax]
killall [options] command

killall supports regular expression matching of process names, via the -r option.

If the specified command contains slash (/) characters, the command is interpreted as a file name and processes running that particular file will be selected as signal recipients.

To make killall interactive, pass the -i option, which makes killall ask for confirmation before sending each signal.


Signals are software interrupts sent to a program to indicate that an important event has occurred.

You can use the -s SIGNAL, --signal SIGNAL, or -SIGNAL options to send a specific signal.

[Example]
killall -s 9 chrome
killall -SIGKILL python

Some common signals :
SIGINT 2 - Issued if the user sends an interrupt signal (Ctrl + C).
SIGQUIT 3 - Issued if the user sends a quit signal (Ctrl + \).
SIGKILL 9 - If a process gets this signal it must quit immediately and will not perform any clean-up operations.
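A minimal sketch of the default SIGTERM behaviour, assuming the psmisc killall and a throwaway sleep process (any other sleep processes on the machine would also receive the signal):

```shell
# Start a throwaway process, then signal every process with that name.
sleep 300 &
pid=$!
killall sleep            # no signal given, so SIGTERM (15) is sent
wait "$pid"
echo "sleep exited with status $?"   # 128 + 15 = 143 indicates death by SIGTERM
```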

Thursday, March 14, 2019

prettify json file in vim.

:%!python -m json.tool
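The :%! prefix simply filters the whole buffer through an external command, so the same formatter works outside vim as well (the /tmp path here is just an example):

```shell
# Pretty-print a JSON file from the shell with the same module.
printf '{"b": 2, "a": 1}' > /tmp/mini.json
python3 -m json.tool /tmp/mini.json
```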

Thursday, December 6, 2018

read constantly updating file

tail -F foo.txt

[options]
-F: follows the file by name (rather than by file descriptor) and keeps retrying to open it if it is not present, which makes it robust to log rotation.
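A self-contained way to see -F in action (the /tmp path is illustrative; timeout just stops the follow so the demo ends):

```shell
# Append to the file from a background job while tail follows it by name.
echo "first" > /tmp/demo.log
( sleep 1; echo "second" >> /tmp/demo.log ) &
timeout 3 tail -F /tmp/demo.log || true   # timeout ends the follow for the demo
```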

Thursday, November 15, 2018

delete column of csv

File: tmp.csv

Content:
a,b,c
1,2,3
10,20,30

Task: Remove 1st column.

Expected Output:
b,c
2,3
20,30

Command:
cut -d, -f1 --complement tmp.csv > new_tmp.csv

Options:
-d : delimiter
-f : fields to select
--complement: complement the set of selected bytes, characters or fields

Summary:
The command above selects the 1st column using , as the delimiter, complements that selection (keeping every other field), and saves the output to new_tmp.csv.
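The example can be reproduced end to end (paths under /tmp are illustrative; note that --complement is a GNU coreutils extension, so on BSD cut you would select the kept fields directly with -f2-):

```shell
# Recreate the sample CSV and drop its first column.
printf 'a,b,c\n1,2,3\n10,20,30\n' > /tmp/tmp.csv
cut -d, -f1 --complement /tmp/tmp.csv
```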

Sunday, October 21, 2018

less command

less is a Linux utility that displays the contents of a text file page by page.

[Syntax]
less [OPTIONS] [FILENAME]

[Options]
-N : Show line numbers.

[Navigation]
g - Top
G - Bottom
f - Forward One Window
b - Backward One Window
q - Exit

[Search]
/ - Forward Search
? - Backward Search
n - Next Occurrence
N - Previous Occurrence

[Example]
~ less foo.txt
~ ls | less -N
   1. foo.txt
   2. bar.txt

Tuesday, September 11, 2018

Redirect output of echo to root owned files

sudo and echo do not combine the way you might expect: the shell performs the redirection before sudo runs, so the target file is opened with your own privileges. To redirect the output of echo to a root-owned file, run the whole command in a root shell:

sudo sh -c "echo 'bar' > /etc/foo.conf"
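An equally common idiom is to pipe into sudo tee, which opens the target file itself as root. Sketched here on a writable /tmp path so it runs without privileges:

```shell
# With a root-owned target you would write:
#   echo 'bar' | sudo tee /etc/foo.conf > /dev/null
# tee opens the file itself, so the write happens with tee's (elevated)
# privileges; > /dev/null just silences tee's copy to stdout.
echo 'bar' | tee /tmp/foo.conf > /dev/null
cat /tmp/foo.conf
```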

Wednesday, August 1, 2018

resume wget download

[Command]
wget -c https://foo[.]com/bar.mp4

To resume an interrupted wget download, use the -c flag, which continues from where the download stopped last time.

[Options]
-c, --continue: continue getting a partially-downloaded file. This is useful when you want to finish up a download started by a previous instance of Wget, or by another program.
If there is a file named bar.mp4 in the current directory, Wget will assume that it is the first portion of the remote file, and will ask the server to continue the retrieval from an offset equal to the length of the local file.

Tuesday, July 31, 2018

sudo bang bang 🔫

sudo !! (pronounced sudo bang bang)
!! is used to repeat the last command.

If you forgot to give root privileges to a command, instead of pressing the up arrow, moving to the beginning of the line, and typing sudo, you can just run sudo !!




Tuesday, July 24, 2018

saving linux man page.

A man page (short for manual page) is a form of software documentation usually found on a Unix or Unix-like operating system.
The standard location of man pages is /usr/share/man, with man sections as subdirectories (e.g. man1 for user commands, man2 for system calls, man3 for library functions, etc.)

Man page files are stored in the gzip compressed file format!
Gzip reduces the size of the named files using Lempel-Ziv coding (LZ77). Whenever possible, each file is replaced by one with the extension .gz, while keeping the same ownership modes, access and modification times.

We can use zcat to decompress these files!
zcat uncompresses either a list of files on the command line or its standard input and writes the uncompressed data on standard output. zcat will uncompress files that have the correct magic number whether they have a .gz suffix or not.

zcat /usr/share/man/man1/ls.1.gz
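If ls.1.gz is not at that path on your system, zcat's behaviour is easy to check on any gzip file (the /tmp path is illustrative):

```shell
# zcat decompresses gzip data to standard output, .gz suffix or not.
echo "hello" | gzip > /tmp/hello.gz
zcat /tmp/hello.gz
```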

man pages are in troff format!
troff features commands to designate fonts, spacing, paragraphs, margins, footnotes and more. Unlike many other text formatters, troff can position characters arbitrarily on a page, even overlapping them, and has a fully programmable input language.

We can convert troff into pdf format by using groff!
Groff (GNU troff) is a typesetting system that reads plain text mixed with formatting commands and produces formatted output. The output may be PostScript or PDF, HTML, or ASCII/UTF8 for display at the terminal.

[Final Command]
zcat /usr/share/man/man1/ls.1.gz | groff -mandoc  > lsmanpage.pdf

Sunday, July 22, 2018

nload

nload is a console application which monitors network traffic and bandwidth usage in real time. It visualizes the in- and outgoing traffic using two graphs and provides additional info like the total amount of transferred data and min/max network usage.

*You can switch between the devices by pressing the left and right arrow keys.

[Example]
~ nload
~ nload wlp6s0


Friday, July 20, 2018

Number Lines

nl writes each FILE to standard output, with line numbers added.
With no FILE, or when FILE is -, it reads standard input.

[Example]
~ nl population.txt
    1  China
    2  India
    3  United States 

~ nl -s : population.txt
    1:China
    2:India
    3:United States
~ nl -n rz --number-width=4 population.txt
    0001  China
    0002  India
    0003  United States 


[Options]

-s : separator
-n : number format
ln - left justified, no leading zeros
rn - right justified, no leading zeros
rz - right justified, leading zeros
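The examples above can be reproduced with a throwaway file (the /tmp path is illustrative):

```shell
# Recreate the sample file and number its lines with a ':' separator.
printf 'China\nIndia\nUnited States\n' > /tmp/population.txt
nl -s : /tmp/population.txt
```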
 


Wget

GNU Wget is a free utility for non-interactive download of files from the Web.
It supports HTTP, HTTPS, and FTP protocols, as well as retrieval through HTTP proxies.

Wget has been designed for robustness over slow or unstable connections; if a download fails it will keep retrying until the whole file has been retrieved. If the server supports regetting, it will instruct the server to continue the download from where it left off.

[Example]

wget http://foo[.]com/bar[.]mp4
Download bar.mp4 into the current directory.

wget -i urls.txt

Download each URL listed in the file urls.txt.

wget -nc http://foo[.]com/bar[.]mp4
Download bar.mp4 only if it is not already present in the current directory.


wget -O tmp[.]mp4 http://foo[.]com/bar[.]mp4

Download bar.mp4 as tmp.mp4 in current directory.



[Options]

-i : Read URLs from a local or external file.
-nc : Do not retrieve file if it already exists in current folder.
-O : Save output as.