Basic CURL Commands

curl is a command line tool to transfer data to or from a server. It can use any of its supported protocols, such as HTTP, FTP, IMAP, POP3, SCP, SFTP, SMTP, TFTP, TELNET, LDAP or FILE. This tool is very useful for automation, since it is designed to work without user interaction. Furthermore, curl can transfer multiple files at once.
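
Not every curl build supports all of these protocols. A quick way to check what your own build supports (just a sanity check, not one of the examples below):

% curl --version | grep -i protocols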

1. Basic Single URL Usage

% curl https://thelinuxcluster.com/

2a. Save the Downloaded File with a Preferred File Name

Save the downloaded file on the local machine with the file name given to the -o parameter.

% curl -o test.o https://thelinuxcluster.com/test.output

2b. Save the Downloaded File with its Remote File Name

% curl -O https://thelinuxcluster.com/test.output

2c. Download Multiple Files with Multiple -O Options

% curl -O https://thelinuxcluster.com/CentOS1.iso -O https://thelinuxcluster.com/CentOS2.iso -O https://thelinuxcluster.com/CentOS3.iso

3a. Display a Progress Meter

% curl --progress-bar -o test.o https://thelinuxcluster.com/test.output 

3b. Do Not Display a Progress Bar

% curl --silent -o test.o https://thelinuxcluster.com/test.output
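
Note that --silent also suppresses error messages. If you still want errors reported while hiding the progress meter, -S (--show-error) can be added alongside -s:

% curl -sS -o test.o https://thelinuxcluster.com/test.output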

4. Limit Rate of Data Transfer

% curl --limit-rate 1000K -o test.o https://thelinuxcluster.com/CentOS1.iso

5a. Uploading a File to the FTP Server

% curl -u username:userpassword -T myfile ftp://ftp.thelinuxcluster.com/

5b. Appending the File to the FTP Server

% curl -u username:userpassword -a -T myfile ftp://ftp.thelinuxcluster.com/

5c. Downloading a File from the FTP Server

% curl ftp://ftp.thelinuxcluster.com/CentOS79.iso --user username:userpassword -o myCentOS79.iso

6a. Verifying SSL Certificate

% curl --cacert server.crt https://thesupersecureserver.com

6b. Ignoring SSL Certificate

% curl -k https://thesupersecureserver.com/

7a. Proxy Server

% curl -x proxy_name:proxy_port https://thelinuxcluster.com

7b. Proxy Server which Requires Authentication

% curl --proxy-user username:userpassword -x proxy_name:proxy_port https://thelinuxcluster.com

8. Sending Email

% curl --url "smtps://smtp.thelinuxcluster.com:465" --ssl-reqd --mail-from "sender@thelinuxcluster.com" --mail-rcpt "receiver@thelinuxcluster.com" --upload-file maincontent.txt --user "sender@thelinuxcluster.com:password" --insecure
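
maincontent.txt is expected to contain the complete message, headers included. A minimal illustrative example (the addresses simply mirror the placeholders in the command above):

From: sender@thelinuxcluster.com
To: receiver@thelinuxcluster.com
Subject: Test mail sent with curl

This is the body of the test message.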

References:

  1. Learn to use CURL command with examples
  2. Curl command in Linux with Examples

Finding physical cpus, cores and logical cpus

Number of Active Physical Processors

% grep physical.id /proc/cpuinfo | sort -u | wc -l
8

Number of cores per CPU

% grep cpu.cores /proc/cpuinfo | sort -u
cpu cores       : 26

Therefore the total number of cores is
8×26 = 208 cores
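
If you prefer, the same arithmetic can be done in one line by combining the two greps above (a small sketch; it assumes the core count is the fourth field of the "cpu cores" line):

% echo $(( $(grep "physical id" /proc/cpuinfo | sort -u | wc -l) * $(grep "cpu cores" /proc/cpuinfo | sort -u | awk '{print $4}') ))
208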

Number of Logical Processors

This is the number of logical processors seen by Linux. Since the server has Hyper-Threading switched on, the logical count is twice the number of physical cores (2 × 208 = 416).

% grep processor /proc/cpuinfo | wc -l
416
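
As a cross-check, lscpu reports the same topology in one place. On the machine used in this example, the relevant lines would look something like this:

% lscpu | grep -E 'Thread\(s\) per core|Core\(s\) per socket|Socket\(s\)'
Thread(s) per core:    2
Core(s) per socket:    26
Socket(s):             8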

References:

  1. How to find the number of physical cpus, cpu cores, and logical cpus

Finding Top Processes using Highest Memory and CPU Usage in Linux

Read this article, Find Top Running Processes by Highest Memory and CPU Usage in Linux. It is a quick way to view the processes that consume the most RAM and CPU.

ps -eo pid,ppid,cmd,%mem,%cpu --sort=-%mem | head
   PID   PPID CMD                         %MEM %CPU
414699 414695 /usr/local/ansys_inc/v201/f 20.4 98.8
 30371      1 /usr/local/pbsworks/pbs_acc  0.2  1.0
 32241      1 /usr/local/pbsworks/pbs_acc  0.2  4.0
 30222      1 /usr/local/pbsworks/pbs_acc  0.2  0.6
  7191      1 /usr/local/pbsworks/dm_exec  0.1  0.8
 30595      1 /usr/local/pbsworks/pbs_acc  0.1  3.1
 30013      1 /usr/local/pbsworks/pbs_acc  0.1  0.3
 29602  29599 nginx: worker process        0.1  0.2
 29601  29599 nginx: worker process        0.1  0.3

The -o option specifies the output format. The -e option selects all processes. To sort in descending order of memory usage, use --sort=-%mem.

Interesting.
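
The same one-liner works just as well for finding CPU hogs; simply sort on %cpu instead of %mem:

ps -eo pid,ppid,cmd,%mem,%cpu --sort=-%cpu | head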

Checking the Limits Imposed on an Application During a Run

If you wish to look at the limits imposed on a specific application while it is running, you can do the following

pgrep fortcom
12345

* I used fortcom here, but it could be any application you wish to look at.

cat /proc/12345/limits
Limit                     Soft Limit           Hard Limit           Units
Max cpu time              unlimited            unlimited            seconds
Max file size             unlimited            unlimited            bytes
Max data size             unlimited            unlimited            bytes
Max stack size            8388608              unlimited            bytes
Max core file size        0                    unlimited            bytes
Max resident set          unlimited            unlimited            bytes
Max processes             4096                 2190327              processes
Max open files            1024                 4096                 files
Max locked memory         unlimited            unlimited            bytes
Max address space         unlimited            unlimited            bytes
Max file locks            unlimited            unlimited            locks
Max pending signals       2190327              2190327              signals
Max msgqueue size         819200               819200               bytes
Max nice priority         0                    0
Max realtime priority     0                    0
Max realtime timeout      unlimited            unlimited            us

* Note that Max locked memory and Max file locks are both unlimited.
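
Since a process normally inherits its limits from the shell that launched it, it can also be useful to compare against the shell's own soft and hard limits (bash shown here):

ulimit -Sa     # soft limits of the current shell
ulimit -Ha     # hard limits of the current shell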

Using Find and Tar Together to Backup and Archive

Point 1: If you wish to find files in a single folder and tar them into a gzip-compressed archive, you can use a one-liner to do it.

find -maxdepth 1 -name '*.sh' | tar czf script.tgz -T -

“-maxdepth 1” restricts the search to a depth of 1, i.e. the current directory only

“-T -” causes tar to read its list of files from a file rather than the command line. The “-” means standard input.

You should now have a file named script.tgz
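
To confirm what actually went into the archive, you can list its contents without extracting it:

tar tzf script.tgz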


Point 2: If you wish to find files in a single folder and tar them into a bzip2-compressed archive, use the j flag instead.

find -maxdepth 1 -name '*.sh' | tar cjf script.tbz -T -

Checking Disk Usage within the subfolders

I like this command, which I often use to look into the disk usage at the subfolder level to check for large usage

du -h -d 1
1.3M    ./Espresso-BEEF
65M     ./MATLAB
478M    ./Abaqus
10G     ./COMSOL
8.3M    ./Gaussian2
316K    ./scripts
4.9M    ./NB07
647M    ./pytorch-GAN
31M     ./Gaussian
12G     .

where

  • -h refers to human-readable
  • -d refers to the depth level; here -d 1 reports one level of subdirectories. By default, it is 0, which is the same as --summarize
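
To spot the biggest subfolders quickly, the same output can be piped through sort in human-numeric mode (GNU sort assumed):

du -h -d 1 | sort -h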