Quick Understanding on swap

Swap space is virtual memory that uses your HDD when you run out of RAM. The system swaps some of the contents of RAM out to the HDD (swap), then brings them back when required.

In the past, when RAM was very small (single-digit GB or less), the rule of thumb was to configure swap at two times the memory. But with the large memory available in modern servers, it may not be necessary to configure as much, since you only need as much swap as you intend to suspend to disk. I like to use between 16GB and 32GB of swap.
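To see how much swap is currently configured and in use, you can run, for example:

$ swapon --show
$ free -h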

To control the system's tendency to use swap, configure vm.swappiness in /etc/sysctl.conf. It is a value between 0 and 100 that controls how aggressively the kernel swaps memory pages out to disk. If you have lots of memory, you can lower it from the default of 60 to as low as 10.
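A minimal sketch of the change, assuming a value of 10 suits your workload:

# vim /etc/sysctl.conf
vm.swappiness = 10

Then apply and verify the setting without rebooting:

# sysctl -p
# cat /proc/sys/vm/swappiness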

Do take a look at Quick Understanding on Swap

Immersion Cooling Showcase – TACC Lonestar6 Supercomputing

As one of the world’s most successful and sustainable immersion-cooled data centers, TACC has to overcome the pressures every data center faces nowadays: increasing performance, trimming CapEx/OpEx, and developing a more sustainable operation. They turned to immersion cooling to overcome these pressures.

Watch the video Immersion Cooling Showcase – TACC Lonestar6 Supercomputing

Gaussian Error – $'\r': command not found

If you see errors like

/var/spool/pbs/mom_priv/jobs/729107.hpc-mn1.SC: line 2: $'\r': command not found
/var/spool/pbs/mom_priv/jobs/729107.hpc-mn1.SC: line 5: $'\r': command not found
/var/spool/pbs/mom_priv/jobs/729107.hpc-mn1.SC: line 8: $'\r': command not found
/var/spool/pbs/mom_priv/jobs/729107.hpc-mn1.SC: line 11: $'\r': command not found
/var/spool/pbs/mom_priv/jobs/729107.hpc-mn1.SC: line 16: $'\r': command not found
/var/spool/pbs/mom_priv/jobs/729107.hpc-mn1.SC: line 19: $'\r': command not found
/var/spool/pbs/mom_priv/jobs/729107.hpc-mn1.SC: line 22: $'\r': command not found

These errors are usually due to Windows-style line endings (carriage return characters) in the job script. Please use the command:

$ dos2unix yourfile

This will remove the Windows-style carriage returns and convert the file to Unix line endings.
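If you want to confirm that a file really has Windows line endings, or if dos2unix is not available, something like the following should also work (yourfile is a placeholder for your script):

$ file yourfile
$ sed -i 's/\r$//' yourfile

file reports “with CRLF line terminators” for affected files, and the sed one-liner strips the trailing carriage returns in place.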

Protecting Centrify Zones from accidental deletion on Active Directory

If you have been using Centrify for some time, you will know that Centrify stores Zones and other objects within Active Directory (AD), typically under an OU. One question always surfaces: how do we protect these objects from accidental deletion? There are two ways, and the first way is the easiest.

Method 1 (via the manual way, to protect a specific AD object from accidental deletion only):

  1. Ask your System Administrator or OU Administrator to open the “Active Directory Users and Computers” application.
  2. Navigate to your intended AD object (or any AD object, such as your ‘Zone’).
  3. Right-click on the AD object and select ‘Properties’.
  4. Click on the ‘Object’ tab.
  5. Check the ‘Protect object from accidental deletion’ checkbox.
  6. Click ‘Apply’ and then ‘OK’ to confirm the changes.

Method 2 (via PowerShell, to protect all objects under a specified OU from accidental deletion):

1) Ask your System Administrator to open up the ‘PowerShell’ application.

2) For the command below, modify the ‘distinguishedName’ (DN name) so that it points to the OU relevant to your domain. The command will set this for all objects in the specified OU:

    PowerShell: Get-ADObject -Filter * -SearchBase "{DN_Name}" | Set-ADObject -ProtectedFromAccidentalDeletion $true

Example command (for the Centrify ‘Zones’ OU):

    Get-ADObject -Filter * -SearchBase "CN=Zones,CN=Centrify,CN=Program Data,DC=win16org22,DC=pmm" | Set-ADObject -ProtectedFromAccidentalDeletion $true

(Take note: To obtain the DN name, right-click on your intended AD object > select ‘Properties’ > click on the ‘Attribute Editor’ tab > click on the ‘distinguishedName’ attribute > copy the DN name and paste it into the PowerShell command specified above.)

(Take note: This creates a “deny” on deletion for all the objects under the specified OU. Anyone who tries to delete one of these objects will generate an event, and the user will have to remove this permission before the object can be deleted.)

Preparing a Linux Client Server for Centrify and 2FA for CentOS-7

Preliminary Notes:

You have to set up a Cloud Tenant from Centrify by registering an email with Centrify or a Centrify Authorised Reseller.

Once the Tenant has been set up, the login link should be sent to the email you have provided.

You will need to set up the 2FA Connector VM on premises. The recommended specification for the connector is listed below. Port 443 should be opened for the VM.

  • 4 Core; 8GB RAM; 100 GB HDD; Windows 2016 or later

At the Active Directory

  1. Create UNIX computer group in AD if not already created
  2. Add the UNIX computers that will require 2FA to the UNIX group
  3. Create a UNIX Users group if not already created
  4. Add Users that will require 2FA to the UNIX user group
  5. Add the IWA root CA Certificate to the Centrify GPO. The IWA Certificate can be downloaded from the Centrify cloud, but the connector needs to be set up first before we can download the IWA Certificate.

At the CentOS Server

Copying the IwaTrustRoot.pem Certificate to the CentOS Linux Server

  1. Change the extension of the IWA certificate that was downloaded from .cer to .pem
  2. For CentOS, please copy the certificate to this location /etc/pki/ca-trust/source/anchors/ in the test server.
  3. Copy the cert to /var/centrify/net/certs as well (see the sketch after this list).
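A minimal sketch of these copy steps, assuming the downloaded certificate is named IwaTrustRoot.cer and sits in the current directory; the final update-ca-trust command refreshes the system CA bundle on CentOS:

# mv IwaTrustRoot.cer IwaTrustRoot.pem
# cp IwaTrustRoot.pem /etc/pki/ca-trust/source/anchors/
# cp IwaTrustRoot.pem /var/centrify/net/certs/
# update-ca-trust extract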

Configure the SSH settings

# vim /etc/ssh/sshd_config
# To disable tunneled clear text passwords, change to no here!
#PasswordAuthentication yes
#PermitEmptyPasswords no
PasswordAuthentication no


# Change to no to disable s/key passwords
#ChallengeResponseAuthentication yes
ChallengeResponseAuthentication yes
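Optionally, you can validate the configuration before restarting; sshd -t prints nothing if the file is syntactically correct:

# /usr/sbin/sshd -t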

Restart the sshd service

# systemctl restart sshd.service

Restart the Centrifydc services


# /usr/share/centrifydc/bin/centrifydc restart

Active Directory Flush

# adflush -f
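If you want to confirm that the agent is still joined to the domain after the flush, the Centrify adinfo command prints the host's join status and zone information:

# adinfo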

Can ChatGPT write Ansible playbooks that work?

This is an interesting article from OpenSource.com: Can ChatGPT write Ansible playbooks that work? I will just quote the conclusion from the author:

If you are trying to learn or if you have no clue about how to perform a certain programming task, ChatGPT can show you some examples that may or may not work the way you want. This can be useful in some situations, because if you search for examples in a search engine, you may find thousands of references that you need to evaluate, interpret, and test, versus having to do so for the single result ChatGPT provides. Reading manuals is always recommended, but sometimes you must read pages and pages until you find one applicable example.

ChatGPT is also useful if you just want a quick example to give you ideas or help you remember a module or function that you’ve already used before.

But I would not recommend you take anything provided by the AI and use it without fully understanding, validating, and testing it. Especially if you need to use it in a production environment. Well, this general advice is applicable to ANYTHING you find on the internet. I am just being obvious.

Can ChatGPT write Ansible playbooks that work?

SKT significantly speeds up launch of Korean ChatGPT

The article is taken from SKT doubles supercomputer capacity to speed up launch of Korean ChatGPT

SK Telecom Co. (SKT) has doubled the capacity of its supercomputer, which serves as the brain of its artificial intelligence service A. (A dot), as competition mounts to develop and launch a chatbot that relies on generative AI after the release of OpenAI’s ChatGPT.

SKT announced Sunday that “Titan,” the supercomputer that is the basis for its super-giant AI advancement since 2021, expanded its capacity to 1,040 NVIDIA A100 GPUs.

Titan supports performance of more than 17.1 petaflops. One petaflop corresponds to 1,000 trillion (one quadrillion) calculations per second, so 17.1 petaflops means the computer is capable of computing 17,100 trillion times per second.

SKT doubles supercomputer capacity to speed up launch of Korean ChatGPT

Red Hat CloudForms 5.0 will reach the end of its life as of March 12, 2023

Red Hat CloudForms 5.0 will reach the end of its life as of March 12, 2023, and there will be no other supported versions of CloudForms from Red Hat. After this date, technical or general support, updates, and security fixes will no longer be available. More information about Red Hat CloudForms can be found in the Red Hat Statement of Direction.

Nautilus starts slowly with errors

If you are having a slow startup and facing a Nautilus issue like this:

[user1@node1 ~]$ nautilus

** (nautilus:3369252): WARNING **: 14:40:58.988: Error on getting connection: Failed to load SPARQL backend: GDBus.Error:org.freedesktop.DBus.Error.NoReply: Message recipient disconnected from message bus without replying

(nautilus:3369252): GLib-GIO-CRITICAL **: 14:52:11.952: g_dbus_connection_signal_unsubscribe: assertion 'G_IS_DBUS_CONNECTION (connection)' failed

(nautilus:3369252): GLib-GObject-CRITICAL **: 14:52:11.952: g_object_unref: assertion 'G_IS_OBJECT (object)' failed

(nautilus:3369252): GLib-GObject-CRITICAL **: 14:52:11.952: g_object_unref: assertion 'G_IS_OBJECT (object)' failed

(nautilus:3369252): GLib-GObject-WARNING **: 14:52:11.952: invalid (NULL) pointer instance

(nautilus:3369252): GLib-GObject-CRITICAL **: 14:52:11.952: g_signal_connect_data: assertion 'G_TYPE_CHECK_INSTANCE (instance)' failed
Error creating proxy: Error calling StartServiceByName for org.gtk.vfs.GoaVolumeMonitor: Timeout was reached (g-io-error-quark, 24)

The solution is to clear the tracker cache in your home directory:

$ rm -rf .cache/tracker/

Log off and log on again. You should be able to run nautilus without issues.