CP2K

CP2K is a quantum chemistry and solid-state physics software package that can perform atomistic simulations of solid-state, liquid, molecular, periodic, material, crystal, and biological systems. CP2K provides a general framework for different modeling methods, such as DFT using the mixed Gaussian and plane waves approaches (GPW and GAPW). Supported levels of theory include DFTB, LDA, GGA, MP2, RPA, semi-empirical methods (AM1, PM3, PM6, RM1, MNDO, …), and classical force fields (AMBER, CHARMM, …). CP2K can perform molecular dynamics, metadynamics, Monte Carlo, Ehrenfest dynamics, vibrational analysis, core-level spectroscopy, energy minimization, and transition-state optimization using the NEB or dimer method. (Detailed overview of features.)
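As a concrete illustration of how a calculation is specified, CP2K reads a block-structured input file. Below is a minimal sketch of a single-point DFT (Quickstep/GPW) input for a water molecule; the project name, cell size, and coordinates are hypothetical, and the basis/potential file names assume the standard BASIS_MOLOPT and GTH_POTENTIALS data files shipped with CP2K:

&GLOBAL
  PROJECT water            # prefix for output files (hypothetical name)
  RUN_TYPE ENERGY          # single-point energy calculation
  PRINT_LEVEL LOW
&END GLOBAL
&FORCE_EVAL
  METHOD Quickstep         # DFT with the mixed Gaussian and plane waves (GPW) scheme
  &DFT
    BASIS_SET_FILE_NAME BASIS_MOLOPT
    POTENTIAL_FILE_NAME GTH_POTENTIALS
    &XC
      &XC_FUNCTIONAL PBE   # GGA exchange-correlation functional
      &END XC_FUNCTIONAL
    &END XC
  &END DFT
  &SUBSYS
    &CELL
      ABC 10.0 10.0 10.0   # cubic box, edge lengths in Angstrom
    &END CELL
    &COORD
      O   0.000  0.000  0.000
      H   0.757  0.586  0.000
      H  -0.757  0.586  0.000
    &END COORD
    &KIND O
      BASIS_SET DZVP-MOLOPT-SR-GTH
      POTENTIAL GTH-PBE-q6
    &END KIND
    &KIND H
      BASIS_SET DZVP-MOLOPT-SR-GTH
      POTENTIAL GTH-PBE-q1
    &END KIND
  &END SUBSYS
&END FORCE_EVAL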

CP2K is written in Fortran 2008 and can be run efficiently in parallel using a combination of multi-threading, MPI, and CUDA. It is freely available under the GPL license, so it is easy to give the code a try and to make modifications as needed.
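Note that the binary installed below (cp2k.popt) is the MPI-only build, which is why the job script sets OMP_NUM_THREADS=1. As a sketch of hybrid MPI/OpenMP execution (assuming a threaded cp2k.psmp build were available, which is not confirmed for this system), a run might look like:

# Hypothetical hybrid run: 8 MPI ranks x 4 OpenMP threads each (requires a psmp build)
export OMP_NUM_THREADS=4
mpiexec -np 8 cp2k.psmp -i em.inp -o em.out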

Binary Location:

/data/apps/cp2k/cp2k-2022.2.cpu/exe/local/cp2k.popt


Version: cp2k-2023.1, cp2k-2022.2 (the binary path above points to the 2022.2 CPU build)
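To confirm which build a given binary corresponds to, CP2K can print its version string. A quick sanity check (assuming the compiler/MPI modules from the job script below are loaded so the runtime libraries resolve):

/data/apps/cp2k/cp2k-2022.2.cpu/exe/local/cp2k.popt --version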

Job Script:

#!/bin/bash
#SBATCH --partition=medium        # set the appropriate partition, nodes, ntasks, and ntasks-per-node for your run
#SBATCH --ntasks=128
#SBATCH --cpus-per-task=1
#SBATCH --job-name=cp2k_benchmark256_4
#SBATCH --output=cp2k.%J.out
#SBATCH --error=cp2k.%J.err


### Inputs ###

# Modify the executable path below to match the appropriate CP2K version.
export exec_name=/data/apps/cp2k/cp2k-2022.2.cpu/exe/local/cp2k.popt
export INP=/data/home/deepak.agrawal/cp2k_jobs/cpu/em.inp
# Keep the input file in the local directory from which this job is launched.

ulimit -s unlimited               # remove the stack-size limit; CP2K can require deep stacks
module purge
module load compiler/2022.0.2 mkl/2022.0.2 mpi/2021.2.0

# Interconnect tuning for Intel MPI over Mellanox InfiniBand
export I_MPI_FABRICS=shm:ofi      # shared memory within a node, OFI between nodes
export UCX_TLS=sm,ud              # UCX transports: shared memory and unreliable datagram
export UCX_NET_DEVICES=mlx5_0:1   # use port 1 of the mlx5_0 HCA
export FI_PROVIDER=mlx            # UCX-backed OFI provider for Mellanox fabrics
export UCX_UNIFIED_MODE=y
export UCX_USE_MT_MUTEX=y

export OMP_NUM_THREADS=1          # cp2k.popt is MPI-only, so run one thread per rank
#export I_MPI_PIN_ORDER=bunch     # optional: place ranks compactly
#export I_MPI_PIN_DOMAIN=core     # optional: pin each rank to one core


### Application Execution ###

# Launch one MPI rank per allocated task; time the run and capture stdout and stderr
{ time mpiexec.hydra -np $SLURM_NPROCS -genvall ${exec_name} ${INP}; } > cp2k_${SLURM_JOB_ID}.std.out 2>&1
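Assuming the script above is saved as run_cp2k.sh (a hypothetical filename), it is submitted and monitored with the usual SLURM commands:

sbatch run_cp2k.sh                # submit the job; prints the job ID
squeue -u $USER                   # check its state in the queue
tail -f cp2k_<jobid>.std.out      # follow the combined output while it runs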


Authors