COMSOL is a software platform that enables the interactive modeling of
a variety of physics-based problems, and their solution using advanced
numerical methods.

COMSOL can handle problems with coupling between different systems, and
various multiphysics cases. Specialized interfaces exist for electrical,
mechanical, fluid flow, and chemical applications.

COMSOL users may expect an interactive GUI. To get one:

  • on NewRiver, log in using the “-X” switch to enable X11 forwarding:
    “ssh -X”;
  • alternatively, get interactive COMSOL access through ETX.

Note that the interactive session should be used to set up a problem;
the problem definition should then be saved in an input file and
submitted for computation as a batch job, as in the example below.

COMSOL includes a variety of modules that deal with specialized
problems. The COMSOL modules installed on the ARC systems include:

  • AC/DC;
  • Acoustics;
  • CFD;
  • Chemical Reaction Engineering;
  • Corrosion;
  • Heat_Transfer;
  • LiveLink_for_MATLAB;
  • Microfluidics;
  • Multibody_Dynamics;
  • Nonlinear_Structural_Materials;
  • Optimization;
  • Structural_Mechanics;
  • Wave_Optics.

COMSOL is licensed software. ARC currently holds a “5 seat” license
for COMSOL, which means that up to 5 users at a time may access COMSOL
on ARC systems. Other users have made separate licensing arrangements
with the company and notified the Virginia Tech Information Technology
department, which can add them to the list of authorized users. Such
users have guaranteed access to COMSOL and are not affected by the
5-seat limit.

COMSOL can run in parallel if the user sets the appropriate values in
the PBS job header and in the command-line invocation of COMSOL.
In the header for the job script,

    #PBS -l nodes=???:ppn=???

the user specifies the number of nodes, and processors per node.
In the user portion of the job script, when the user invokes COMSOL,
the user must specify appropriate values for

  • -nn ??? (total number of processors)
  • -nnhost ??? (number of nodes)
  • -np ??? (number of cores per processor)
  • -f ??? (the hostfile)

The example below shows how these four values can be determined
automatically, so that the user only has to set the PBS
job header.
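As a sketch of that automation, the values can be derived from the $PBS_NODEFILE that PBS writes for each job, which contains one line per requested processor slot. The snippet below simulates such a file with hypothetical hostnames (nr001, nr002) for a nodes=2:ppn=4 request; in a real job, PBS supplies the file itself:

```shell
#! /bin/bash
#  Simulate the $PBS_NODEFILE for a nodes=2:ppn=4 request.
#  (Hypothetical hostnames; PBS generates the real file at run time.)
PBS_NODEFILE=$(mktemp)
printf 'nr001\nnr001\nnr001\nnr001\nnr002\nnr002\nnr002\nnr002\n' > "$PBS_NODEFILE"

PROC_NUM=`cat $PBS_NODEFILE | wc -l`    # total processor slots (lines in the file)
NODE_NUM=`uniq $PBS_NODEFILE | wc -l`   # distinct nodes (unique hostnames)
CORE_NUM=1                              # one core per process, as no OpenMP is used

echo "$PROC_NUM $NODE_NUM $CORE_NUM"

rm "$PBS_NODEFILE"
```

For this simulated request the script prints "8 2 1", matching the 2-node, 4-processors-per-node example used below.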

Web Site:
On any ARC cluster, check the installation details
by typing “module spider comsol”.

COMSOL requires that the appropriate modules be loaded before it can be run.
One version of the commands for use on NewRiver is:

module purge
module load comsol/


The following batch file runs COMSOL with the input file BeamModel.mph,
creating the solution file BeamSolution.mph. It uses 2 nodes and
4 processors per node, for a total of 8 processors. No matter what the
“#PBS -l nodes=???:ppn=???” line reads, the script determines the
values that must be passed to COMSOL so that the job runs correctly
in parallel.

#! /bin/bash
#PBS -l walltime=00:05:00
#PBS -l nodes=2:ppn=4
#PBS -W group_list=newriver
#PBS -q open_q
#PBS -j oe
module purge
module load comsol/
#  Based on the above "#PBS -l nodes=???:ppn=???" statement, these three 
#  statements determine:
#    PROC_NUM, the total number of processors
#    NODE_NUM, the number of nodes (servers)
#    CORE_NUM, the number of cores per processor (1, as we are not doing OpenMP)
PROC_NUM=`cat $PBS_NODEFILE | wc -l`
NODE_NUM=`uniq $PBS_NODEFILE | wc -l`
CORE_NUM=1
#  This one statement is broken up by backslashes for readability.
comsol batch                      \
  -nn $PROC_NUM                   \
  -nnhost $NODE_NUM               \
  -np $CORE_NUM                   \
  -f $PBS_NODEFILE                \
  -inputfile BeamModel.mph        \
  -outputfile BeamSolution.mph    \
  -batchlog BeamModel.log > comsol_newriver.txt
if [ $? -ne 0 ]; then
  echo "COMSOL_NEWRIVER: Run error!"
  exit 1
fi
echo "COMSOL_NEWRIVER: Normal end of execution."
exit 0
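The error check at the end of the script relies on the shell variable $?, which holds the exit status of the most recent command (0 on success, nonzero on failure). A minimal standalone sketch of the pattern, using the built-in command "false" to stand in for a failing COMSOL run:

```shell
#! /bin/bash
#  "false" always returns a nonzero exit status, simulating a failed run.
false
if [ $? -ne 0 ]; then
  echo "Run error!"
else
  echo "Normal end of execution."
fi
```

This prints "Run error!"; replacing "false" with "true" would print the success message instead.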

A complete set of files to carry out a similar process is available in