...
Getting Started
1. Clone the WRF Git repository
For the 4.0 branch, use:
```
$ git clone https://github.com/wrf-model/WRF.git
...
```
For the 3.9.1.1 branch, use:
```
$ git clone --branch V3.9.1.1 https://github.com/NCAR/WRFV3.git
...
```
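If you need a specific point release rather than a branch head, you can check out its tag after cloning. The tag name below is only an illustrative example; list the repository's tags to see what actually exists:
```
$ cd WRF
$ git tag --list          # show available release tags
$ git checkout v4.0.3     # hypothetical example tag; pick one from the list
```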
2. Load prerequisite modules
...
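Module names and versions are site-specific; a hypothetical set matching the toolchain visible in the configure output below (Intel 2019.1.144, HPC-X 2.4.0, NetCDF 4.6.2) might look like:
```
$ module load intel/2019.1.144
$ module load hpcx/2.4.0
$ module load netcdf/4.6.2
```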
3. Set environment variables
```
$ export WRFIO_NCD_LARGE_FILE_SUPPORT=1
$ export NETCDF=$NETCDF_DIR
```
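Optionally, verify that $NETCDF points at a usable installation before configuring (the check below assumes a standard NetCDF install layout):
```
$ echo $NETCDF
$ ls $NETCDF/include/netcdf.inc $NETCDF/lib
```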
4. Configure
```
./configure
```
There are four ways to configure WRF for any given platform (one per line, see below):
serial - to be run on a single processor
smpar - to be run with OpenMP on a single node
dmpar - to be run with MPI across one or more nodes
dm+sm - to be run hybrid with both OpenMP and MPI
...
Note: you need to choose the desired build from the menu (e.g. 66 for a dmpar build on Intel Broadwell) and then choose the domain nesting option (e.g. 1).
The nesting options are as follows:
basic: any nested domains are fixed (default)
preset moves: nested domains may move according to fixed parameters
vortex-following: for hurricane forecasting
```
$ ./configure
checking for perl5... no
checking for perl... found /usr/bin/perl (perl)
Will use NETCDF in dir: /global/software/centos-7/modules/intel/2019.1.144/netcdf/4.6.2-hpcx-2.4.0
HDF5 not set in environment. Will configure WRF for use without.
PHDF5 not set in environment. Will configure WRF for use without.
Will use 'time' to report timing information
$JASPERLIB or $JASPERINC not found in environment, configuring to build without grib2 I/O...
------------------------------------------------------------------------
Please select from among the following Linux x86_64 options:
1. (serial) 2. (smpar) 3. (dmpar) 4. (dm+sm) PGI (pgf90/gcc)
5. (serial) 6. (smpar) 7. (dmpar) 8. (dm+sm) PGI (pgf90/pgcc): SGI MPT
9. (serial) 10. (smpar) 11. (dmpar) 12. (dm+sm) PGI (pgf90/gcc): PGI accelerator
13. (serial) 14. (smpar) 15. (dmpar) 16. (dm+sm) INTEL (ifort/icc)
17. (dm+sm) INTEL (ifort/icc): Xeon Phi (MIC architecture)
18. (serial) 19. (smpar) 20. (dmpar) 21. (dm+sm) INTEL (ifort/icc): Xeon (SNB with AVX mods)
22. (serial) 23. (smpar) 24. (dmpar) 25. (dm+sm) INTEL (ifort/icc): SGI MPT
26. (serial) 27. (smpar) 28. (dmpar) 29. (dm+sm) INTEL (ifort/icc): IBM POE
30. (serial) 31. (dmpar) PATHSCALE (pathf90/pathcc)
32. (serial) 33. (smpar) 34. (dmpar) 35. (dm+sm) GNU (gfortran/gcc)
36. (serial) 37. (smpar) 38. (dmpar) 39. (dm+sm) IBM (xlf90_r/cc_r)
40. (serial) 41. (smpar) 42. (dmpar) 43. (dm+sm) PGI (ftn/gcc): Cray XC CLE
44. (serial) 45. (smpar) 46. (dmpar) 47. (dm+sm) CRAY CCE (ftn $(NOOMP)/cc): Cray XE and XC
48. (serial) 49. (smpar) 50. (dmpar) 51. (dm+sm) INTEL (ftn/icc): Cray XC
52. (serial) 53. (smpar) 54. (dmpar) 55. (dm+sm) PGI (pgf90/pgcc)
56. (serial) 57. (smpar) 58. (dmpar) 59. (dm+sm) PGI (pgf90/gcc): -f90=pgf90
60. (serial) 61. (smpar) 62. (dmpar) 63. (dm+sm) PGI (pgf90/pgcc): -f90=pgf90
64. (serial) 65. (smpar) 66. (dmpar) 67. (dm+sm) INTEL (ifort/icc): HSW/BDW
68. (serial) 69. (smpar) 70. (dmpar) 71. (dm+sm) INTEL (ifort/icc): KNL MIC
72. (serial) 73. (smpar) 74. (dmpar) 75. (dm+sm) FUJITSU (frtpx/fccpx): FX10/FX100 SPARC64 IXfx/Xlfx
Enter selection [1-75] : 66
------------------------------------------------------------------------
Compile for nesting? (1=basic, 2=preset moves, 3=vortex following) [default 1]: 1
...
```
After configuring, you may edit the generated configure.wrf file to adjust the build as needed.
For example:
-xHost will not work on AMD processors
AVX2 may need to change to AVX512 for Intel Skylake hosts
Remove the “-f90=$(SFC)” option from mpif90 and the “-cc=$(SCC)” option from mpicc for HPC-X (Open MPI), as it doesn’t need those options
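For orientation, the relevant configure.wrf lines for the Intel HSW/BDW dmpar build look roughly like the excerpt below (illustrative, not verbatim; exact flags vary by WRF version and site):
```
# configure.wrf (illustrative excerpt)
SFC     = ifort
SCC     = icc
DM_FC   = mpif90 -f90=$(SFC)   # drop -f90=$(SFC) for HPC-X (Open MPI)
DM_CC   = mpicc -cc=$(SCC)     # drop -cc=$(SCC) for HPC-X (Open MPI)
FCOPTIM = -O3 -xCORE-AVX2      # change to -xCORE-AVX512 on Skylake; -xHost breaks on AMD
```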
5. Compile
```
./compile wrf
```
A successful build ends with output like this:
```
==========================================================================
build started: Mon Aug 12 16:26:30 PDT 2019
build completed: Mon Aug 12 17:04:50 PDT 2019
---> Executables successfully built <---
-rwxrwxr-x 1 ophirm ophirm 53073520 Aug 12 17:04 main/ndown.exe
-rwxrwxr-x 1 ophirm ophirm 53039128 Aug 12 17:04 main/real.exe
-rwxrwxr-x 1 ophirm ophirm 52062736 Aug 12 17:04 main/tc.exe
-rwxrwxr-x 1 ophirm ophirm 60282704 Aug 12 17:03 main/wrf.exe
...
```
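As the log shows, a full build can take well over half an hour. Redirecting compile output to a file makes failures easier to find, and recent WRF versions accept a -j flag for parallel builds (an assumption about your version; run ./compile with no arguments to see its usage):
```
$ ./compile -j 4 wrf >& compile.log
```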
6. Check the run directory
For input, make sure you have the following files (see the check below):
wrfinput_d0* (at least one), or wrfrst* for a restart run
wrfbdy_d0* (at least one)
namelist.input
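A quick way to confirm the inputs are in place (using the typical default file names):
```
$ cd run
$ ls -l namelist.input wrfinput_d0* wrfbdy_d0*
```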
7. Check the progress by running tail -f on rsl.out.0000
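wrf.exe must already be running under MPI; each rank writes its own rsl.out.NNNN / rsl.error.NNNN file. The launch command is site-specific; a hypothetical example:
```
$ mpirun -np 32 ./wrf.exe &   # illustrative launcher and rank count; use your site's (srun, mpirun, ...)
```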
In this example, we can see that every 15 seconds of forecast time takes about 1.7 seconds of wall-clock time.
```
$ tail -f rsl.out.0000
WRF V3.9.1.1 MODEL
*************************************
Parent domain
ids,ide,jds,jde 1 1501 1 1201
ims,ime,jms,jme -4 195 -4 82
ips,ipe,jps,jpe 1 188 1 75
*************************************
DYNAMICS OPTION: Eulerian Mass Coordinate
alloc_space_field: domain 1 , 299838928 bytes allocated
RESTART run: opening wrfrst_d01_2005-06-04_06_00_00 for reading
Timing for processing restart file for domain 1: 54.97731 elapsed seconds
Max map factor in domain 1 = 1.02. Scale the dt in the model accordingly.
INITIALIZE THREE Noah LSM RELATED TABLES
LANDUSE TYPE = USGS FOUND 27 CATEGORIES
INPUT SOIL TEXTURE CLASSIFICATION = STAS
SOIL TEXTURE CLASSIFICATION = STAS FOUND 19 CATEGORIES
d01 2005-06-04_06:00:00 WRF restart, LBC starts at 2005-06-04_00:00:00 and restart starts at 2005-06-04_06:00:00
LBC for restart: Starting valid date = 2005-06-04_00:00:00, Ending valid date = 2005-06-04_03:00:00
LBC for restart: Restart time = 2005-06-04_06:00:00
LBC for restart: Looking for a bounding time
LBC for restart: Starting valid date = 2005-06-04_03:00:00, Ending valid date = 2005-06-04_06:00:00
LBC for restart: Starting valid date = 2005-06-04_06:00:00, Ending valid date = 2005-06-04_09:00:00
LBC for restart: Found the correct bounding LBC time periods for restart time = 2005-06-04_06:00:00
Timing for processing lateral boundary for domain 1: 0.29368 elapsed seconds
Tile Strategy is not specified. Assuming 1D-Y
WRF TILE 1 IS 1 IE 188 JS 1 JE 75
WRF NUMBER OF TILES = 1
Timing for main: time 2005-06-04_06:00:15 on domain 1: 3.07004 elapsed seconds
Timing for main: time 2005-06-04_06:00:30 on domain 1: 1.74579 elapsed seconds
Timing for main: time 2005-06-04_06:00:45 on domain 1: 1.72297 elapsed seconds
Timing for main: time 2005-06-04_06:01:00 on domain 1: 1.73288 elapsed seconds
Timing for main: time 2005-06-04_06:01:15 on domain 1: 1.73057 elapsed seconds
Timing for main: time 2005-06-04_06:01:30 on domain 1: 1.72477 elapsed seconds
Timing for main: time 2005-06-04_06:01:45 on domain 1: 1.73379 elapsed seconds
Timing for main: time 2005-06-04_06:02:00 on domain 1: 1.73048 elapsed seconds
Timing for main: time 2005-06-04_06:02:15 on domain 1: 1.73523 elapsed seconds
Timing for main: time 2005-06-04_06:02:30 on domain 1: 1.72220 elapsed seconds
Timing for main: time 2005-06-04_06:02:45 on domain 1: 1.72852 elapsed seconds
Timing for main: time 2005-06-04_06:03:00 on domain 1: 1.73229 elapsed seconds
Timing for main: time 2005-06-04_06:03:15 on domain 1: 1.72407 elapsed seconds
Timing for main: time 2005-06-04_06:03:30 on domain 1: 1.72634 elapsed seconds
Timing for main: time 2005-06-04_06:03:45 on domain 1: 1.74795 elapsed seconds
```
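To summarize performance, you can average the per-step timings that WRF prints to rsl.out.0000 (a small awk sketch matching the log format above):
```
$ grep 'Timing for main' rsl.out.0000 | \
  awk '{sum += $(NF-2); n++} END {printf "average %.3f s/step over %d steps\n", sum/n, n}'
```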