First, a disclaimer: this is not a post fired off the moment I hit an error, with no thought put in. I consulted the documentation and searched both Baidu and Google for the error messages; of course, it is possible I was not careful enough and missed a solution that is already out there. So I would ask anyone who has run into this, or who understands it, to take a look; even a hint would be appreciated. Thanks!
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
UPDATE: I later switched to the latest version of the parallel library installed on the cluster, recompiled once, and the problem went away.
Thinking back on it carefully: while compiling, never casually change environment variables, compilers, or library files, especially when building under two different paths; otherwise you can run into problems you would never anticipate. (A sketch of a clean rebuild with a pinned toolchain follows this block.)
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
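For anyone who lands here with the same problem, this is roughly what a clean rebuild with one fixed toolchain looks like. A minimal sketch only: the module names are hypothetical and depend on what your cluster actually provides.

module purge
module load intel/11.1 openmpi-intel11    # hypothetical names; pin one compiler + one MPI
cd WRFV3
./clean -a                                # remove every object built under the old environment
./configure                               # re-select the dmpar INTEL (ifort/icc) option
./compile em_real >& compile.log          # rebuild wrf.exe and real.exe from scratch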
Compiler: Intel 11; parallel library: OpenMPI.
I am running a real-data case with WRF 3.3.1 on the cluster. A single-domain run works normally, but a nested run fails with the errors below.
What I have tried so far:
1) Changing the integration time step, both increasing and decreasing it; the run still fails.
2) Switching the parallel library; for one brief moment it worked, but after that it failed again.
About that brief moment: WRF had been recompiled but WPS had not, all the input files dated from before the recompile, and it was still a single-domain run.
3) Changing the number of compute nodes and cores; the nested run still fails with the same errors.
Could everyone please take a look? If more information is needed, I can provide it. Thanks! (A couple of generic first-pass checks are sketched just below.)
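Before rerunning, the usual environment-side causes of signal 11 are worth ruling out. This is only a sketch of generic WRF debugging practice, not something the logs below confirm; the KMP_STACKSIZE value is a guess and only matters for Intel builds with OpenMP enabled:

ulimit -s unlimited          # small stack limits are a classic cause of segfaults in WRF
export KMP_STACKSIZE=512m    # per-thread stack for Intel-built executables (value is a guess)
mpirun -np 8 ./wrf.exe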
-------------------------------------------------- Error messages --------------------------------------------------
1) The first error
starting wrf task 1 of 8
starting wrf task 7 of 8
starting wrf task 3 of 8
starting wrf task 6 of 8
starting wrf task 0 of 8
starting wrf task 5 of 8
starting wrf task 4 of 8
starting wrf task 2 of 8
starting wrf task 1 of 8
starting wrf task 3 of 8
starting wrf task 7 of 8
starting wrf task 2 of 8
starting wrf task 4 of 8
starting wrf task 5 of 8
starting wrf task 6 of 8
starting wrf task 0 of 8
--------------------------------------------------------------------------
mpirun noticed that process rank 1 with PID 24475 on node c18n12 exited on signal 11 (Segmentation fault).
--------------------------------------------------------------------------
2) The second error
starting wrf task 6 of 8
starting wrf task 0 of 8
starting wrf task 3 of 8
starting wrf task 5 of 8
starting wrf task 7 of 8
starting wrf task 2 of 8
starting wrf task 4 of 8
starting wrf task 1 of 8
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_WORLD
with errorcode 1.
NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
--------------------------------------------------------------------------
--------------------------------------------------------------------------
mpirun has exited due to process rank 0 with PID 10283 on
node c01n03 exiting improperly. There are two reasons this could occur:
1. this process did not call "init" before exiting, but others in
the job did. This can cause a job to hang indefinitely while it waits
for all processes to call "init". By rule, if one process calls "init",
then ALL processes must call "init" prior to termination.
2. this process called "init", but exited without calling "finalize".
By rule, all processes that call "init" MUST call "finalize" prior to
exiting or it will be considered an "abnormal termination"
This may have caused other processes in the application to be
terminated by signals sent by mpirun (as reported here).
-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
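A note on reading the two failures above: mpirun's summary rarely contains the real reason. For the signal 11 case, the per-rank rsl files often show CFL violations first; and for the MPI_ABORT case, errorcode 1 is, as far as I know, what WRF's wrf_error_fatal raises after writing the actual message to rank 0's log. So the useful commands are along these lines:

tail -n 40 rsl.error.0000 rsl.out.0000   # the last lines of rank 0's logs carry the real error text
grep -i cfl rsl.error.*                  # CFL violations often precede nested-run segfaults
grep -i fatal rsl.error.*                # wrf_error_fatal messages also land here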
-------------------------------------------------- namelist file --------------------------------------------------
&time_control
run_days = 1,
run_hours = 0,
run_minutes = 0,
run_seconds = 0,
start_year = 2015, 2015, 2012, 2012, 2012,
start_month = 04, 04, 05, 05, 05,
start_day = 28, 28, 29, 29, 29,
start_hour = 00, 00, 12, 18, 12,
start_minute = 00, 00, 00, 00, 00,
start_second = 00, 00, 00, 00, 00,
end_year = 2015, 2015, 2012, 2012, 2012,
end_month = 04, 04, 05, 05, 05,
end_day = 29, 29, 30, 30, 31,
end_hour = 00, 00, 06, 06, 12,
end_minute = 00, 00, 00, 00, 00,
end_second = 00, 00, 00, 00, 00,
interval_seconds = 21600
input_from_file = .true.,.false.,.false.,
history_interval = 20, 30, 40, 20, 60,
frames_per_outfile = 12, 8, 10, 30, 1,
restart = .false.,
restart_interval = 500000,
io_form_history = 2
io_form_restart = 2
io_form_input = 2
io_form_boundary = 2
debug_level = 0
/
&domains
time_step = 10,
time_step_fract_num = 0,
time_step_fract_den = 1,
max_dom = 2,
s_we = 1, 1, 1, 1, 1,
e_we = 100, 202, 229, 256, 241,
s_sn = 1, 1, 1, 1, 1,
e_sn = 80, 160, 313, 343, 286,
s_vert = 1, 1, 1, 1, 1,
e_vert = 48, 48, 48, 48, 48,
sfcp_to_sfcp = .false.
dx = 3000, 1000, 4500, 1500, 500,
dy = 3000, 1000, 4500, 1500, 500,
grid_id = 1, 2, 3, 4, 5,
parent_id = 0, 1, 2, 3, 4,
i_parent_start = 0, 15, 57, 66, 88,
j_parent_start = 0, 15, 50, 150, 231,
parent_grid_ratio = 1, 3, 3, 3, 3,
parent_time_step_ratio = 1, 3, 3, 3, 3,
feedback = 0,
smooth_option = 0,
num_metgrid_levels = 27,
num_metgrid_soil_levels = 4,
p_top_requested = 5000,
eta_levels = 1.000,0.997,0.994,0.991,0.988,0.985,0.980,0.975,0.970,0.960,0.950,
0.940,0.930,0.920,0.910,0.895,0.880,
0.865,0.850,0.825,0.800,0.775,0.750,
0.720,0.690,0.660,0.630,0.600,0.570,
0.540,0.510,0.475,0.440,0.405,0.370,
0.330,0.290,0.250,0.210,0.175,0.145,
0.115,0.090,0.065,0.045,0.025,0.010,
0.000,
/
&physics
mp_physics = 6, 6, 6, 6, 6,
gsfcgce_hail = 0,
gsfcgce_2ice = 0,
ra_lw_physics = 1, 1, 5, 1, 1,
ra_sw_physics = 1, 1, 5, 1, 1,
radt = 10, 10, 0, 30, 30,
sf_sfclay_physics = 2, 2, 2, 1, 1,
sf_surface_physics = 2, 2, 0, 2, 2,
bl_pbl_physics = 2, 2, 0, 1, 1,
bldt = 0, 0, 0, 0, 0,
cu_physics = 0, 0, 0, 0, 0,
cudt = 10, 5, 5, 5, 5,
surface_input_source = 1,
num_soil_layers = 4,
maxiens = 1,
maxens = 3,
maxens2 = 3,
maxens3 = 16,
ensdim = 144,
/
&fdda
/
&dynamics
w_damping = 1,
diff_opt = 1,
km_opt = 4,
diff_6th_opt = 1,
diff_6th_factor = 0.12,
damp_opt = 0,
base_temp = 290.
zdamp = 5000., 5000., 5000., 5000., 5000.,
dampcoef = 0.2, 0.2, 0.2, 0.2, 0.2,
khdif = 0, 500, 500, 0, 0,
kvdif = 0, 500, 500, 0, 0,
non_hydrostatic = .true., .true., .true., .true., .true.,
time_step_sound = 6, 6, 6, 4, 4,
h_mom_adv_order = 5, 5, 5, 5, 5,
v_mom_adv_order = 5, 5, 5, 5, 3,
h_sca_adv_order = 5, 5, 5, 5, 5,
v_sca_adv_order = 5, 5, 5, 3, 3,
moist_adv_opt =4,
scalar_adv_opt =3,
/
&bdy_control
spec_bdy_width = 5,
spec_zone = 1,
relax_zone = 4,
specified = .true., .false.,
nested = .false., .true.,
/
&grib2
/
&namelist_quilt
nio_tasks_per_group = 0,
nio_groups = 1,
/
--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
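One arithmetic sanity check on the namelist above, since inconsistent nest geometry is a common cause of nested-run crashes: for each nest, (e_we - 1) and (e_sn - 1) must be integer multiples of parent_grid_ratio. For d02 that holds, (202 - 1) / 3 = 67 and (160 - 1) / 3 = 53, both exact, and dx/dy (3000 m down to 1000 m) matches the 3:1 ratio. The time step also looks safe, since time_step = 10 s is well under the usual 6 x dx(km) = 18 s rule of thumb for dx = 3000 m. A quick check in the shell:

echo $(( (202 - 1) % 3 )) $(( (160 - 1) % 3 ))   # both remainders must be 0 for a valid 3:1 nest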