Posted on 2017-10-23 16:46:32
Instructions:
wget:
1. Save the list of URLs to your local workstation as myfile.dat
2. Create a ~/.netrc file pointing to urs.earthdata.nasa.gov and an empty ~/.urs_cookies file
3. On your command line, using wget 1.18 (or higher):
wget --load-cookies ~/.urs_cookies --save-cookies ~/.urs_cookies --auth-no-challenge=on --keep-session-cookies -i myfile.dat
If you get a "Filename too long" error, add the --content-disposition option to your wget call, or use the -O option for a single-file download.
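Steps 2 and 3 above can be sketched as a shell snippet. YOUR_USERNAME and YOUR_PASSWORD are placeholders for your Earthdata login credentials, and the restrictive permissions are required because wget and curl refuse a world-readable netrc:

```shell
# Create the empty cookie file that wget will read and update.
touch ~/.urs_cookies

# Create ~/.netrc pointing at the Earthdata login host.
# YOUR_USERNAME / YOUR_PASSWORD are placeholders for your own credentials.
cat > ~/.netrc <<'EOF'
machine urs.earthdata.nasa.gov
login YOUR_USERNAME
password YOUR_PASSWORD
EOF

# Restrict permissions so wget/curl accept the file.
chmod 600 ~/.netrc
```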
A UNIX curl example:
1. Save the list of URLs to your local workstation as myfile.dat
2. Rearrange the URLs and provide output filenames in the curl config file format shown below:
url = http://host.example.gov/dir/filename.hdf
output = filename.hdf
url = http://host.example.gov/dir/filename2.hdf
output = filename2.hdf
...
3. Create a ~/.netrc file pointing to urs.earthdata.nasa.gov and an empty ~/.urs_cookies file
4. On your command line:
curl -b ~/.urs_cookies -c ~/.urs_cookies -K ./myfile.dat
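Step 2's rearrangement can be done mechanically. A minimal sketch with awk, assuming your plain URL list is saved as urls.txt (a hypothetical name; the two example URLs below are stand-ins for your own list) and the output filename is just the last path component of each URL:

```shell
# Build a sample plain URL list (placeholder URLs for illustration).
printf '%s\n' \
  'http://host.example.gov/dir/filename.hdf' \
  'http://host.example.gov/dir/filename2.hdf' > urls.txt

# Emit a "url = ..." / "output = ..." pair per line, taking the output
# name from the final path component, in the curl config format above.
awk '{ n = split($0, p, "/"); print "url = " $0; print "output = " p[n] }' urls.txt > myfile.dat

cat myfile.dat
```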
See here for more details:
https://disc.gsfc.nasa.gov/infor ... rvice%20with%20wget
I haven't set up netrc on Windows myself; if you are using Windows, you can search online for how to configure netrc there.