Please use this identifier to cite or link to this item: https://hdl.handle.net/11681/44963
Full metadata record
dc.contributor.author: Gallagher, Alex R.
dc.contributor.author: LeGrand, Sandra L.
dc.contributor.author: Hodgdon, Taylor S.
dc.contributor.author: Letcher, Theodore W.
dc.creator: Cold Regions Research and Engineering Laboratory (U.S.)
dc.creator: Geospatial Research Laboratory (U.S.)
dc.date.accessioned: 2022-08-05T16:31:53Z
dc.date.available: 2022-08-05T16:31:53Z
dc.date.issued: 2022-08
dc.identifier.govdoc: ERDC TR-22-11
dc.identifier.uri: https://hdl.handle.net/11681/44963
dc.identifier.uri: http://dx.doi.org/10.21079/11681/44963
dc.description: Technical Report
dc.description.abstract: Dust aerosols can pose a significant detriment to public health, transportation, and tactical operations through reductions in air quality and visibility. Thus, accurate model forecasts of dust emission and transport are essential to decision makers. While a large number of studies have advanced the understanding and predictability of dust storms, the majority of existing literature considers dust production and forcing conditions of the underlying meteorology independently of each other. Our study works towards filling this research gap by inventorying dust-event case studies forced by convective activity in the Desert Southwest United States, simulating select representative case studies using several configurations of the Weather Research and Forecasting (WRF) model, testing the sensitivity of forecasts to essential model parameters, and assessing overall forecast skill using variables essential to dust production and transport. We found our control configuration captured the initiation, evolution, and storm structure of a variety of convective features admirably well. Peak wind speeds were well represented, but we found that simulated events arrived up to 2 hours earlier or later than observed. Our results show that convective storms are highly sensitive to initialization time and initial conditions that can preemptively dry the atmosphere and suppress the growth of convective storms.
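The abstract notes that forecast skill was judged in part on peak wind speeds and on event timing, with simulated events arriving up to 2 hours earlier or later than observed. The short Python sketch below illustrates one way such a peak-wind timing and magnitude comparison could be computed. It is a minimal, assumed illustration only: the function name, station values, and units are placeholders and are not taken from the report.

```python
# Minimal sketch of a peak-wind timing/magnitude check.
# All data values and names here are hypothetical placeholders.
from datetime import datetime, timedelta

def peak_wind_error(obs_times, obs_speeds, sim_times, sim_speeds):
    """Return (timing error in hours, speed error) at the peak wind."""
    obs_peak = max(range(len(obs_speeds)), key=lambda i: obs_speeds[i])
    sim_peak = max(range(len(sim_speeds)), key=lambda i: sim_speeds[i])
    dt_hours = (sim_times[sim_peak] - obs_times[obs_peak]).total_seconds() / 3600.0
    dv = sim_speeds[sim_peak] - obs_speeds[obs_peak]
    return dt_hours, dv

# Hypothetical example: a simulated gust front that peaks 2 hours late.
t0 = datetime(2014, 7, 4, 18, 0)
obs_times = [t0 + timedelta(hours=h) for h in range(6)]
obs_speeds = [4.0, 6.0, 15.0, 9.0, 7.0, 5.0]   # observed 10 m wind speed, m/s
sim_times = obs_times
sim_speeds = [3.5, 5.0, 8.0, 10.0, 16.0, 6.0]  # simulated 10 m wind speed, m/s

timing_err, speed_err = peak_wind_error(obs_times, obs_speeds, sim_times, sim_speeds)
print(f"peak wind timing error: {timing_err:+.1f} h, speed error: {speed_err:+.1f} m/s")
```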
dc.description.sponsorship: United States. Army. Corps of Engineers.
dc.description.tableofcontents:
Abstract (p. ii)
Figures and Tables (p. v)
Preface (p. x)
1 Introduction (p. 1)
1.1 Background (p. 1)
1.2 Objectives (p. 2)
1.3 Approach (p. 2)
2 Methodology (p. 3)
2.1 Supporting datasets (p. 4)
2.1.1 Storm Prediction Center Hourly Mesoscale Analysis (SPCHMA) (p. 4)
2.1.2 Next Generation Weather Radar (NEXRAD) (p. 5)
2.1.3 Automated Surface Observing Stations (ASOS) (p. 6)
2.1.4 Remote Automatic Weather Stations (RAWS) (p. 7)
2.2 Model configuration (p. 8)
2.2.1 Model resolution (p. 10)
2.2.2 Initialization and lateral boundary datasets (model forcing data) (p. 11)
2.2.3 Planetary-boundary-layer schemes (p. 12)
2.2.4 Surface-layer schemes (p. 14)
2.2.5 Land surface models (p. 14)
2.2.6 Cumulus schemes (p. 15)
2.2.7 Cloud microphysics schemes (p. 16)
2.2.8 Initialization time (model spin-up) (p. 17)
2.3 Model assessment (p. 19)
2.4 Case studies (p. 21)
3 Results (p. 24)
3.1 4 July 2014 case study (p. 25)
3.1.1 4 July 2014 event overview (p. 25)
3.1.2 4 July 2014 simulation results (p. 28)
3.1.3 Model sensitivity analysis (p. 36)
3.2 21 August 2014 case study (p. 45)
3.2.1 21 August 2014 event overview (p. 45)
3.2.2 21 August 2014 simulation results (p. 48)
3.3 28 June 2015 case study (p. 58)
3.3.1 28 June 2015 event overview (p. 58)
3.3.2 28 June 2015 simulation results (p. 61)
3.4 1–2 August 2017 case study (p. 69)
3.4.1 1–2 August 2017 event overview (p. 69)
3.4.2 1–2 August 2017 simulation results (p. 75)
3.5 30 July 2018 case study (p. 85)
3.5.1 30 July 2018 event overview (p. 85)
3.5.2 30 July 2018 simulation results (p. 88)
4 Discussion (p. 98)
5 Conclusions and Recommendations (p. 102)
5.1 General conclusions (p. 102)
5.2 Recommendations (p. 103)
References (p. 105)
Appendix A: SPC Storm Report Database Search Algorithm (p. 109)
Appendix B: Full Inventory of Case Studies (p. 116)
Acronyms and Abbreviations (p. 118)
Report Documentation Page (p. 120)
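Appendix A of the report documents a search algorithm over the Storm Prediction Center (SPC) storm report database used to inventory candidate convective dust events; the algorithm itself is not reproduced in this record. The sketch below shows, purely as an assumed illustration, what a keyword-and-state filter over an exported storm-report CSV might look like. The column names, state list, keywords, and file path are hypothetical placeholders, not the report's actual method.

```python
# Assumed illustration of filtering storm reports for Desert Southwest dust events.
# Field names ("State", "Comments", "Date") and the CSV path are placeholders.
import csv

SOUTHWEST_STATES = {"AZ", "NM", "NV", "UT", "CA", "TX"}    # assumed study states
DUST_KEYWORDS = ("dust", "haboob", "blowing dust")          # assumed search terms

def find_candidate_events(csv_path):
    """Yield report rows whose comments mention dust in Desert Southwest states."""
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):
            state = row.get("State", "").strip().upper()
            comments = row.get("Comments", "").lower()
            if state in SOUTHWEST_STATES and any(k in comments for k in DUST_KEYWORDS):
                yield row

# Example usage with a hypothetical exported wind-report file:
# for event in find_candidate_events("spc_wind_reports_2014.csv"):
#     print(event["Date"], event["State"], event["Comments"])
```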
dc.format.extent: 132 pages / 26.27 MB
dc.format.medium: PDF
dc.language.iso: en_US
dc.publisher: Engineer Research and Development Center (U.S.)
dc.relation.ispartofseries: Technical Report (Engineer Research and Development Center (U.S.)) ; no. ERDC TR-22-11
dc.rights: Approved for Public Release; Distribution is Unlimited
dc.source: This Digital Resource was created in Microsoft Word and Adobe Acrobat
dc.subject: Air quality
dc.subject: Climate
dc.subject: Computer simulation
dc.subject: Dust
dc.subject: Dust storms
dc.subject: Geospatial data
dc.subject: Southwestern States
dc.subject: Weather
dc.title: Simulating environmental conditions for Southwest United States convective dust storms using the Weather Research and Forecasting Model v4.1
dc.type: Report
Appears in Collections: Technical Report

Files in This Item:
File: ERDC TR-22-11.pdf (26.27 MB, Adobe PDF)