Organization | Arctic Region Supercomputing Center |
---|---|
Established | 1993 |
Affiliation | University of Alaska Fairbanks |
Director | Dr. Gregory Newby |
Administrative staff | 30 |
Location | Fairbanks, Alaska, United States |
Website | arsc.edu |
The Arctic Region Supercomputing Center (ARSC) was a research facility operated by the University of Alaska Fairbanks (UAF) from 1993 to 2015. Located on the UAF campus, ARSC offered high-performance computing (HPC) and mass storage to the UAF and State of Alaska research communities.
In general, research supported by ARSC resources focused on the Earth's Arctic region. Common projects included Arctic weather modeling, Alaskan summer smoke forecasting, Arctic sea ice analysis and tracking, Arctic Ocean systems, volcanic ash plume prediction, and tsunami forecasting and modeling.
From 1993 through 2011, ARSC participated in the Department of Defense (DoD) High Performance Computing Modernization Program (HPCMP), first as a Distributed Center (DC), then as an Allocated Distributed Center (ADC), and finally as one of six DoD Supercomputing Resource Centers (DSRCs).
History
ARSC hosted a variety of HPC systems, many of which were ranked among the Top 500 most powerful computers in the world. For more than 10 years ARSC maintained at least one system on the Top 500 list.[1] Funding for ARSC operations was supplied primarily by the DoD HPCMP, augmented by UAF and by external grants and contracts from sources such as the National Science Foundation. In December 2010, the Fairbanks Daily News-Miner reported probable layoffs for most of ARSC's 46 employees with the expected loss of its Department of Defense contract in 2011.[2] The article reported that 95 percent of ARSC funding came from the Department of Defense. Once that DoD funding was lost, ARSC could no longer afford systems powerful enough to appear on the Top 500 list.
The following timeline lists HPC systems acquired by ARSC, with their Top 500 standings where applicable:
- 1993 - Cray Y-MP named Denali with 4 CPUs and 1.3 GFLOPS; StorageTek 1.1 TB silo. With this system ARSC placed #251 on the first Top 500 supercomputer list in June 1993,[3] published on the first day of the 8th Mannheim Supercomputer Seminar. This Cray Y-MP M98/41024 system[4] remained on the next two lists, dropping to #302 and then #405 before falling off the list.
- 1994 - Cray T3D named Yukon with 128 CPUs and 19.2 GFLOPS. ARSC had two computers on the June 1994 Top 500 list,[5] with the new Cray T3D MC128-2 at #58 and the earlier Denali still on the list at #405. The Cray T3D MC128-2[6] was #55 on the November 1994 Top 500 list.
- 1995 - ARSC took the #83 spot on the June 1995 Top 500 list[7] by upgrading Yukon to a Cray T3D MC128-8,[8] which kept a spot on the Top 500 list through 1997.
- 1997 - ARSC reached position #70 on the June 1997 Top 500 list[9] by upgrading Yukon to a Cray T3E[10] with 88 CPUs and 50 GFLOPS. A further upgrade to a T3E900[12] with 96 cores earned position #62 on the November 1997 Top 500 list.[11] An HPCwire article about the Cray T3E installation at ARSC also mentions the Cray Y-MP Denali, the visualization labs, and the ARSC Video Production Lab.[13]
- 1998 - Cray J90 named Chilkoot with 12 CPUs and 2.4 GFLOPS; StorageTek storage expanded to 330+ TB. ARSC got #74 on the November 1998 Top 500 list[14] with another upgrade of the T3E900[15] to 100 cores.
- 1999 - ARSC got spot #44 on the June 1999 Top 500 list[16] after Yukon was upgraded again to a Cray T3E900 with 268 cores,[17] a configuration that remained on the Top 500 list through June 2002.
- 2000 - Upgraded Chilkoot to a Cray SV1 with 32 CPUs and 38.4 GFLOPS; doubled the StorageTek hardware.
- 2001 - An IBM SP named Icehawk with 200 CPUs and 276 GFLOPS[18] got ARSC the #117 spot on the June 2001 Top 500 list[19] and stayed on the list through 2002. With this system, ARSC had placed at least one machine on every Top 500 list published during its first decade of existence.
- 2002 - Cray SX-6 named Rime with 8 CPUs and 64 GFLOPS; IBM p690 Regatta named Iceflyer with 32 POWER4 CPUs and 166.4 GFLOPS. Cray told CNET that ARSC received the first Cray SV1ex upgrade.[20]
- 2003 - ARSC got spot #116 on the June 2003 Top 500 list[21] with a Cray X1 named Klondike with 60 cores,[22] and then spot #71 on the November 2003 list[23] after upgrading the X1 to 124 cores,[24] a configuration that stayed on the Top 500 list through June 2005. IBM told CNET the Iceberg system was worth more than $15 million, and the article cited $16.4 million going to Cray for the X1.[25]
- 2004 - ARSC got spot #56 on the June 2004 Top 500 list[26] with its IBM System p named Iceberg. In 2003 IBM had told InfoWorld that this system would put ARSC in sixth place on the Top 500 list.[27] This RISC/UNIX-based Power4+ eServer with 672 cores[28] remained on the list through June 2006. ARSC also acquired two Sun Fire 6800 Storage Servers and a Mechdyne MD Flying Flex 4 projector to set up a Cave automatic virtual environment.
- 2005 - Cray XD1 named Nelchina with 36 CPUs.
- 2007 - A Sun Opteron cluster named Midnight with 2,236 cores and 12 TFLOPS[29] got ARSC the #206 spot on the November 2007 Top 500 list.[30] ARSC also installed a StorageTek SL8500 robotic tape library with more than 3 petabytes of capacity.
- 2008 - A Cray XT5 named Pingo with 3,456 cores[31] got ARSC the #109 spot on the November 2008 Top 500 list[32] and stayed on the list through June 2010.
- 2009 - IBM BladeCenter H QS22 cluster with 5.5 TFLOPS and a 12 TB filesystem.
- 2010 - Penguin Computing cluster named Pacman with 2,080 CPUs and an 89 TB filesystem; Sun SPARC Enterprise T5440 server named Bigdipper with 7 petabytes of storage capacity; Cray XE6 named Chugach with 11,648 CPUs and a 330 TB filesystem; Sun SPARC Enterprise T5440 server named Wiseman with 7 petabytes of storage capacity; Cray XE6 named Tana with 256 CPUs and 2.36 TFLOPS. HPCwire reported that ARSC, one of six HPCMP centers, would lose its DoD funding and that Chugach, a Cray XE6 'Baker' supercomputer procured under the HPCMP and originally intended for ARSC, would be moved to Vicksburg and run remotely.[33] The Chugach Cray XE6 appeared at #83 on the November 2010 Top 500 list and stayed on the list until June 2013, but it was credited without a machine name and as an ERDC DSRC system.[34]
- 2011 - Expanded Pacman to 3,256 CPUs and a 200 TB filesystem. Penguin Computing issued a press release about Pacman (Pacific Area Climate Monitoring and Analysis Network).[35] Although most of the HPCMP funding had ended by the end of 2011, ARSC continued to operate the Chugach machine remotely during a transition period, and the HPCMP Quick Links, HPCMP User Support (CCAC), and HPCMP User Accounts links remained prominently displayed on the ARSC website, as did the Chugach Cray XE6 system.[36]
- 2012 - ARSC was down to half the staff it had as an HPCMP DSRC.[37]
- 2013 - ARSC's last Top 500 supercomputer, the Cray XE6 Chugach, was upgraded to 23,296 cores, reaching slot #130 on the November 2012 Top 500 list and slot #183 in June 2013. After 2011 and the transition period, operations were transferred to a new Open Research Systems (ORS) unit of the HPCMP at the ERDC DSRC.
- 2014 - By the end of 2014, ARSC was down to 20% of the staff it had as an HPCMP DSRC.[38]
- 2015 - The Arctic Region Supercomputing Center ceased operations on September 1, 2015. Former ARSC systems were acquired by the Research Computing Systems unit of the University of Alaska Fairbanks Geophysical Institute. The original website is now a dead URL.[39]
References
- ↑ For more than 10 years ARSC maintained the standing of at least one system on the Top 500 list."University of Alaska - Arctic Region Supercomputing Center". The TOP500 project. 2010-06-30. https://www.top500.org/site/49130. Retrieved 2019-07-06.
- ↑ ARSC, located at the University of Alaska Fairbanks since 1993, had a budget of about $12 million last year. About 95 percent of that funding comes from the Department of Defense, which has a local contract to provide military and security assistance.Jeff Richardson (2010-12-03). "Arctic Region Supercomputer Center may lose DoD contract; layoffs expected". Fairbanks Daily News-Miner. https://www.newsminer.com/news/alaska_news/arctic-region-supercomputer-center-may-lose-dod-contract-layoffs-expected/article_f0b8c7d3-cc63-5b7a-981f-06424a365caa.html. Retrieved 2019-07-06.
- ↑ With the Cray Y-MP named Denali, the Arctic Region Supercomputing Center (ARSC) was #251 on the first Top 500 supercomputer list in June 1993, published on the first day of the 8th Mannheim Supercomputer Seminar."TOP500 List - June 1993". The TOP500 project. 1993-06-30. https://www.top500.org/lists/top500/list/1993/06/?page=3. Retrieved 2019-07-06.
- ↑ This Cray Y-MP M98/41024 system was #251 on the 06/1993 Top 500 list, #302 on the 11/1993 Top 500 list, #405 on the 06/1994 Top 500 list."Cray Y-MP M98/41024 at University of Alaska - Arctic Region Supercomputing Center". The TOP500 project. 1994-06-30. https://www.top500.org/system/173015. Retrieved 2019-07-06.
- ↑ A Cray T3D named Yukon got #58 on the June 1994 Top 500 supercomputer list while the previous computer Denali was still on the list at position #405, giving ARSC two of the Top 500 supercomputers at the same time."TOP500 List - June 1994". The TOP500 project. 1994-06-30. https://www.top500.org/lists/top500/list/1994/06/?page=1. Retrieved 2019-07-06.
- ↑ This ARSC Cray T3D MC128-2 was #58 on the June 1994 Top 500 list and #55 on the November 1994 Top 500 list."T3D MC128-2 at University of Alaska - Arctic Region Supercomputing Center". The TOP500 project. 1994-06-30. https://www.top500.org/system/170963. Retrieved 2019-07-06.
- ↑ The Cray T3D named Yukon was upgraded to a Cray T3D MC128-8 for spot #83 on the June 1995 Top 500 list."TOP500 List - June 1995". The TOP500 project. 1995-06-30. https://www.top500.org/lists/top500/list/1995/06/?page=1. Retrieved 2019-07-06.
- ↑ This ARSC Cray T3D MC128-8 was on the Top 500 list for three years: #83 06/1995, #99 11/1995, #127 06/1996, #169 11/1996, #239 06/1997 and #340 11/1997."Cray T3D MC128-8 at University of Alaska - Arctic Region Supercomputing Center". The TOP500 project. 1997-11-30. https://www.top500.org/system/170949. Retrieved 2019-07-06.
- ↑ The Cray T3D named Yukon was upgraded again to a Cray T3E to get ARSC position #70 on the June 1997 Top 500 list."TOP500 List - June 1997". The TOP500 project. 1997-06-30. https://www.top500.org/lists/top500/list/1997/06/?page=1. Retrieved 2019-07-06.
- ↑ Yukon was only on the Top 500 list once in this configuration."Cray T3E at University of Alaska - Arctic Region Supercomputing Center". The TOP500 project. 1997-06-30. https://www.top500.org/system/171140. Retrieved 2019-07-06.
- ↑ Another upgrade to the T3E900 got ARSC position #62 on the November 1997 Top 500 list."TOP500 List - November 1997". The TOP500 project. 1997-11-30. https://www.top500.org/lists/top500/list/1997/11/?page=1. Retrieved 2019-07-06.
- ↑ Yukon was on the Top 500 list in this configuration as #62 in 11/1997 and as #67 in 06/1998."Cray T3E900 at University of Alaska - Arctic Region Supercomputing Center". The TOP500 project. 1997-06-30. https://www.top500.org/system/171205. Retrieved 2019-07-06.
- ↑ "Arctic Region Supercomputing Center installs Cray T3E". Tabor Communications. 1997-03-07. https://www.hpcwire.com/1997/03/07/arctic-region-supercomputing-center-installs-cray-t3e/. Retrieved 2019-07-06.
- ↑ ARSC got #74 on the November 1998 Top 500 list with another upgrade to the T3E900."TOP500 List - November 1998". The TOP500 project. 1998-11-30. https://www.top500.org/lists/top500/list/1998/11/?page=1. Retrieved 2019-07-06.
- ↑ Yukon was only on the Top 500 list once in this configuration."Cray T3E900 at University of Alaska - Arctic Region Supercomputing Center". The TOP500 project. 1997-06-30. https://www.top500.org/system/171143. Retrieved 2019-07-06.
- ↑ ARSC got spot #44 on the June 1999 Top 500 list after Yukon was upgraded again to a Cray T3E900 with 268 cores."TOP500 List - June 1999". The TOP500 project. 1999-06-30. https://www.top500.org/lists/top500/list/1999/06/?page=1. Retrieved 2019-07-06.
- ↑ Yukon was on the Top 500 list in this configuration for over three years: #44 06/1999, #56 11/1999, #78 06/2000, #107 11/2000, #131 06/2001, #199 11/2001 and #383 06/2002."Cray T3E900 at University of Alaska - Arctic Region Supercomputing Center". The TOP500 project. 2002-06-30. https://www.top500.org/system/171169. Retrieved 2019-07-06.
- ↑ The IBM Icehawk RISC system got ARSC on the Top 500 list as #117 in June of 2001, #147 in November of 2001, #264 in June of 2002 and #367 in November of 2002."IBM SP Power3 375 MHz at University of Alaska - Arctic Region Supercomputing Center". The TOP500 project. 2002-11-30. https://www.top500.org/system/170021. Retrieved 2019-07-06.
- ↑ An IBM SP named Icehawk with 200 CPUs and 276 GFLOPS got ARSC the #117 spot on the June 2001 Top 500 list."TOP500 List - June 2001". The TOP500 project. 2001-06-30. https://www.top500.org/lists/top500/list/2001/06/?page=2. Retrieved 2019-07-06.
- ↑ Stephen Shankland (2002-01-02). "Compaq, Cray: Supercomputer progress". CBS Interactive Inc.. https://www.cnet.com/news/compaq-cray-supercomputer-progress/. Retrieved 2019-07-06.
- ↑ ARSC got spot #116 on the June 2003 Top 500 list with a Cray X1 named Klondike with 60 cores."TOP500 List - June 2003". The TOP500 project. 2003-06-30. https://www.top500.org/lists/top500/list/2003/06/?page=2. Retrieved 2019-07-06.
- ↑ Klondike was only on the Top 500 list once in this configuration."Cray X1 at University of Alaska - Arctic Region Supercomputing Center". The TOP500 project. 2003-06-30. https://www.top500.org/system/173136. Retrieved 2019-07-06.
- ↑ Upgrading the Cray X1 to 124 cores got ARSC spot #71 on the November 2003 Top 500 list."TOP500 List - November 2003". The TOP500 project. 2003-11-30. https://www.top500.org/lists/top500/list/2003/11/?page=1. Retrieved 2019-07-06.
- ↑ Klondike in this configuration was on the Top 500 list as #71 in 11/2003, #154 in 06/2004, #202 in 11/2004 and #353 in 06/2005."Cray X1 at University of Alaska - Arctic Region Supercomputing Center". The TOP500 project. 2003-06-30. https://www.top500.org/system/173282. Retrieved 2019-07-06.
- ↑ Stephen Shankland (2003-03-13). "IBM nabs hot supercomputer deal in arctic". CBS Interactive Inc.. https://www.cnet.com/news/ibm-nabs-hot-supercomputer-deal-in-arctic/. Retrieved 2019-07-06.
- ↑ ARSC got spot #56 on the June 2004 Top 500 list with its IBM System p named Iceberg."TOP500 List - June 2004". The TOP500 project. 2004-06-30. https://www.top500.org/lists/top500/list/2004/06/?page=1. Retrieved 2019-07-06.
- ↑ In 2003 IBM had told InfoWorld that Iceberg would put ARSC in sixth place on the Top 500 list.Tom Krazit (2003-03-13). "IBM supercomputer heads north - Arctic center to use 800-processor box". IDG Communications, Inc.. https://www.infoworld.com/article/2680585/ibm-supercomputer-heads-north.html. Retrieved 2019-07-06.
- ↑ The IBM Iceberg RISC system got ARSC Top 500 list spots #56 in 06/2004, #78 in 11/2004, #136 in 06/2005, #253 in 11/2005 and #398 in 06/2006."IBM eServer pSeries 655 (1.5 GHz Power4+) at University of Alaska - Arctic Region Supercomputing Center". The TOP500 project. 2006-06-30. https://www.top500.org/system/173296. Retrieved 2019-07-06.
- ↑ Midnight was only on the Top 500 list once."Midnight - Fire x2200/x4600 Cluster, Opteron 2.6 Ghz, Infiniband, Linux system at University of Alaska - Arctic Region Supercomputing Center". The TOP500 project. 2007-11-30. https://www.top500.org/system/175225. Retrieved 2019-07-06.
- ↑ A Sun Opteron Cluster named Midnight with 2,236 cores and 12 TFLOPS got ARSC the #206 spot on the November 2007 Top 500 list."TOP500 List - November 2007". The TOP500 project. 2007-11-30. https://www.top500.org/lists/top500/list/2007/11/?page=3. Retrieved 2019-07-06.
- ↑ Pingo got ARSC Top 500 list spots #109 in 11/2008, #205 in 06/2009, #290 in 11/2009 and #435 in 06/2010."Cray XT5 QC 2.3 GHz at University of Alaska - Arctic Region Supercomputing Center". The TOP500 project. 2010-06-30. https://www.top500.org/system/176146. Retrieved 2019-07-06.
- ↑ A Cray XT5 named Pingo with 3,456 cores got ARSC the #109 spot on the November 2008 Top 500 list."TOP500 List - November 2008". The TOP500 project. 2008-11-30. https://www.top500.org/lists/top500/list/2008/11/?page=2. Retrieved 2019-07-06.
- ↑ A pre-Thanksgiving email to ARSC confirmed what many at the center had suspected, namely that the center would lose its DoD funding after the current money expires next May. Today the center is funded to the tune of $12 to $15 million, and the DoD slice represents around 95 percent of the total.Michael Feldman (2010-01-22). "Arctic Region Supercomputing Center Gets Cold Shoulder from DoD". Tabor Communications. https://www.hpcwire.com/2010/01/22/cir_report_states_that_40_100_gige_transceiver_markets_to_reach_545m_by_2014/. Retrieved 2019-07-06.
- ↑ "Cray XE6, Opteron 16C 2.500GHz, Cray Gemini interconnect". The TOP500 project. 2013-06-30. https://www.top500.org/system/176960. Retrieved 2019-07-06.
- ↑ "Penguin Computing Supplies PACMAN Supercomputer to University of Alaska Fairbanks". SMART Global Holdings. 2011-05-24. https://www.penguincomputing.com/company/press-releases/penguin-computing-supplies-pacman-supercomputer-to-university-of-alaska-fairbanks/. Retrieved 2019-07-06.
- ↑ "Arctic Region Supercomputing Center - Chugach Cray XE6 system". The Internet Archive, a 501(c)(3) non-profit. http://www.arsc.edu/arsc/resources/chugach/index.xml. Retrieved 2019-07-07.
- ↑ "Arctic Region Supercomputing Center Staff". The Internet Archive, a 501(c)(3) non-profit. http://www.arsc.edu/arsc/misc/staff/. Retrieved 2019-07-07.
- ↑ "Arctic Region Supercomputing Center Staff". The Internet Archive, a 501(c)(3) non-profit. http://www.arsc.edu/arsc/misc/staff/. Retrieved 2019-07-07.
- ↑ "Arctic Region Supercomputing Center". The Internet Archive, a 501(c)(3) non-profit. http://www.arsc.edu/arsc/. Retrieved 2019-07-07.
External links
64°51′36″N 147°50′57″W / 64.8600°N 147.8491°W
Original source: https://en.wikipedia.org/wiki/Arctic_Region_Supercomputing_Center