1. Articles in category: Supercomputer

    49-72 of 99 « 1 2 3 4 5 »
    1. Top 5 Data Center Stories, Week of Oct. 20

      The Week in Review: Google brings you inside its data centers and talks about cooling, the petascale supercomputers Blue Waters and NCAR Yellowstone are launched, and RagingWire plans a major new campus in Ashburn's Data Center Alley.
      Read Full Article
    2. Could A Lunar Supercomputer Work?

      Over the years, we’ve seen data centers in lots of strange places – old chapels, shopping malls, old particle accelerator silos and all kinds of underground bunkers. Proposals have been floated for containers in caves, data centers on floating barges and servers mounted on hovering drones. Now a researcher has come up with an out-of-this-world computing challenge – building a supercomputer on the moon.

      Read Full Article
      Mentions: Apple Google Cisco
    3. Inside the Future Exascale Data Center

      The main data center at Oak Ridge National Laboratory in Tennessee powers some of the world’s most advanced data-crunching to support research in weather, climate science, and quantum physics. The Department of Energy plans to deploy an exascale computing system here by 2018.

      Read Full Article
    4. Top 10 Data Center Stories, June 2012

      During the month of June, stories about supercomputers, Stephen Hawking and downtime for Amazon’s cloud were among the top items that interested Data Center Knowledge readers. Also, Facebook’s data center designs, such as those for servers and racks, continued to be a leading topic on our site. Here are the most popular stories on Data Center Knowledge for June 2012, ranked by page views:
      Top 10 Supercomputers Illustrated – June 18
      Hawking is First User of SGI ‘Big Brain’ Supercomputer – June 14
      Video: Amazing LEGO Data Center – June 14
      Seven Cloud Computing Trends – Part 1 – June 4
      Amazon Data Center Loses Power During Storm – June 30
      More Problems for Amazon EC2 Cloud – June 29
      Power Outage Affects Amazon Customers – June 15
      Closer Look: Facebook’s New Server, Storage Designs – June 27
      Netflix Rolls Out Its Own Content Delivery Network – June 27
      eBay: Bloom Boxes Will Power Utah Data Center ...

      Read Full Article
    5. Fujitsu sets up Spanish HPC hub

      Fujitsu has chosen Spain as its fifth hub for High Performance Computing (HPC) in Europe. It said Spain is predicted to hold about 5% of the supercomputing market in EMEA, where much of the activity is driven by a global race among companies to field the most powerful supercomputers and the most energy-efficient IT environments. Fujitsu’s supercomputing program is based on what it calls the Human Centric Intelligent Society, which it hopes will encourage new initiatives ...
      Read Full Article
      Mentions: Fujitsu Europe Emea
    6. Innovative Nordic Supercomputer in Iceland

      National High Performance Computing (HPC) organisations of Denmark, Norway, Sweden and Iceland have pooled resources and powered up an innovative joint supercomputer in Iceland. It is innovative not so much for its technology, but for its concept, placement and operations. The supercomputer is hosted in the Advania Thor Data Center in Iceland, a new environmentally friendly “green” data center. The computer is part of a pilot initiative aiming to test remote hosting, such that computing is brought to the energy source and not vice versa, as is the norm, thereby introducing substantial savings. Further aims are to understand the political, organizational and technical aspects of joint ownership, administration and operation of such expensive and strategic infrastructure. Read more: http://www.icenews.is/index.php/2012/04/20/innovative-nordic-supercomputer-in-iceland/#ixzz1snJVlVlr

      Read Full Article
      Mentions: Iceland Europe Norway
    7. LBNL Plans For the Exascale Data Center

      Last week, Lawrence Berkeley National Laboratory (LBNL) broke ground on a facility that will house its vision for the supercomputer of the future. The 140,000 square foot data center will overlook the San Francisco Bay from a hill above the UC Berkeley campus. It may also provide the first view into exascale – the new frontier for supercomputing. In planning for supercomputers that surpass current petaflop levels and reach exaFLOPS (1,000,000,000,000,000,000 floating point operations per second), the U.S. Department of Energy has recognized that the energy consumed in powering that compute load is a particular challenge.

      Read Full Article
    8. Illinois supercomputing center gets LEED Gold

      The newly built data center at the University of Illinois that will soon support one of the world’s most powerful supercomputers has received LEED Gold certification from the US Green Building Council. Even with 24MW of critical load, the National Petascale Computing Facility on the site of the university’s National Center for Supercomputing Applications (NCSA) was given the USGBC’s second-highest award in recognition of an energy-efficient design and a construction process that minimized impact on the environment ...
      Read Full Article
    9. Blue Waters Data Center Achieves LEED Gold

      The National Center for Supercomputing Applications (NCSA) announced that the National Petascale Computing Facility (NPCF) at the University of Illinois has earned Gold-level certification under the LEED (Leadership in Energy and Environmental Design) rating program for energy-efficient buildings.
      Blue Waters
      In 2010 the University of Illinois and NCSA opened the NPCF data center as the home to supercomputers and other high-performance systems operated by NCSA and used by scientists and engineers across the country. The Blue Waters project encompassed the NPCF and a 10 petaflop supercomputer, which was initially a venture with IBM. In 2011 NCSA and IBM determined that the project was too complex to proceed; IBM pulled the plug, and NCSA later awarded the contract to Cray to build an XE6 system.
      Read Full Article
      Mentions: IBM LEED
    10. TACC Builds Data Center for New Supercomputer

      The Texas Advanced Computing Center (TACC) at The University of Texas at Austin announced that it is expanding the center’s current high performance computing data center to house the new Stampede supercomputer, which will be built in late 2012 and go into full production for the national science community in January 2013. The $56 million project will encompass a machine room and raised floor expansion, a separate building to include the transformer yard, a location to house the chillers, compressors and cooling towers, a tank for thermal energy storage, and an additional seminar room for training. The funds will also pay for long-term upgrades to support the infrastructure of future projects. In this video Dan Stanzione, deputy director of the Texas Advanced Computing Center, talks about the power and cooling requirements of the expanded facility. Run time is about 2 minutes, 45 seconds.
      Read Full Article
    11. HPC News, SGI, Blue Waters, Dell

      Here’s our review of today’s noteworthy links for the High Performance Computing (HPC) industry: Cray delivers first Blue Waters cabinet. On December 1 Cray delivered the first full cabinet for the NCSA Blue Waters system. A photo gallery of the installation day can be found on the NCSA Facebook album, where the comments confirm that the cabinets will be water-cooled. The National Science Foundation’s Blue Waters project was awarded to Cray last month after NCSA and IBM terminated the original contract last summer. Dell’s HPC strategy. The Register reports on how Dell plans to engage the market and grow its HPC business. Dell’s strategy concentrates on smaller HPC systems where projects are well-bounded, workloads are known, and customers are ones it knows and understands. Dell is putting together recipes for popular HPC apps in small, medium ...
      Read Full Article
    12. NCSA Blue Waters Project Awarded To Cray

      Cray announced that it has finalized a contract with the University of Illinois’ National Center for Supercomputing Applications (NCSA) to provide the supercomputer for the National Science Foundation’s Blue Waters project. Back in August, NCSA and IBM jointly announced that IBM had terminated its contract with the University of Illinois.
      The Blue Waters Infrastructure
      The multi-phase, multi-year project was awarded to Cray for $188 million and will start with a Cray XE6 system, upgrading to the recently announced Cray XK6 with built-in GPU computing capability. Bill Kramer, deputy project director of the Blue Waters project at the NCSA at the University of Illinois, told The Register that Blue Waters was not a specific system, but rather a complete set of infrastructure, including a data center, plus computation, networking, and storage and, most importantly given the software goals of the NCSA, code that scales to real-world petaflops performance.
      Read Full Article
    13. IBM Files Patent For 100 Petaflop Supercomputer

      IBM has filed a patent for a massive supercomputing system that could reach 107 petaflops, more than 12 times the compute power of the current leader in the Top 500 supercomputer rankings.
      Powered by Blue Gene
      Last month IBM unveiled the Blue Gene/P and /Q systems that will use the A2 processing core and achieve upwards of 20 petaflops (quadrillion floating-point operations per second). The new patent describes interconnected ASIC nodes using a five-dimensional torus network, and says a system “capable of achieving 107 petaflop with up to 8,388,608 cores, or 524,288 nodes, or 512 racks is provided.”
      Read Full Article
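As a quick sanity check, the configuration figures quoted from the patent are internally consistent. The sketch below uses only the numbers in the article; the cores-per-node, nodes-per-rack, and per-core throughput values are derived, not quoted from IBM.

```python
# Sanity-check the configuration figures quoted from IBM's patent filing.
# Inputs are the article's numbers; the three printed values are derived.
PEAK_FLOPS = 107e15   # 107 petaflops peak
CORES = 8_388_608
NODES = 524_288
RACKS = 512

cores_per_node = CORES // NODES            # cores divide evenly into nodes
nodes_per_rack = NODES // RACKS            # nodes divide evenly into racks
gflops_per_core = PEAK_FLOPS / CORES / 1e9 # implied per-core throughput

print(cores_per_node, nodes_per_rack, round(gflops_per_core, 2))
# → 16 1024 12.76
```

So the patented configuration works out to 16 cores per node and 1,024 nodes per rack, with each core contributing roughly 12.8 gigaflops of peak performance.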
    14. IBM To Power 20 Petaflop Supercomputer

      IBM lifted the curtain on its Blue Gene/Q SoC last week in Santa Clara and noted that it will soon be installed in two of the most powerful Blue Gene systems ever deployed.
      Power 7 vs. SoC
      With the plug pulled on the 10 petaflop Power7-based Blue Waters for NCSA, IBM is working with two Department of Energy labs on a 10 petaflop “Mira” system at Argonne National Lab and a 20 petaflop “Sequoia” at Lawrence Livermore. The current top supercomputer in the world, the Japanese K, can sustain 8.162 petaflops. The Power7 chip was set to perform at 256 gigaflops across 8 cores and consume 200 watts, while the Blue Gene/Q SoC will pull 204 gigaflops per processor, with an 18-core count, and consume 55 watts at peak. With a significant increase in performance the Blue Gene/Q chip delivers 15 times as many peak ...
      Read Full Article
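The efficiency gap implied by the figures in the blurb is easy to work out. This sketch uses only the numbers quoted above (Power7: 256 gigaflops at 200 W; Blue Gene/Q SoC: 204 gigaflops at 55 W); the flops-per-watt ratios are derived for illustration, not vendor-published specs.

```python
# Rough performance-per-watt comparison from the article's figures.
power7_gflops, power7_watts = 256.0, 200.0  # Power7 chip (8 cores)
bgq_gflops, bgq_watts = 204.0, 55.0         # Blue Gene/Q SoC at peak

power7_eff = power7_gflops / power7_watts   # gigaflops per watt
bgq_eff = bgq_gflops / bgq_watts

print(round(power7_eff, 2), round(bgq_eff, 2), round(bgq_eff / power7_eff, 1))
# → 1.28 3.71 2.9
```

In other words, despite delivering slightly fewer raw gigaflops per chip, the Blue Gene/Q SoC comes out roughly 2.9 times more efficient per watt than the Power7 on these quoted numbers.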
    15. NSA building $896M supercomputing center

      The NSA's new High Performance Computing Center, slated to be complete by December 2015, will be designed with energy efficiency, security, and lots of "state-of-the-art" computing horsepower in mind, according to unclassified specs found in the documents, which detail numerous military construction project budgets, including several NSA efforts. NSA has long been a supercomputing powerhouse. The secretive signals intelligence agency purchased the first Cray supercomputer in 1976, and even keeps two Cray supercomputers on display at its National Cryptologic Museum alongside spy gadgets such as centuries-old code books and a working German Enigma machine from World War II. The specs for the new supercomputing center read much like the NSA is building a massive data center, with typical requirements for raised flooring, chilled water systems, fire suppression, and alarms. Power requirements are 60 megawatts, equivalent to the power requirements of Microsoft's recently completed 700,000 square foot ...
      Read Full Article
    16. Building A Sturdy Data Center Roof

      We’ve seen a lot of videos that look at various aspects of data center construction, including many time-lapse videos providing an accelerated view of the process. Here’s a new one: a video that focuses on the construction of the data center roof for the new Swiss National Supercomputing Centre (CSCS) in Lugano. This 1-minute clip provides a sense of the infrastructure for a strong roof, which is an important consideration in buildings where heavy equipment will be installed on the rooftop. Each roof beam is 35 meters long and weighs 50 tons, and the beams are moved into place by a mobile crane that weighs 380 tons.
      Read Full Article
    17. Biggest Problem for Exascale Computing: Power

      Read Full Article
    18. Google Unveils Earth Engine to Save World’s Forests

      Protecting the world’s forests will be a crucial way to fight climate change, given that deforestation contributes more carbon emissions than all vehicles combined. Now Google has emerged as a key warrior in the deforestation battle. On Thursday morning in Cancun, Mexico at the COP 16 U.N. climate negotiations, the search engine giant unveiled Google Earth Engine, a product that combines an open API, a computing platform, and 25 years of satellite imagery available to researchers, scientists, organizations and government agencies. While the software and satellite imagery in Google Earth are already being used to look at world climate change data, Google Earth Engine gives groups the tools and parallel processing power to use satellite imagery to analyze environmental conditions and make sustainability decisions.
      Read Full Article
  1. Categories

    1. Data Center Design:

      Construction and Infrastructure, Container, Data Center Outages, Monitoring, Power and Cooling
    2. Policy:

      Cap and Trade, Carbon Footprint, Carbon Reduction Commitment, Carbon Tax, Emissions
    3. Power:

      Biomass, Fossil Fuel, Fuel Cell, Geothermal, Hydro, Nuclear, Solar, Wind
    4. Application:

      Cloud Computing, Grid Computing
    5. Technology:

      Microblogging, Networking, Servers, Storage, Supercomputer
  2. Popular Articles