
MonARCH Scheduled Quarterly Maintenance
Once a quarter, MonARCH will be offline for a day while maintenance is performed on it. The next scheduled maintenance day is Wed 15 August 2018; the one after that is Wed 21 November 2018.

Subject: A new and improved MonARCH cluster is in preparation

Dear MonARCH user,

We wish to advise you that in the next few weeks, we will be provisioning a new MonARCH cluster. The new MonARCH cluster will continue to serve the university’s HPC users as its primary community, and remain distinct and independent from MASSIVE M3. However, it will be closely aligned with M3. Specifically, the new MonARCH will feature:

  • two dedicated login nodes and a dedicated data transfer node, like on MASSIVE M3;

  • ten new nodes, adding 280 CPU cores, with each node equipped with an NVIDIA Tesla P100 card;

  • an updated version of the SLURM scheduler with service redundancy, better stability, and new features to improve fair-share scheduling;

  • a new website for MonARCH HPC user documentation; and

  • a convergence to a single HPC software module environment, shared with MASSIVE M3; this means the current MonARCH HPC software will eventually be superseded, but it will remain accessible while we install/migrate the legacy software into this single module environment.

We have scheduled the release of the new MonARCH for 23 October 2017. The current MonARCH will remain available for about four weeks after the release date, so that you can verify your jobs run successfully on either cluster. Rest assured, we will make every effort to ease your transition to the new MonARCH cluster.

This is the culmination of a major undertaking to align the operations of MonARCH and MASSIVE into a single configuration and management framework, reducing system heterogeneity, and thus enhancing our ability to provide better HPC user support.

Further details of the new MonARCH will be made available closer to the release date. For any queries or concerns, please feel free to contact us at

The Monash HPC Team 

October 1 2017 Update:

To stay updated with the development of MonARCH v2, please visit our "work in progress" page: Work In Progress Information on MonARCH v2


MonARCH (Monash Advanced Research Computing Hybrid) is the next-generation HPC/HTC Cluster, designed from the ground up to address the emergent and future needs of the Monash HPC community.

A key feature of MonARCH is that it is provisioned through R@CMon, the Research Cloud @ Monash facility. Through the use of advanced cloud technology, MonARCH is able to configure and grow dynamically. As with any HPC cluster, MonARCH presents a single point-of-access to computational researchers to run calculations on its constituent servers.

MonARCH aims to continually develop over time. Currently, it consists of 35 servers under two complementary hardware specifications:

  • high-core servers - two Haswell CPU sockets with a total of 24 physical cores (or 48 hyperthreaded cores) at 2.80 GHz
  • high-speed servers - two Haswell CPU sockets with a total of 16 physical cores (or 32 hyperthreaded cores) at 3.20 GHz
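Since MonARCH jobs are submitted through the SLURM scheduler, a typical workflow is to write a batch script and submit it with `sbatch`. The sketch below is a minimal, hypothetical example only: the job name, resource requests, and module name are illustrative assumptions, not confirmed MonARCH values, and you should consult the MonARCH documentation for the actual partitions and software modules available.

```shell
#!/bin/bash
# Hypothetical SLURM batch script for MonARCH (values are illustrative).
#SBATCH --job-name=example        # assumed job name
#SBATCH --ntasks=1                # one task
#SBATCH --cpus-per-task=16        # fits within one 16-core high-speed node
#SBATCH --mem=32G                 # assumed memory request
#SBATCH --time=02:00:00           # two-hour wall-time limit

# Load software from the module environment (module name is an assumption).
module load somesoftware

# Run the program under SLURM.
srun ./my_program
```

The script would be submitted with `sbatch myjob.sh`, and its status checked with `squeue`.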

For data storage, we have deployed a parallel file system service using Intel Enterprise Edition for Lustre, providing over 300 TB of usable storage with room for future expansion.

The MonARCH service is operated by the Monash HPC team, with continuing technical and operational support from the Monash Cloud team and the eSolutions Servers-and-Storage and Networks teams.

If you have found MonARCH useful for your research, we would be grateful if you acknowledged us with text along the lines of:

This research was supported in part by the Monash eResearch Centre and eSolutions-Research Support Services through the use of the MonARCH HPC Cluster.


Applying for Access

MonARCH is available to all Monash researchers. To apply for access, please visit this access page for self-service instructions. For any assistance, please email,





