Research IT facilities and infrastructure
Conduct cutting-edge research with our specialised infrastructure and facilities.
- Database services
Structured Query Language (SQL) databases for querying, updating and managing data (see the short example after this list).
- Web publishing and development tools
Tools for creating, maintaining, and managing websites at the University.
- Software for research
Recommended, risk-assessed and GDPR-compliant software for your research projects.
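To give a rough sense of the querying and updating these databases support, here is a minimal sketch using Python's built-in sqlite3 module. The file name, table and columns are hypothetical; a managed University database would normally be a networked server (for example MySQL or PostgreSQL) accessed with its own driver, but the SQL statements themselves look much the same.

```python
import sqlite3

# Hypothetical local database file; a managed service would be a
# networked server accessed with a server-specific driver.
conn = sqlite3.connect("research_project.db")
cur = conn.cursor()

# Create a simple table for experimental samples (illustrative schema).
cur.execute("""
    CREATE TABLE IF NOT EXISTS samples (
        id INTEGER PRIMARY KEY,
        label TEXT NOT NULL,
        measurement REAL
    )
""")

# Insert and update data.
cur.execute("INSERT INTO samples (label, measurement) VALUES (?, ?)", ("S1", 0.42))
cur.execute("UPDATE samples SET measurement = ? WHERE label = ?", (0.45, "S1"))

# Query the data back.
for row in cur.execute("SELECT id, label, measurement FROM samples"):
    print(row)

conn.commit()
conn.close()
```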
National and global facilities
You have access to a wide range of specialist infrastructure beyond York. The UK uses a tiered system to group high-performance facilities.
Together, the tiers provide a comprehensive HPC infrastructure for computational research and training, supporting a wide range of scientific and engineering challenges.
- Tier 1 serves as the primary national resource for large-scale HPC needs.
- Tier 2 provides specialised, intermediate-level resources that fill the capability gap between university-level HPC and national facilities.
- ARCHER2
The UK national supercomputing service offers a capability resource that allows researchers to run simulations and calculations requiring large numbers of processing cores working in a tightly-coupled, parallel fashion.
- HPE Cray EX supercomputing system with an estimated peak performance of 28 Pflop/s.
- The machine has 5,860 compute nodes, each with dual AMD EPYC™ 7742 64-core processors at 2.25 GHz, giving 750,080 cores in total.
- ARCHER2 should be capable, on average, of over eleven times the science throughput of its predecessor, ARCHER.
- Find out how to access ARCHER2
- JADE II
The University of York is a partner of the JADE II facility. Anyone involved in AI/ML research and related data science applications who needs GPUs for scaling up experiments can apply for an account. JADE II harnesses the capabilities of the NVIDIA DGX MAX-Q Deep Learning System and comprises 63 servers, each containing 8 NVIDIA Tesla V100 GPUs linked by NVIDIA's NVLink interconnect technology. JADE II uses environment modules and SLURM, similar to Viking (see the job-script sketch after this list).
- Complete a form to apply for an account. (Once completed, you'll be emailed instructions for registering for JADE II at STFC.)
- Join in community discussion via the University's JADE II users Slack channel.
- Bede at Durham
An EPSRC Tier-2 facility. Bede primarily comprises 32 IBM POWER9 dual-CPU nodes, each with 4 NVIDIA V100 GPUs and a high-performance interconnect. This is the same architecture as the US government's Summit and Sierra supercomputers, which occupied the top two places in a recently published list of the world's fastest supercomputers.
- The University does not currently have RSE support for Bede. For support, please use the Bede Slack channel.
- Cirrus at EPCC
An EPSRC Tier-2 HPC facility. The main resource is a 10,080-core SGI/HPE ICE XA system. Free access is available to academic researchers working in the EPSRC domain and from some UK universities; academic users from other domains and institutions can purchase access. Industry access is available.
- Isambard at GW4
An EPSRC Tier-2 HPC facility. Isambard provides multiple advanced architectures within the same system to enable evaluation and comparison across a diverse range of hardware platforms. Free access is available to academic researchers working in the EPSRC domain and from some UK universities; academic users from other domains and institutions can purchase access. Industry access is available.
- Cambridge Service for Data Driven Discovery (CSD3)
An EPSRC Tier-2 HPC facility. CSD3 is a multi-institution service underpinned by an innovative, petascale, data-centric HPC platform, designed specifically to drive data-intensive simulation and high-performance data analysis. Free access is available to academic researchers working in the EPSRC domain and from some UK universities; academic users from other domains and institutions can purchase access. Industry access is available.
- MMM Hub (Materials and Molecular Modelling Hub)
The theory and simulation of materials is one of the most thriving and vibrant areas of modern scientific research. Designed specifically for the materials and molecular modelling community, this Tier-2 supercomputing facility is available to HPC users all over the UK. The MMM Hub was established in 2016 with a £4m EPSRC grant awarded to its collaborators, the Thomas Young Centre (TYC) and the Science and Engineering South Consortium (SES). The MMM Hub is led by University College London on behalf of the eight collaborative partners who sit within the TYC and SES: Imperial, King's, QMUL, Oxford, Southampton, Kent, Belfast and Cambridge.
- DiRAC
The STFC HPC facility for particle physics and astronomy researchers. It is currently made up of five systems with different architectures, including an extreme-scaling IBM BG/Q system, a large SGI/HPE UV SMP system, and a number of Intel Xeon multicore HPC systems. Free access is available to academic researchers working in the STFC domain; academic researchers from other domains can purchase access. Industry access is also available.
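Since the JADE II entry above notes that the system uses environment modules and SLURM (as on Viking), here is a minimal sketch of submitting a GPU batch job, written as a small Python helper. The partition name, module name, resource requests and train.py script are hypothetical placeholders rather than values from the JADE II documentation; check the facility's own guidance for your account and project.

```python
import subprocess
from pathlib import Path

# A minimal SLURM job script. The partition, module name and resource
# requests are placeholders, not values from JADE II documentation.
job_script = """#!/bin/bash
#SBATCH --job-name=train-model
#SBATCH --gres=gpu:1            # request one GPU
#SBATCH --cpus-per-task=4
#SBATCH --time=01:00:00
#SBATCH --partition=small       # hypothetical partition name

# Environment modules make site-installed software available;
# this module name/version is illustrative only.
module load python/anaconda3

python train.py                 # your own training script (hypothetical)
"""

script_path = Path("train_job.slurm")
script_path.write_text(job_script)

# Hand the script to the SLURM scheduler; sbatch prints the new job ID.
subprocess.run(["sbatch", str(script_path)], check=True)
```

On a login node you could equally save the embedded script to a file and run sbatch on it directly; the Python wrapper simply keeps the example self-contained.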
Some facilities around the world may also be accessible to UK users. The list below includes facilities or organisations that can provide access to users from the UK.
- PRACE
A pan-European high-performance computing (HPC) infrastructure through which UK users can get access to some of the largest HPC systems in Europe.
- DOE INCITE
The US Department of Energy makes access to its Leadership Computing facilities available to users worldwide through the INCITE programme.
Training and support
If you’re interested in Research IT resources but unsure what you need, we’re here to help.
Research IT support is for every member of staff and student who needs it. Find out about our one-to-one support, training and networking opportunities.