Diamond Annual Review 2021/22

Tomography, they are increasingly in demand. Post-visit data analysis services will enable users to process and evaluate the data collected on Diamond-II fully remotely.

High Performance Sample Stages

Some beamlines will experience a greater than hundredfold increase in the brightness of the photon beams. This will enable detectors to operate much faster, with frame rates as high as 10 kHz. To make best use of the fast acquisition, the sample will need to move faster as it is scanned. This will require faster servo loops in the motion controller, and so a new generation of motor controllers will play a key role in delivering these fast applications.
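As a rough illustration of the regime involved, the Python sketch below estimates the stage velocity and servo-loop rate implied by a 10 kHz detector in a continuous scan. The frame rate comes from the text above; the step size per frame and the number of servo corrections per exposure are illustrative assumptions, not Diamond-II specifications.

```python
# Rough feasibility estimate for a continuous (fly) scan with a 10 kHz detector.
# The detector frame rate is taken from the article; the step size and servo
# budget below are illustrative assumptions, not Diamond-II specifications.

FRAME_RATE_HZ = 10_000        # detector frame rate (from the article)
STEP_UM = 0.5                 # assumed sample displacement per frame (hypothetical)
SERVO_UPDATES_PER_FRAME = 10  # assumed servo corrections wanted per exposure

velocity_um_s = FRAME_RATE_HZ * STEP_UM
servo_rate_hz = FRAME_RATE_HZ * SERVO_UPDATES_PER_FRAME

print(f"Required stage velocity : {velocity_um_s / 1000:.1f} mm/s")
print(f"Required servo loop rate: {servo_rate_hz / 1000:.0f} kHz "
      f"({1e6 / servo_rate_hz:.0f} us per update)")
```

Even with these modest assumptions, the servo loop must close two orders of magnitude faster than the detector alone would suggest, which is why new motor controllers are on the critical path.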
User Administration and Information Management

Automated remote sessions will increasingly become the norm with Diamond-II. The metadata from the proposal process will need to be integrated with session allocation, sample registration and logistics, processing pipelines, and the resulting visualisation and analysis of experimental results. Users will then be able to perform data mining and data analysis, and gain access to enhanced search capabilities to fully exploit the value stored within metadata repositories. These changes will in part be realised by greater integration of the Laboratory Information Management System and User Administration into the Data Catalogue.

Figure 2: A coloured reconstruction of a bee's compound eye, produced from studies on I13, is an example of the type of large dataset that will become increasingly common on Diamond-II. Courtesy of Gavin Taylor, Emily Baird, Andrew J Bodey & Andreas Enstrom (University of Lund).

Modernisation of Data Acquisition Software Framework

The beamline software architecture supports the full life-cycle of a science experiment at Diamond – from requesting beamtime and shipping samples, to planning and running experiments, to analysing and archiving results. The existing software has evolved over the last twenty years to successfully facilitate world-leading science across Diamond's many beamlines, balancing the competing demands of flexibility and high-throughput, automated experiments. Diamond-II provides an opportunity to build on this, addressing issues that have arisen over time and enhancing functionality so that the new regimes offered by the upgrade can be fully exploited.

For the last year, the Data Acquisition, Data Analysis, Controls and SIMS groups have been working to document the new architecture and to plan the transition from the existing architecture. Input has been sought from beamline scientists and support engineers to ensure that requirements are elicited and validated early on and that all sides are involved in the process. From the science perspective, there is a desire to improve the delivery and robustness of the software, providing timely feature releases with minimal beamline disruption. Opportunities for automation and performance, as well as better monitoring and diagnosis of issues, are also sought. From the support perspective, similar concerns have been identified. In addition, there is a need to improve maintainability and extensibility – problems of increasing concern as the current software stack ages and technologies reach end-of-life.

Figure 3: Functional View of the Target Beamline Software Architecture.

A multi-year programme of work has been identified to manage the architecture upgrade. This consists of many work packages, scheduled in a way that reduces risk, with uncertainties mitigated through the use of early prototypes. Significant items of work include:

• Decomposition of the existing Generic Data Acquisition (GDA) monolithic application into a set of well-defined, stand-alone services. Figure 3 shows services being deployed as and when needed.
• Replacement of Jython scripting (now end-of-life) with a move to Python 3.
• Migration to a new experiment control library – called Bluesky – to simplify scan definition and device configuration, and to enable the move away from Jython (see the sketch at the end of this section).
• Consolidation of the multiple different user interfaces, with the long-term goal of providing a unified web platform for the scientific and technical user interfaces.
• Enhanced security, through the use of Role Based Access Control.
• Use of operating-system-level virtualisation to deliver software in packages called containers, to simplify and improve the management of deployments.
• Improved support for modern highly automated (Fig. 4) and adaptive experiments.

Figure 4: New workflow for automated adaptive experiments.

The architecture documentation is nearing completion and an external review is due in the first half of 2022. Initial prototyping work has begun as part of evaluating software technologies, and a development roadmap has been defined. This work will form the basis of software developments for the new flagship beamlines for Diamond-II and for upgrades to the existing beamlines.
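To give a flavour of the Bluesky style of experiment control referenced in the list above, the minimal sketch below runs a step scan using the simulated motor and detector shipped with the ophyd companion library. It is a generic illustration of the Bluesky idiom, not Diamond's production configuration; real beamline devices would be substituted for the simulated ones.

```python
# Minimal Bluesky scan using the simulated hardware shipped with ophyd.
# This illustrates the general Bluesky idiom, not Diamond's production code.
from bluesky import RunEngine
from bluesky.plans import scan
from bluesky.callbacks.best_effort import BestEffortCallback
from ophyd.sim import motor, det  # simulated motor and detector

RE = RunEngine()
RE.subscribe(BestEffortCallback())  # live table of readings as the scan runs

# Step-scan the simulated motor from -1 to 1 in 11 points, reading 'det'
# at each point. Plans are plain Python generators, so they compose and
# can be inspected or simulated before touching hardware.
RE(scan([det], motor, -1, 1, 11))
```

Because plans are ordinary Python generators, scan definitions written this way can be composed, parameterised and tested independently of the hardware, which is what makes the library attractive as a replacement for the ageing Jython layer.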
Design of New Electron Beam Position Monitor

An example of an early development undertaken for Diamond-II is the design of a new Electron Beam Position Monitor. This is a joint development by the Accelerator Controls Group in SSCC and the Diagnostics Group in the Technical Division. Positional stability of the electron beam in the Diamond-II storage ring determines the stability of the photon beam delivered to the experiment samples. With very small photon beams – as small as 30 µm × 4 µm – and a long lever arm due to the length of the photon beam path, the electron beam for Diamond-II will have to be very stable – 0.9 µm × 0.12 µm – in both the horizontal and vertical planes. This, combined with faster X-ray detectors, places increasing demands on measuring the electron beam position and stabilising it. Increased performance is required from both the Electron Beam Position Monitors (EBPMs), used to measure the beam position, and the Fast Orbit Feedback (FOFB) system, used to correct any measured disturbances (Fig. 5). The feedback update rate for Diamond-II will be increased from 10 kHz to 100 kHz, or one position update every 10 µs, and the processing delay (from beam disturbance to correction of the beam) will be reduced from 700 µs to 100 µs. Achieving this requires a complete redesign of the EBPM and FOFB systems.

The electron beam position is computed by measuring the signal from four RF pickups mounted in the vacuum vessel, in what is called the EBPM button block. For an accurate and stable EBPM measurement it is necessary to measure this signal with exceptionally high stability, taking into account the environmentally driven drift between adjacent measurement channels. Classically, this compensation is done with some form of chopper circuitry, which switches the received signal among the four processing channels. Unfortunately, this introduces significant switching noise that would interfere with the higher feedback rates. A different technique is therefore used: a pilot tone is introduced at a frequency near the button RF signal, and this tone is used as a compensation mechanism.

The signal processing occurs in a series of stages. The signal is first processed electronically in the Diamond-II Analogue Front End (D2AFE), where the pilot tone is injected, the signal is filtered to reject frequencies more than 15 MHz away from the button RF signal, and the signal level is adjusted. The signal is then digitised at a carefully chosen frequency, close to 215 MSamples/s, before being down-converted to the 100 kHz fast feedback rate. The raw button and pilot signal levels at this stage are then converted to X and Y positions (a sketch of this computation follows below) before transmission to the FOFB node for feedback processing. The updated position is transmitted over dedicated Gigabit Ethernet links to a central controller node, where a complex multidimensional controller – based on an Internal Model Controller (IMC) using Generalised Singular Value Decomposition – computes corrector updates. Corrector magnets distributed around the ring are grouped into "fast" and "slow" correctors, with fast correctors acting between 50 Hz and 10 kHz and slow correctors acting below 100 Hz. The central controller computes an update to each fast corrector every 10 µs and to each slow corrector every 300 µs, and transmits these updates over Gigabit Ethernet to each corrector.

Figure 5: Overview of how the electron beam position (EBPM) measurement system, X-ray beam position (XBPM) monitors and corrector magnets are connected through signal processing and the Fast Orbit Feedback controller. The system is synchronised by a global clock decoded in an Event Receiver (EVR), and provides a permit to the global machine protection system as an interlock (ILK).
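The article does not spell out the button-to-position calculation, but the sketch below shows the classic "difference over sum" estimate widely used for four-button pickups, with each channel's gain normalised by its measured pilot-tone level. The button layout, the scale factors and the form of the pilot-tone correction are illustrative assumptions, not the Diamond-II implementation.

```python
# Illustrative button-BPM position estimate (the classic difference-over-sum
# form used for four-button pickups). The button layout, scale factors and
# pilot-tone gain normalisation shown here are generic assumptions for
# illustration, not the Diamond-II implementation.

def bpm_position(buttons, pilots, kx_mm=10.0, ky_mm=10.0):
    """buttons/pilots: amplitudes (a, b, c, d) for a diagonal button block.
    Each button amplitude is first divided by its pilot-tone level, so any
    slow channel-to-channel gain drift cancels out."""
    a, b, c, d = (s / p for s, p in zip(buttons, pilots))
    total = a + b + c + d
    x = kx_mm * ((a + d) - (b + c)) / total   # horizontal offset
    y = ky_mm * ((a + b) - (c + d)) / total   # vertical offset
    return x, y

# Example: a 1% gain drift on one channel is cancelled, because the pilot
# tone passes through the same electronics and sees the same drift.
print(bpm_position((1.01, 1.0, 1.0, 1.0), (1.01, 1.0, 1.0, 1.0)))  # ~(0.0, 0.0)
```

The design choice the pilot tone enables is visible in the example: drift appears identically in the numerator and denominator of each channel's correction, so it drops out without the switching noise a chopper would add.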
Automation of Electron Microscopy Single Particle Analysis Data Processing

The Cryogenic Electron Microscopy (cryo-EM) Single Particle Analysis (SPA) pipeline has undergone considerable work over the last year to provide users with an automated, web-based pipeline. Historically, users had to log into Diamond's systems and interact with the datasets in a way that was far from optimal. The goal is to provide a fully automated, high-throughput cryo-EM service akin to the well-established pipelines available on the photon beamlines for the analysis of macromolecular crystallography data. This transition is a vital step towards the longer-term strategic objectives of improving throughput and ease of use. The automated service proved particularly important during the COVID pandemic, when all user sessions had to be run remotely.

Using a web interface – called SynchWeb – users can start processing their data almost immediately after collection on the microscopes, with the addition of a few parameters. Once initialised, the pipeline runs unsupervised, displaying several key processing results (see Fig. 6) – including motion correction, distortion correction, particle picking, ice thickness estimation and 2D classification – for the users to assess. The results are updated regularly and give users confidence that their data is of the quality required for high-resolution structure determination. The data model is implemented in collaboration with other European facilities, so that the metadata collected and processed is available for wider collaboration across the community. This capability allows our users to work towards adopting FAIR (Findable, Accessible, Interoperable and Reusable) data policies over time.

A unique part of the pipeline is the ice thickness estimation, performed by bespoke in-house software; ice thickness is an important parameter in the determination of high-resolution structures (https://doi.org/10.1016/j.str.2022.01.005). A sketch of one common estimation approach is given at the end of this section. In the coming months, additional functionality will be added to the SPA pipeline to increase the information content available to the users, and performance bottlenecks in the analysis pipelines will be identified to improve the time to solution. The next development will be a second processing pipeline for cryo-tomography data collections. An agreement is already in place for an updated data model so that appropriate pipelines can be developed and optimised to provide our users with a new capability.

Figure 6: The web-based user interface of the data processing pipeline, giving a summary of the results and the status of the motion correction stage of data processing.
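The article does not describe how the bespoke ice thickness software works; one common approach in the cryo-EM literature estimates thickness from beam attenuation via a Beer-Lambert relation. The minimal sketch below illustrates that generic idea only; the mean-free-path value is an assumed literature-style figure for roughly 300 keV electrons in vitreous ice, and none of this represents Diamond's in-house method.

```python
import math

# Minimal Beer-Lambert ice-thickness estimate: t = L * ln(I0 / I), where I0
# is a reference intensity (e.g. measured over a hole in the support film),
# I is the intensity through the ice, and L is the apparent mean free path
# of electrons in vitreous ice. Generic illustration of a common approach,
# NOT Diamond's bespoke software; the constant below is an assumed value.

MEAN_FREE_PATH_NM = 395.0  # assumed apparent mean free path (~300 keV electrons)

def ice_thickness_nm(i_reference: float, i_through_ice: float) -> float:
    """Estimate ice thickness from attenuation of the mean image intensity."""
    return MEAN_FREE_PATH_NM * math.log(i_reference / i_through_ice)

# Example: ice transmitting 80% of the reference intensity is ~88 nm thick.
print(f"{ice_thickness_nm(1.0, 0.8):.0f} nm")
```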
