
Automation and AI add to labs like never before

The advantage of automation is that instruments are more powerful, can aggregate data more quickly, and can find analytical insights that otherwise would not be found

Throughout 2021, the healthcare industry saw an acceleration of digital change, while the pandemic pushed the pharmaceutical industry to embrace collaboration. Both of these developments will be critical to success in emerging markets during the coming 12 months. Many major advances will be made possible through cross-company and cross-discipline collaboration, as well as exchange of pre-competitive data.

Driven by macro geopolitical trends and Big Tech, emerging technologies are being developed increasingly rapidly. Reflecting this, the industry is set to see a rise in deal-making and activity – particularly in quantum computing, AI and machine learning, and autonomous laboratories.

To thrive in 2022, major tech players will increase their focus on life sciences and play an important role in developing new products and initiatives that will ultimately benefit patients. R&D efficiency is on an exponential growth path as more pharma and biotech organizations partner with AI and robotics companies, enabling a more automated drug-discovery process.

Lab automation has moved beyond liquid handling robotics to include a variety of technologies that help lab technicians do more, and do so more effectively. Smart instruments are behind this, incorporating multiple features, storing data in the cloud, and sometimes sharing that data with other instruments downstream. Often, cloud-based analysis is performed automatically so that lab professionals see the results and their interpretation simultaneously.

These capabilities are iterative. As labs adopt cloud computing, the Internet of Things (IoT) becomes more useful and more prevalent. Artificial intelligence (AI) and its subset, machine learning (ML), become more practical as more data is collected and stored in widely accessible compute clouds. In fact, AI and ML are leading trends in life-science labs today.

The advantage of automation is that instruments can be more powerful, aggregate data more quickly, and find analytical insights that would otherwise be missed. Using such devices as part of a bigger solution – sometimes even utilizing the IoT, which places sensors at the edge of the network – empowers labs to solve problems they could not otherwise solve.

For example, taking the output from one device as the input for the next – using the laboratory information management system (LIMS) to track patient data, pooling that data in the cloud, and applying machine learning to detect patterns – yields insights in a few hours that would otherwise take weeks of manual work.
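
As a rough illustration, such a pipeline might look like the Python sketch below; the file name, column names, and three-group clustering are hypothetical stand-ins rather than a real LIMS integration.

```python
# Hypothetical sketch of the device-to-cloud-to-ML pipeline described above.
# The export file and column names are placeholders, not a real LIMS schema.
import pandas as pd
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import KMeans

# Records exported from the LIMS and pooled in cloud storage (placeholder file).
records = pd.read_csv("pooled_lims_export.csv")
features = records[["assay_value", "turnaround_hours", "sample_age_days"]]

# Normalize, then look for groupings that manual review would take weeks to find.
scaled = StandardScaler().fit_transform(features)
records["pattern_group"] = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(scaled)

print(records.groupby("pattern_group").size())
```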

AI is becoming an important element in laboratory analysis, yet there is a misconception around automation and AI. It is not necessary to have both. They are not dependent on each other.

Automation is for efficiency. It is used to streamline processes or to transfer data and make it more widely available from multiple locations or devices, whether for your own team or for collaborating teams. In contrast, AI uses the data collected from instruments to identify patterns, such as analyzing video to understand exactly how flies beat their wings during flight or counting neuromuscular junctions in mice. ML takes this a step further by making predictions (such as predicting a disease prognosis based on certain conditions). Singly or combined, these technologies can eliminate a lot of tedious, manual work.
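
To make the prediction step concrete, here is a minimal scikit-learn sketch on synthetic data; the three features and the prognosis label are purely illustrative and not drawn from any study.

```python
# Toy sketch of ML-based prediction: a classifier estimating a prognosis
# from tabular features. All data below is synthetic and illustrative.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 3))                   # e.g. age, biomarker level, dose
y = (X[:, 1] + 0.5 * X[:, 0] > 0).astype(int)   # synthetic "poor prognosis" label

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = LogisticRegression().fit(X_train, y_train)
print("held-out accuracy:", model.score(X_test, y_test))
```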

Many lab managers think AI has a steep learning curve, so there is inertia, particularly in small and medium-sized labs. Lab managers also have to gauge whether the cost and time savings are enough to justify adding AI to the lab. For many, the answer is maybe not.

Yet automation and AI can add a lot to labs that need them. The development of fully automated next-generation sequencing during the Covid-19 pandemic is one example. Clear Labs' whole-genome sequencing instrument combines sequencing, robotics, and cloud-based analytics. It was initially developed for food safety, screening for salmonella and listeria in poultry, but the company redirected the technology to diagnostics when the pandemic hit. With runs of 32 to 64 genomes, it is a good fit for small- to medium-sized public health labs. Automating genome sequencing streamlined the process, and sending results to the cloud made it easy for labs to notify patients of a positive Covid-19 result the next day, to track known and emerging viral variants quickly, and to better contain transmission of the virus.

In contrast, legacy next-generation sequencing workflows were manual and tedious: public health labs needed two to three days to do the prep work and start the sequencing, and another four to 10 days to identify the variants.

Another AI-enabled diagnostic analyzes breath tests. A patient breathes into a bag, the contents are analyzed using mass spectrometry, and AI interprets the results in less than a minute with PCR-like accuracy. It is akin to computer vision: to a human, the data looks like static, so the only way to interpret it is with AI.
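
The actual breath-test pipeline is not detailed here, but a toy sketch can show why a learned model succeeds where the eye fails: a faint signature spread across thousands of noisy intensity bins is invisible to a human yet statistically separable. All data below is synthetic.

```python
# Synthetic illustration: classifying noisy, mass-spectrometry-like vectors
# in which a faint "disease" signature is buried. Not the real product's pipeline.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
n_samples, n_bins = 400, 2000                 # 2,000 intensity bins per breath sample
X = rng.normal(size=(n_samples, n_bins))      # pure noise: looks like static
y = rng.integers(0, 2, size=n_samples)
X[y == 1, 100:110] += 0.6                     # faint signature in a few bins

clf = RandomForestClassifier(n_estimators=100, random_state=0)
print(cross_val_score(clf, X, y, cv=5).mean())  # above chance despite the noise
```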

Advancing laboratory thinking with cloud technologies

Organizations in the life-science sector have been gradually progressing their digital transformation journeys over the past decade, but the Covid-19 pandemic sparked a period of rapid change as companies explored and implemented new digital technologies to future-proof their businesses.

With a shift to more remote working practices, technologies like video conferencing, remote access, and virtual private networks (VPNs) have leapt to the forefront of company strategies to ensure successful business practices.

One key element of many companies' digitization strategies is cloud-based data storage and access. These solutions promise improved scalability and flexibility in data management, allowing companies across a range of sizes and applications to tailor their cloud set-up to individual needs.

However, implementing these systems in laboratories can be challenging, and deployments often fall short of their significant potential to connect the different software and hardware systems that scientists use daily.

So, what are the different cloud-deployment strategies and services available to organizations, and how can companies future-proof the digital connectivity of laboratories by building well-architected cloud networks?

When companies are looking to develop a solid cloud infrastructure, it is important to focus on how the system can be integrated into their particular organizational structure and what they want to achieve. Essentially, cloud infrastructures provide flexible solutions to scale IT systems in a way that matches a company’s growth strategy, ultimately saving costs and providing greater operational efficiencies.

Cloud systems drive a reduction in costs, primarily by shifting from a CapEx-based IT infrastructure to an OpEx model. An OpEx strategy allows companies to avoid the high costs of buying expensive hardware, facilities, and IT expertise that often become redundant after several years due to new technological advances. OpEx-based leasing allows companies to purchase only the amount of storage and computing capability they need, providing the flexibility to adapt to their specific requirements at a given time, without the added complexity of maintaining and replacing hardware every few years.
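
As a back-of-the-envelope illustration of that trade-off, consider the toy comparison below; every figure is invented for the example and is not a benchmark.

```python
# Toy CapEx-vs-OpEx comparison. All figures are made up for illustration;
# the point is the shape of the trade-off, not the numbers themselves.
YEARS = 5
capex_hardware = 250_000        # upfront servers, storage, and facilities
capex_refresh = 250_000         # hypothetical hardware refresh in year four
capex_staff_per_year = 80_000   # in-house IT expertise

opex_per_month = 6_000          # leased storage/compute, scaled to actual need

capex_total = capex_hardware + capex_refresh + capex_staff_per_year * YEARS
opex_total = opex_per_month * 12 * YEARS
print(f"CapEx model over {YEARS} years: ${capex_total:,}")   # $900,000
print(f"OpEx model over {YEARS} years:  ${opex_total:,}")    # $360,000
```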

In terms of cloud adoption, companies can avoid one-size-fits-all technologies and cherry-pick the exact services they need to progress their digital journey, whether by connecting directly to a single cloud, working across multiple clouds, or taking advantage of a multitude of services, including laboratory integration, data analysis, and business-intelligence technologies. In essence, this allows companies to adapt their cloud systems in a manner that reflects their unique business models.

The flexibility of cloud systems comes from the availability of different services that offer varying degrees of cloud-based infrastructure. There are three primary service models – Infrastructure as a Service (IaaS), Platform as a Service (PaaS), and Software as a Service (SaaS). These models involve progressively more of the cloud, ranging from cloud-based servers and data storage combined with in-house application development under IaaS and PaaS, all the way to fully cloud-based packages with SaaS.

With IaaS, a customer has access to a catalog of components, such as servers, routers, and gateways, and can choose how they want their IT systems configured. IaaS systems, for example, allow simple VPN access to secure, private data storage, but the customer is in charge of which hardware and operating systems they purchase. PaaS takes this a step further by pre-loading operating systems and additional middleware onto the computing systems supplied to the laboratory, facilitating the integration of any applications the customer needs.
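
As a concrete illustration of the IaaS level, here is a minimal sketch using AWS's boto3 SDK: the customer picks the "hardware" (instance type) and operating system (machine image) themselves. The image ID is a placeholder, and configured credentials and a default region are assumed.

```python
# Minimal IaaS sketch with boto3: the lab chooses the instance type ("hardware")
# and the machine image (operating system). The AMI ID below is a placeholder.
import boto3

ec2 = boto3.resource("ec2")  # assumes credentials and a region are configured
instances = ec2.create_instances(
    ImageId="ami-0123456789abcdef0",  # placeholder OS image chosen by the lab
    InstanceType="t3.large",          # compute tier chosen by the lab
    MinCount=1,
    MaxCount=1,
)
print("launched:", instances[0].id)
```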

With SaaS, the cloud is typically multi-tenant and upgrades are rolled out across all users by the cloud provider.

However, SaaS models also allow varying degrees of customization and configuration of cloud packages. For example, managed SaaS models are typically chosen by regulated customers in GxP environments who require validation testing for their software. Managed SaaS allows a high level of control over configurations and adherence to strict security policies that encrypt and separate data.

Along with different cloud service options, there are also different strategies for deploying these systems in the workplace. Public cloud infrastructures, like AWS, Dropbox, and Google, have been around for years and are often the default choice for many companies. They are accessible via the internet and are shared among organizations. Providers also offer single-tenant virtual private clouds, which allow users to customize the security policies they require.

Private clouds, in contrast, are on-premises solutions dedicated solely to one organization. Public systems are often the most cost-effective method for data storage and access, but they can have limitations in terms of customization and compliance with specific regulations on data security, particularly when dealing with sensitive data like patient health records.

Many companies use hybrid strategies where they maintain more sensitive datasets within their in-house systems, and other datasets within a public or virtual private cloud.

Increasingly, companies are not limited to a single public cloud provider; more customers are leveraging different providers for different cloud services and integrating them all within a broader multi-cloud strategy. This is often the case when large businesses have sites in multiple countries: they gain the flexibility to retain data within a specific geographic zone to meet compliance requirements for that region, or to integrate across zones to meet the needs of their global offices.
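
A minimal sketch of the data-residency side of such a strategy, again using AWS's boto3 SDK; the bucket names and regions are placeholders, and configured credentials are assumed.

```python
# Sketch of region pinning for compliance: one bucket per geographic zone,
# so each site's data stays in its own region. Names are placeholders.
import boto3

def regional_bucket(name: str, region: str) -> None:
    s3 = boto3.client("s3", region_name=region)
    s3.create_bucket(
        Bucket=name,
        CreateBucketConfiguration={"LocationConstraint": region},
    )

regional_bucket("lab-data-eu", "eu-central-1")      # EU records stay in the EU
regional_bucket("lab-data-apac", "ap-southeast-1")  # APAC records stay in APAC
```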

Overall, these strategies give organizations further flexibility to tailor their cloud service packages, depending on factors such as global reach or the need for compliance with strict data security and privacy laws.

Automation in the form of AI or ML is entering many life-science labs in a variety of ways. Researchers at Virginia Tech, for example, are designing an ML algorithm to predict the mechanics of living cells. Predicting the behavior of shape-shifting objects like cells, which change in response to their environment, has been particularly challenging. Their work in physics-guided machine learning aims to systematically integrate the mechanics of cell motion, encoded as biological rules and physics-based model outputs, into models that predict the movement of cells or other shape-shifting objects in dynamic physical environments.
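
The Virginia Tech work is described only at a high level here, but physics-guided learning commonly combines a standard data-fitting loss with a penalty for violating a physics-based model. A generic PyTorch sketch of that pattern, not their actual code, might look like this:

```python
# Generic physics-guided loss: a data term plus a penalty for disagreeing
# with a physics-based estimate of the motion. Shapes and names are illustrative.
import torch
import torch.nn as nn

net = nn.Sequential(nn.Linear(3, 64), nn.Tanh(), nn.Linear(64, 2))  # (x, y, t) -> velocity

def loss_fn(inputs, observed_v, physics_v, lam=0.1):
    pred_v = net(inputs)
    data_loss = ((pred_v - observed_v) ** 2).mean()      # fit the measurements
    physics_loss = ((pred_v - physics_v) ** 2).mean()    # stay near the physics model
    return data_loss + lam * physics_loss

# Tiny smoke test with random tensors standing in for real tracking data.
x = torch.rand(8, 3)
print(loss_fn(x, torch.rand(8, 2), torch.rand(8, 2)))
```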

Computational scientists at Carnegie Mellon University developed an ML algorithm to understand the intricacies of genome folding in the cell nucleus and how that folding affects gene expression.

Analyzing microscope slides may be more mundane, but automating it can save labs significant work. Once the stained slides are digitized, AI can perform the analysis, and the benefits are high throughput and greater accuracy.

A similar concept is applied to reviewing filmed experiments. Researchers at Case Western Reserve University are using ML to track the movement of flies' wings to determine how the positioning of their halteres – mechanosensory organs that flap out of synchronization with the wings – affects flight.

The ML algorithm measures the positions of the wings and halteres, and their angles, during flight, learning to recognize them as their perspective changes. This application frees researchers from watching and manually annotating tens of thousands of frames of flies in flight.
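
Once a tracker has produced per-frame keypoints, the angle measurement itself is simple geometry. A minimal NumPy sketch, with illustrative coordinates standing in for real tracker output:

```python
# Compute a wing's stroke angle per frame from tracked keypoints.
# The hinge and tip coordinates below are illustrative stand-ins.
import numpy as np

hinge = np.array([[0.0, 0.0], [0.0, 0.0], [0.0, 0.0]])  # wing hinge, (n_frames, 2)
tip = np.array([[1.0, 0.2], [0.9, 0.5], [0.7, 0.8]])    # wing tip, (n_frames, 2)

vec = tip - hinge
wing_angle = np.degrees(np.arctan2(vec[:, 1], vec[:, 0]))
print(wing_angle)  # one stroke angle per frame, across thousands of frames
```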

Automation options are increasing beyond liquid handling to include the processing and analysis of massive quantities of data, and AI and ML are increasingly helping scientists extract value that otherwise could not be achieved quickly or, often, at all.

Importantly, these trends are not just for large, complex labs. There are now instruments and applications available that allow even small and medium-sized labs to benefit.

There is no doubt that, despite the global challenges posed by the pandemic, the thriving life-sciences community will continue to move forward. Automation will get a boost from industry's increasing willingness to collaborate to innovate.
