Our client is a multinational healthcare company that needed to improve operational efficiency in its lab. Until then, the lab relied on a few legacy systems that required manual intervention to visualize the lab workflow. To automate this process, they sought help from the Google Cloud experts at Persistent.
Visualizing the lab workflow was time-consuming
The client was struggling to run experiments on their existing infrastructure. The process was painfully slow yet expensive, and reanalyzing an experiment was ineffective: a single experiment took two weeks and 20 scientists. They wanted to make the entire experimentation process more efficient and less expensive.
Having worked with us in the past, the client had a good understanding of our delivery expertise. For this solution as well, they turned to Persistent’s Google Cloud team.
Automating lab workflow management to improve operational efficiency
The need for improving the overall efficiency of the client’s processes involved automating the lab workflow management. We chalked out a detailed plan to help them overcome their challenges and reduce the overall cost of operations.
Our five-person on-shore and 20-person off-shore team built a platform with a dashboard showing the live status of every stage of each experiment. We also applied data-warehousing techniques to manage all experimental data and held regular quarterly business reviews (QBRs) to track progress and stay aligned with customer expectations.
To successfully implement the solution, Persistent’s team utilized some of the best Google Cloud Platform tools:
- GCP Foundation
  - Organization setup, projects, billing, identity, etc.
  - Google SSO for authentication and authorization
- Connection to source and ingestion to GCP
  - Automated connecting to, and fetching real-time data from, the sequencer pipelines into GCP
- Cloud Storage
  - Storage for the huge volumes of unstructured raw data coming from the pipelines
- Compute Engine
  - On-demand data processing
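As an illustration of the ingestion pattern described above (not the client’s actual code), here is a minimal stdlib-only Python sketch: a poller that discovers newly completed sequencer output files, which in a real pipeline would then be uploaded to a Cloud Storage bucket via the `google-cloud-storage` client. The function name `find_new_runs`, the `.fastq` extension, and the bucket name in the comment are illustrative assumptions.

```python
from pathlib import Path

def find_new_runs(output_dir: str, seen: set) -> list:
    """Return sequencer run files not yet ingested.

    `seen` holds file names already uploaded. In production this state
    would live in a database, or be inferred by listing the Cloud
    Storage bucket, rather than held in memory.
    """
    new_files = []
    for path in sorted(Path(output_dir).glob("*.fastq")):
        if path.name not in seen:
            new_files.append(path)
            seen.add(path.name)
    return new_files

# In a real pipeline, each new file would then be streamed to a bucket,
# e.g. with the google-cloud-storage client (illustrative only):
#   storage.Client().bucket("lab-raw-data").blob(path.name) \
#          .upload_from_filename(str(path))
```

Once the raw files land in Cloud Storage, on-demand Compute Engine instances can pick them up for processing, which keeps compute costs proportional to actual experiment volume.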
Reducing costs by leveraging Cloud Storage and BigQuery
After the solution was implemented, the client gained insight into real-time experiment status and used it to analyze genetic data in real time. The solution also helped the client reduce the analysis cost per experiment by around 60%.
Moreover, the client saw a marked improvement in decision-making and can now run 10,000 experiments per week, up from 80 experiments per week before.
With Persistent’s on-demand data processing, the client optimized their costs by learning to use Cloud Storage and BigQuery effectively. According to the client, we are their most steady and reliable partner, and they have since engaged us on a number of other applications, including Deep Learning, Chip Tracker, and Web DVT.
The client’s product department is now replicating our application in their sequencer, taking it from Diagnostics to a platform they can productize for pipeline visualization.