Using a simple framework for random events in a predictable system, a team of computational biologists has come up with a new method of accurately modeling certain types of gene expression. The new approach, which applies a piecewise deterministic Markov process to gene expression, could inform design principles for synthetic biologists.
According to Yen Ting Lin, an author of the study and a mathematician in the Theoretical Division and Center for Nonlinear Studies at Los Alamos National Laboratory, the team developed a simplifying technique that reduces commonly adopted gene expression models to a mathematical form that is easier to study and simulate than previous models.
Lin said the new model turns out to be a natural language for describing the dynamics of eukaryotic gene expression. Mathematical models of the chemical reactions involved in gene expression often assume that, because some processes occur on much faster timescales, their effects can be averaged out analytically. Recent experiments, however, suggest such fast averaging may not be valid.
Lin said that single-cell experiments reveal that gene expression is “bursty” and random, a feature that emerges from slow switching between promoter states with different activities. Although promoters help to regulate gene expression, how they are triggered and the effect of their kinetics remain elusive. The study revealed that oscillatory dynamics might be more robust than previously thought, since slow promoter kinetics can induce sustained oscillations that fast promoter switching cannot.
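The idea behind a piecewise deterministic Markov process can be illustrated with a minimal two-state ("telegraph") promoter model: the promoter switches randomly between on and off, and between switches the mRNA level evolves deterministically. The parameter values and the exact model form below are illustrative assumptions, not the study's actual equations; slow switching rates produce the bursty trajectories the article describes.

```python
import math
import random

def simulate_pdmp(k_on=0.1, k_off=0.1, b=10.0, d=1.0, t_end=100.0, seed=0):
    """Piecewise deterministic Markov process for a two-state promoter.

    Between random on/off switches, the mRNA level x follows the ODE
        dx/dt = b*s - d*x   (s = 1 when the promoter is on, 0 when off),
    which is solved exactly over each inter-switch interval.
    """
    rng = random.Random(seed)
    t, x, s = 0.0, 0.0, 0          # time, mRNA level, promoter state (off)
    traj = [(t, x)]
    while t < t_end:
        rate = k_on if s == 0 else k_off           # switching propensity
        dt = min(rng.expovariate(rate), t_end - t) # time to next switch
        target = b * s / d                         # fixed point of the ODE
        x = target + (x - target) * math.exp(-d * dt)  # exact ODE solution
        t += dt
        s = 1 - s                                  # flip promoter state
        traj.append((t, x))
    return traj

traj = simulate_pdmp()  # small k_on/k_off = slow switching = bursty output
```

With the switching rates much smaller than the degradation rate `d`, each "on" interval produces a burst of mRNA that then decays, which is the regime the study argues cannot be captured by fast averaging.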
Peptidic natural products (PNPs) are groupings of amino acids that can make effective antibiotics such as vancomycin and penicillin. Produced by microbes, these chemical compounds can kill off competing organisms. They exist everywhere, including in the human body and the soil.
The VarQuest algorithm, created by researchers at the University of California San Diego, Saint Petersburg State University in Russia and Carnegie Mellon University, gives scientists a way to quickly find previously undiscovered PNPs.
According to Hosein Mohimani, an assistant professor of computational biology in CMU’s School of Computer Science, researchers have been looking for new PNPs since the time of Alexander Fleming, the scientist who made the historic discovery of penicillin almost by accident.
Mohimani said that over the last decade, researchers have mostly been using an instrument known as a mass spectrometer to find PNPs within potential samples of material. The process involves feeding a sample into the instrument, which then sorts the chemical compounds present by measuring them very precisely.
Mohimani added that formerly, it would have taken a computer centuries to search through all of the existing data points in the world using “brute force,” a strategy that checks each data point, one by one, to determine whether it is a PNP.
Unlike brute-force search, VarQuest can extract this same information in a few hours. The algorithm sorts the information in databases in a way that makes it easy to locate “fingerprints,” patterns that researchers have already linked to PNPs.
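The speedup described above comes from indexing, not from checking faster. The sketch below is a toy illustration of the general idea, not VarQuest itself: the fingerprint function, the mass lists, and the compound names are all invented for the example. Known compounds are hashed by a "fingerprint" once, so a query becomes a dictionary lookup instead of a scan over every known compound.

```python
def fingerprint(masses, tol=0.01):
    """Hypothetical fingerprint: pairwise mass differences, rounded to a
    tolerance so nearly identical spectra hash to the same key."""
    diffs = sorted(round(abs(a - b) / tol)
                   for i, a in enumerate(masses)
                   for b in masses[i + 1:])
    return tuple(diffs)

def build_index(known_pnps):
    """Map fingerprint -> list of compound names (built once, up front)."""
    index = {}
    for name, masses in known_pnps.items():
        index.setdefault(fingerprint(masses), []).append(name)
    return index

# Invented example data: two "known" compounds and one query spectrum.
known = {"compoundA": [100.0, 150.0, 225.0], "compoundB": [99.0, 160.0]}
index = build_index(known)

query = [100.0, 150.0, 225.0]
matches = index.get(fingerprint(query), [])  # O(1) lookup vs. a full scan
```

The brute-force alternative would compare the query against every database entry on every search; precomputing the index moves that cost to a one-time build step.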
When treating a patient suffering from a heart attack, time is of the essence. Cardiac surgeons try to quickly stabilize the heart using reperfusion, a technique that returns oxygen to the heart by unblocking blood vessels with stents and balloons. Although reperfusion can stabilize the heart, such a fast infusion of oxygen can injure severely depleted areas of the heart.
Last year, McDougal developed a model that predicts the response of a single heart cell to dwindling supplies of oxygen. The model evaluates the ability of a cell to keep producing Adenosine 5′-triphosphate (ATP) — a cell’s primary fuel source. The model is the first step in determining whether reperfusion techniques will aid or further damage a depleted heart.
The findings of the study were published last year in the Journal of Biological Chemistry. Forbes Dewey, McDougal’s advisor and co-author, is an emeritus professor of biological engineering and mechanical engineering.
McDougal and Dewey focused on modeling the effect of decreasing oxygen supplies on the biochemical reactions that produce ATP in a heart cell. The team discovered that even though oxygen may be very limited, cardiac cells seem to dig deep into their energy stores to sustain ATP levels and keep themselves alive.
Eventually, however, as oxygen depletes, even the backup reserves shut down. Consequently, ATP levels crash, a point of no return for an exhausted cell. Intriguingly, McDougal observed an in-between stage, in which a heart cell’s ATP levels drop but have not yet crashed.
A new cancer model may explain how a common type of early-stage breast cancer, ductal carcinoma in situ (DCIS), progresses to a more invasive type of cancer, say scientists at The University of Texas MD Anderson Cancer Center.
The research offers new insight into how ductal carcinoma in situ leads to invasive ductal carcinoma (IDC), and provides a comprehensive picture of why some of these cancers go undetected. The findings were published on January 4th in the online issue of Cell. To make the discovery possible, the researchers used a new analytical tool known as topographic single-cell sequencing (TSCS).
According to Nicholas Navin, Ph.D., associate professor of Genetics, although DCIS is the most common type of early-stage breast cancer and is most often detected during mammography, at least 10 percent of cases progress to invasive ductal carcinoma. Exactly how DCIS invasion happens is still poorly understood because of several technical challenges in tissue analysis.
Navin’s team discovered that genome evolution happens in the ducts before clones of cancer cells spread by “breaking through” a thin tissue layer called the basement membrane. The researchers found that many cancer cell clones move from the ducts into adjacent areas to form invasive tumors.
To arrive at their findings, the researchers used exome sequencing and applied topographic single cell sequencing to 1,293 single cells. The cells came from 10 patients with both IDC and DCIS.
Last year, Research and Markets announced that the global computational biology market is expected to grow at a compound annual growth rate (CAGR) of around 21.7 percent over the next decade, reaching about $11.43 billion by 2025. The fastest-growing region is the Asia Pacific, which is expected to see the largest growth in the computational biology market, with a CAGR of about 29 percent during the forecast period.
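A CAGR projection is simple compound growth, so the figures above imply a base-year market size that can be backed out. The report's exact base year is not stated, so the 10-year horizon assumed below is illustrative.

```python
def project(base_value, cagr, years):
    """Future value under compound annual growth."""
    return base_value * (1 + cagr) ** years

def implied_base(final_value, cagr, years):
    """Back out the starting value a CAGR projection implies."""
    return final_value / (1 + cagr) ** years

# Assuming a 10-year horizon ending in 2025, $11.43B at a 21.7% CAGR
# implies a starting market of roughly $1.6B (in billions of dollars).
base = implied_base(11.43, 0.217, 10)
```

The same `project` function reproduces the headline figure when run forward from the implied base, which is a quick sanity check on any reported CAGR.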
Some of the leading companies that influence the global computational biology market are Chemical Computing Group Inc. (Canada), Certara (U.S.), Compugen Ltd. (Israel), Insilico Biotechnology AG (Germany), and others. Joint ventures, product launches, mergers, and acquisitions are among the important strategies adopted by the top market players to gain a competitive advantage.
North America is the largest market for computational biology. Major factors boosting the market’s growth in the region are growing innovation and investment in research and development for disease modeling, drug discovery and the modernization of current biological computation methods.
The largest applications of computational biology are in research, development, and academia. Computational biology features in several ongoing R&D projects for drug discovery, disease modeling and anatomical visualization at European and North American academic institutions, such as the computational systems biology of cancer program at Institut Curie in France, the regulation of metabolic networks at EMBL and computational biology R&D at Harvard Medical School.
Individual tumors respond differently to cancer therapy. Until now, it has remained a mystery why tumors react differently to the exact same therapy. However, a new study at the USC Viterbi School of Engineering has found that tumor growth properties influence responses to cancer drugs.
According to Stacey Finley, a co-author of the study and USC assistant professor of Biomedical Engineering, identifying a quantity or measurement that predicts how a tumor responds, known as a predictive biomarker, is valuable to cancer research. Finley is also a faculty member of the new USC Michelson Center for Convergent Bioscience.
Tumors exploit a biological process known as angiogenesis, the formation of new blood vessels from pre-existing ones. To multiply and grow, tumors draw nutrients supplied by this new vasculature. However, tumor growth slows if proteins such as vascular endothelial growth factor (VEGF), an angiogenesis promoter, are blocked.
The researchers used a computational model of tumor-bearing mice to examine the response to VEGF-inhibiting treatments and how this reaction is affected by the development of a tumor. The model showed that certain tumor growth properties help predict whether drug therapy can thwart tumor expansion.
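The logic of such an experiment can be sketched with a deliberately simple stand-in for the study's model: logistic tumor growth in which anti-VEGF treatment is assumed to reduce the effective growth rate by limiting the vascular supply. All parameter values and the logistic form itself are illustrative assumptions, not the published model.

```python
def tumor_volume(days, growth_rate=0.2, carrying_capacity=2.0,
                 vegf_inhibition=0.0, v0=0.05, dt=0.01):
    """Toy logistic tumor growth. vegf_inhibition in [0, 1] scales down
    the effective growth rate to mimic anti-angiogenic treatment."""
    k = growth_rate * (1.0 - vegf_inhibition)
    v, t = v0, 0.0
    while t < days:                                   # forward Euler steps
        v += k * v * (1.0 - v / carrying_capacity) * dt
        t += dt
    return v

untreated = tumor_volume(30)                      # no treatment
treated = tumor_volume(30, vegf_inhibition=0.6)   # VEGF partially blocked
```

Running the same model across a range of growth rates and carrying capacities is the kind of sweep that lets one ask which growth properties predict treatment response.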
Researchers from Stony Brook University, Carnegie Mellon University, and Dana-Farber Cancer Institute have developed a new computational method that makes the analysis of gene expression more accurate. Gene expression analysis is increasingly used to monitor and diagnose cancers, and is a mainstay of basic biological research.
The researchers said their method, known as Salmon, can eliminate technical errors that are known to occur during RNA-seq, or RNA sequencing, the leading method for assessing gene expression. In addition, it operates at higher speed, a critical factor as these tests become more numerous.
The report was published online March 6 by the journal Nature Methods. According to Carl Kingsford, an associate professor in CMU’s Computational Biology Department, the Salmon method is freely available online. The method has already been downloaded by thousands of users.
Salmon offers a richer model of the RNA sequencing experiment and of the biases likely to occur during sequencing. This is important because the technique is used for categorizing diseases and understanding changes in gene expression when tracking the progression of cancer.
The researchers named the method Salmon after the fish famous for swimming upstream, because it uses an algorithm that estimates the effects of biases and gene expression levels as experimental data streams by.
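The "streams by" idea can be illustrated with a much-simplified online estimator, in the spirit of streaming RNA-seq quantification but not Salmon's actual model: each read maps ambiguously to several transcripts, responsibilities are computed from the current abundance estimates, and the estimates are nudged toward each read's assignment as it arrives, so no second pass over the data is needed.

```python
def streaming_abundance(read_mappings, n_transcripts, lr=0.05):
    """Online (stochastic-EM-style) abundance estimation.

    read_mappings: for each read, the list of transcript indices it
    maps to. lr is the learning rate of the online update.
    """
    eta = [1.0 / n_transcripts] * n_transcripts   # abundance estimates
    for mapped in read_mappings:                  # one read at a time
        total = sum(eta[t] for t in mapped)
        # responsibility: fraction of this read credited to each transcript
        resp = {t: eta[t] / total for t in mapped}
        # move abundances a small step toward this read's assignment
        for t in range(n_transcripts):
            eta[t] = (1 - lr) * eta[t] + lr * resp.get(t, 0.0)
    return eta

# Invented data: transcript 0 gets twice as many reads as transcript 1,
# transcript 2 gets none.
reads = [[0], [0], [1]] * 30
est = streaming_abundance(reads, n_transcripts=3)
```

Because each update is a convex combination of the current estimates and a distribution over one read, the abundances always stay normalized, and an unobserved transcript's estimate decays toward zero.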
Early this year, a team of researchers from Northwestern University developed a computational model that performs at human levels on a standard intelligence test. The work is a significant step toward building AI systems that understand the world as humans do.
According to Northwestern Engineering’s Ken Forbus, the model performed in the 75th percentile for adults. Problems that are difficult for humans are also difficult for the model, providing additional evidence that the model’s operation captures some important characteristics of human cognition.
The computational model is built on CogSketch, an AI platform previously created in Forbus’ lab. The platform solves visual problems and understands sketches, giving immediate, interactive feedback. It also incorporates a computational model of analogy based on Professor Dedre Gentner’s structure-mapping theory.
The model was developed by Forbus, Walter P. Murphy Professor of Electrical Engineering and Computer Science at Northwestern’s McCormick School of Engineering, and Andrew Lovett, a former Northwestern postdoctoral researcher in psychology. The research was published in the journal Psychological Review.
One of the hallmarks of human intelligence is the ability to solve visual problems. Developing AI systems with this ability not only potentially shrinks the gap between human and computer cognition, but also provides new evidence for the importance of analogy and symbolic representations in visual reasoning.
Despite social conventions and artificial lighting, the dynamics of daylight still guide people’s daily activities in urban environments, according to a recent study published in PLOS Computational Biology.
Humans, like many other organisms, have an internal biological clock. The clock helps them to adapt to various environmental cues, such as darkness and light. People are also guided by a social clock of daily activities including schooling, leisure, and work.
Led by Daniel Monsivais, researchers from the Aalto University School of Science, Finland, investigated how daily human activities are influenced by biological and social clocks. The researchers employed a technique called “reality mining,” in which human activity patterns are inferred by analyzing the use of wireless devices.
The researchers analyzed anonymized call records spanning 12 months for about one million mobile phone users. A user’s wake/sleep cycle was inferred from the daily times at which their calling activity commenced and ceased.
After the analysis of call records, the researchers discovered that sunset and sunrise still guide the start and end of people’s daily activities. Over a period of one year, changes in daily routines corresponded to seasonal changes in the timing of sunrise and sunset.
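The inference step described above can be sketched very simply: for each day, the first and last call bracket the period the user was demonstrably active. This is a toy version of the approach with invented timestamps; the study's actual pipeline over a million users is far more involved.

```python
from collections import defaultdict

def daily_activity_window(call_times):
    """Infer per-day activity windows from call timestamps.

    call_times: list of (day, hour_of_day) tuples for one user.
    Returns {day: (first_call_hour, last_call_hour)}.
    """
    by_day = defaultdict(list)
    for day, hour in call_times:
        by_day[day].append(hour)
    return {day: (min(hours), max(hours)) for day, hours in by_day.items()}

# Invented example: two days of calls for a single user.
calls = [(1, 8.5), (1, 13.0), (1, 22.25), (2, 7.75), (2, 21.0)]
windows = daily_activity_window(calls)
```

Aggregating these windows across users and across the year is what lets the start and end of daily activity be compared against the local times of sunrise and sunset.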
Millions of people worldwide experience the sudden, recurrent seizures that are epilepsy’s hallmark. Treatment with surgery or medication does not work for all patients, so researchers have been examining a potential alternative known as cooling. This method involves implanting a device in the brain to subdue the electrical signals that characterize epileptic seizures.
Using computer simulation techniques, researchers have recently gained insight into the mechanism by which decreasing the temperature of certain brain areas could treat epileptic seizures. The findings were published in PLOS Computational Biology.
In the new study, researchers at the Nara Institute of Science and Technology (NAIST), Japan, sought to understand the mechanism by which focal cooling works. So far the technique has been tested in epilepsy patients only provisionally, in intraoperative studies. And although it has shown consistent success in rats, focal cooling sometimes increases the frequency of epileptic discharges.
To examine how focal cooling subdues epileptic discharges, the researchers took a computational approach. They used a model of the rat brain that enabled them to simulate various mechanisms underlying the effects of a focal cooling device.
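One standard ingredient in such temperature-dependent neural models is the Q10 rule, under which biochemical reaction rates slow by a fixed factor for every 10 degrees Celsius of cooling. The sketch below uses that rule with an invented proportionality between an excitatory rate constant and discharge frequency; it is a stand-in for the idea, not the study's equations.

```python
def q10_factor(temp_c, q10=2.5, ref_temp=37.0):
    """Q10 temperature scaling: rates slow by a factor of q10 for every
    10 degrees C of cooling below the reference temperature."""
    return q10 ** ((temp_c - ref_temp) / 10.0)

def discharge_frequency(temp_c, base_freq=10.0):
    """Hypothetical readout: epileptic discharge frequency assumed
    proportional to the temperature-scaled excitatory rate constant."""
    return base_freq * q10_factor(temp_c)

normal = discharge_frequency(37.0)   # body temperature
cooled = discharge_frequency(27.0)   # focally cooled by 10 degrees C
```

In a full network model, cooling scales several rate constants at once, and their interaction is what can produce the counterintuitive cases where discharges increase.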
Further laboratory studies and investigation could help the scientists improve their model and better understand the mechanisms that underpin focal cooling.