Survivor Drones

Courtesy: UNDP Moldova

Civilian drones today are flown only under very specific, safe weather and wind conditions, because they lack adaptive control algorithms. This applies to remote-controlled and autonomous drones alike. The direct effect of varying wind conditions on aircraft dynamics, performance, and stability cannot be accurately captured by a simple mathematical model. Engineers at Caltech have developed Neural-Fly, a deep-learning method that helps drones cope with new and unknown wind conditions in real time. Existing machine learning methods require huge amounts of data to train a model, and adapting such large models in real time is all but impossible. To overcome this, Neural-Fly uses a "separation strategy": only a few parameters of the neural network are updated in real time, made possible by a new meta-learning algorithm that pre-trains the network so that only those few parameters need to change during flight.
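Here is a minimal, illustrative sketch of that "separation strategy" in action. The feature network below is a random fixed nonlinear map standing in for Neural-Fly's meta-learned one, the wind model is invented, and the adaptation law is a plain gradient step; only the small linear coefficient vector `a` changes online, just as only a few parameters adapt in flight:

```python
import numpy as np

rng = np.random.default_rng(0)

W = rng.normal(size=(16, 3))     # frozen "pretrained" weights (stand-in)

def phi(x):
    """Fixed nonlinear features of the drone state x (never updated online)."""
    return np.tanh(W @ x)

def true_wind_force(x):
    """Unknown wind effect the drone must learn (made up for illustration)."""
    return 0.5 * np.sin(x[0]) + 0.3 * x[1]

a = np.zeros(16)                 # ONLY these parameters adapt in real time
lr = 0.1
errors = []
for t in range(500):
    x = rng.normal(size=3)       # current drone state (simulated)
    f = true_wind_force(x)
    f_hat = phi(x) @ a           # predicted residual aerodynamic force
    e = f_hat - f
    a -= lr * e * phi(x)         # online update of the linear layer only
    errors.append(abs(e))

print(f"mean |error|, first 100 steps: {np.mean(errors[:100]):.3f}")
print(f"mean |error|, last  100 steps: {np.mean(errors[-100:]):.3f}")
```

The point of the split: the expensive representation is learned offline, so the online update is tiny and fast enough to run inside a flight-control loop.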

AI helps contain Fusion Plasma

"Power of the sun in the palm of my hand." Many may remember this line from Tobey Maguire's Spider-Man 2, uttered by Dr. Otto Octavius. Well, the power of the sun is not far off in real life. Usable fusion power has been dreamed of for decades, and we get closer to it every year. Practical fusion reactors today heat atoms to generate fusion plasma. When heated to the requisite temperatures (hundreds of millions of degrees), this plasma causes atoms to fuse and release large amounts of energy. One day, we hope, the energy released will exceed the energy used to run the reactor, providing a clean, unlimited, eco-friendly power source, one to match the sun itself. To contain the fusion plasma (after all, no earthly material can withstand the heat), magnetic confinement is necessary. To help control the delicate process of confining ultra-hot plasma, AI techniques are now being applied.
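To get a feel for the control problem AI is tackling here, consider this toy sketch. The "plasma" is a single unstable displacement, the "coil" is one feedback term, and all dynamics and gains are made up; in the AI approach, a learned policy replaces the hand-tuned feedback law used below:

```python
def simulate(gain, steps=200):
    """Toy plasma-position loop: unstable drift plus a magnetic coil push."""
    z, v = 1.0, 0.0                          # displacement, velocity (arbitrary units)
    for _ in range(steps):
        coil_current = -gain * z - 0.5 * v   # feedback law (stand-in for a policy)
        accel = 0.2 * z + coil_current       # unstable drift + magnetic push-back
        v += 0.1 * accel
        z += 0.1 * v
    return abs(z)

print(f"final |displacement| with control:    {simulate(gain=2.0):.4f}")
print(f"final |displacement| without control: {simulate(gain=0.0):.4f}")
```

Without feedback the displacement grows without bound; with it, the plasma settles back to centre. The real problem has dozens of coils and a far richer plasma state, which is exactly why learned controllers are attractive.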

ML Helps in "X" Particle Detection

Courtesy: CERN Photolab

Physicists have found evidence of mysterious particles known as "X" particles, long thought to have first formed just after the Big Bang. "X" particles, so called because of their mysterious, unknown inner structure, were created millionths of a second after the Big Bang, in a trillion-degree sea of randomly colliding quarks and gluons, before the plasma cooled and stable particles such as protons and neutrons emerged. Today, X particles are extremely rare. X(3872) was first discovered in 2003 by the Belle experiment, a particle collider in Japan that smashes together high-energy electrons and positrons. In that environment, however, the rare particles decayed too quickly for scientists to examine their structure in detail. Evidence of X particles has now been found in the quark-gluon plasma produced by the Large Hadron Collider (LHC) at CERN, near Geneva, Switzerland.
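To illustrate the "ML" in the title: rare-particle searches typically train a classifier to separate a small signal from a large combinatorial background using decay kinematics. The sketch below uses synthetic stand-in features and a hand-rolled logistic regression in place of the boosted-decision-tree or neural classifiers common in collider analyses:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 2000
# background: broad distributions; signal: peaked near characteristic values
# (both synthetic stand-ins for real kinematic features like invariant mass)
bkg = rng.normal(loc=[0.0, 0.0], scale=1.0, size=(n, 2))
sig = rng.normal(loc=[1.5, 1.5], scale=0.5, size=(n, 2))
X = np.vstack([bkg, sig])
y = np.concatenate([np.zeros(n), np.ones(n)])

# logistic regression trained by batch gradient descent
w, b = np.zeros(2), 0.0
for _ in range(300):
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))
    w -= 0.5 * (X.T @ (p - y)) / len(y)
    b -= 0.5 * np.mean(p - y)

pred = (1.0 / (1.0 + np.exp(-(X @ w + b)))) > 0.5
print(f"training accuracy: {np.mean(pred == y):.2%}")
```

The classifier's score is then used to select candidate events, boosting the tiny X-particle signal above the background noise.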

Knots and Symmetries in Mathematics

Courtesy: University of Tennessee Knoxville

Hey Folks, we have an intriguing application of Machine Learning (ML) today: knots and symmetries in mathematics! ML, as you're probably aware, works when we have a whole lot of data. In fact, the more data we have, the more likely we are to use ML or its more sophisticated sibling, Deep Learning (DL). So without further ado, let's dive in! What are knots and symmetries in mathematics? A knot in mathematics is, as you would expect, inspired by real-life knots, but not exactly. Simply put, a knot is a closed one-dimensional loop embedded in 3-D space. The concept can of course be extended to n-dimensional loops in m-dimensional space. Knots can be described mathematically in various ways, and sometimes two seemingly different descriptions are actually of the same knot. This can be determined by comparing properties of knots known as invariants. One of the questions surrounding invariants is: are two given invariants related? To answer this, researchers turned to ML.
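The ML strategy can be sketched as follows: if a model can predict invariant B from invariant A far better than chance, the two are likely related, and the model's structure can hint at the formula. The "invariants" below are synthetic stand-ins, with B a hidden noisy function of A, and the model is ordinary least squares rather than the neural networks used in the actual research:

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.uniform(-2, 2, size=(1000, 1))           # stand-in invariant A per knot
B = 3.0 * A[:, 0] + 0.1 * rng.normal(size=1000)  # hidden relation + noise

# fit B ~ w*A + c by least squares
design = np.hstack([A, np.ones((1000, 1))])
coef, *_ = np.linalg.lstsq(design, B, rcond=None)
pred = design @ coef

# R^2 near 1 suggests the invariants are related; near 0 suggests they are not
ss_res = np.sum((B - pred) ** 2)
ss_tot = np.sum((B - B.mean()) ** 2)
r2 = 1 - ss_res / ss_tot
print(f"R^2 of predicting B from A: {r2:.3f}")
```

In the real setting, mathematicians then inspect which inputs the model relies on, turning a statistical hint into a conjecture they can try to prove.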

Introduction

 Hello Fellow Internet User! How has your day been? If you're like me, you're thinking "Um... good or bad or something in between. It is still a great time to be alive!" Welcome to this blog! Data Science is everywhere! Because data is everywhere! Data Science is fascinating because it speaks to our very human yearning to know what all of this data tells us. If not out of curiosity, then to find out how those stories can help us in various endeavours. Science and Engineering can also benefit from Data Science, maybe even recursively ;-) This blog is meant to bring together a curated set of advancements in how Data Science has been used to further science and engineering endeavours. Apart from drawing your attention to the substance of these advancements, I shall add my own commentary. Should be a fun ride! Hope you enjoy it! Joy of living to you! Rahul PS: I am open to comments and suggestions. Please let me know what you think of this blog, its posts or anything else of relevance.

TL;DR AI Summarises Research

The single most important starting point for a researcher in any field is the literature review. This is where the researcher reads through large troves of research papers in order to absorb and understand the current state of research in their chosen field. Wouldn't it be wonderful if someone (or something) could summarise these papers, so that only the relevant research need be chosen for a deep dive? Researchers at the Allen Institute for Artificial Intelligence have developed just such an AI model. It summarises the text of scientific papers and presents it in a few sentences, in the form of a TL;DR (Too Long; Didn't Read). The model takes the most important parts of a paper's Abstract, Introduction, and Conclusion sections to form the summary. Researchers first "pre-trained" the model on the English language. They then created SciTLDR, a data set of over 5,400 summaries of computer science papers, and further trained the model on more than 20,000 titles of research papers.
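A crude extractive stand-in can illustrate the "pull the key sentences from the key sections" intuition. This toy scores sentences from the Abstract, Introduction, and Conclusion by word frequency and keeps the top one per section; the real SciTLDR model is a trained neural summariser, and the paper text here is invented:

```python
import re
from collections import Counter

paper = {
    "Abstract": "We study drone control. Our method adapts to wind in real time.",
    "Introduction": "Wind is hard to model. Prior work needs huge datasets.",
    "Conclusion": "Our adaptive method outperforms baselines. Future work remains.",
}

def top_sentence(text):
    """Return the sentence whose words are most frequent in the section."""
    sentences = [s.strip() for s in re.split(r"(?<=[.!?])\s+", text) if s.strip()]
    freq = Counter(re.findall(r"[a-z]+", text.lower()))
    return max(sentences,
               key=lambda s: sum(freq[w] for w in re.findall(r"[a-z]+", s.lower())))

tldr = " ".join(top_sentence(paper[sec])
                for sec in ("Abstract", "Introduction", "Conclusion"))
print("TL;DR:", tldr)
```

A neural summariser goes much further, generating new sentences rather than copying them, but the section-level focus is the same.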

Alzheimer's Disease Prediction

Artificial Intelligence is now helping to predict the eventual onset of Alzheimer's disease in healthy people. Research conducted by IBM Research and Pfizer uses short, standardised speech tests and achieves better results than current methods. Alzheimer's is a disease of the brain. Those afflicted by it, typically people over 60 years of age, slowly lose their ability to think, to recall from memory, and eventually to carry out simple tasks. Predicting the onset of the disease could give clinicians a simple, straightforward and accessible metric for assessing an individual's risk of Alzheimer's disease, leading to earlier intervention. There is no effective cure or prevention of this disease, but the best way to delay onset and slow progression, which may one day be possible, is to intervene early. AI techniques were used to train models on short language samples from the Framingham Heart Study. Samples were selected based on age group.
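The basic pipeline, language sample in, risk estimate out, can be sketched in miniature. Everything below is invented for illustration: the two features (vocabulary richness and word repetition), the tiny "speech samples", and the nearest-centroid classifier standing in for the trained model; real systems use far richer linguistic features and validated clinical data:

```python
import numpy as np

def features(text):
    """Turn a short language sample into two simple linguistic features."""
    words = text.lower().split()
    unique_ratio = len(set(words)) / len(words)              # vocabulary richness
    repeats = sum(a == b for a, b in zip(words, words[1:]))  # immediate repetitions
    return np.array([unique_ratio, repeats / len(words)])

healthy = ["the garden was full of roses and tulips in early spring",
           "we walked along the river and watched the boats sail past"]
at_risk = ["the the thing was um the the thing with the um thing",
           "i went to the the place and and saw the the man"]

X = np.array([features(t) for t in healthy + at_risk])
y = np.array([0, 0, 1, 1])

# nearest-centroid classifier: a minimal stand-in for the trained model
centroids = np.array([X[y == c].mean(axis=0) for c in (0, 1)])

def predict(text):
    d = np.linalg.norm(centroids - features(text), axis=1)
    return int(np.argmin(d))        # 0 = healthy-like, 1 = at-risk-like

print(predict("the sunny meadow stretched beyond the old farmhouse"))  # 0
print(predict("the the um the thing the the um"))                      # 1
```

The appeal of the approach is exactly this cheapness: a few minutes of speech and some computed features, rather than scans or invasive tests.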
