Wednesday, September 7, 2011

2011 Abstract - eHeart: ECG Signal Processing System for Automatic Detection of Cardiac Abnormalities

The computational power of electronic systems is increasing exponentially, validating Moore’s Law day by day, and such systems have become ubiquitous in daily life. This growth is enabling the migration of advanced signal processing capabilities, previously restricted to very high-end systems and select communities, into affordable, accessible, and portable devices, from personal computers all the way to smartphones. Processing of biological signals, which used to depend on dedicated and advanced equipment, can now be harnessed by these smart devices.

Since Willem Einthoven’s development of the string galvanometer in 1903, the electrocardiogram (ECG or EKG) has become one of the most widely used medical tools for diagnosing cardiac problems through the interpretation of the heart’s electrical activity. The ECG waveform is characterized by specific deflections, conventionally labeled P, Q, R, S, and T, and by attributes such as duration, frequency, and amplitude, which correlate to specific electro-muscular activities of the heart. Deviations of these attributes from the ‘norm’ indicate potential cardiac abnormalities.

eHeart is a signal processing system developed to automatically analyze ECG waveforms for specific attributes, compare the measured parameters with ‘normal’ values, and map deviations to potential abnormalities. ECG data files store the electrical signal values as a function of time, sampled at a high frequency (256 to 512 samples/second). These signals generally also contain ‘noise’ from body activity as well as electronic equipment, which is first removed or reduced by appropriate filtering methods. The filtered waveform is then analyzed with peak detection algorithms and threshold comparison to identify the extrema and inflection points. Statistical methods are applied to compute the specified ECG parameters. Finally, these parameters are compared with normal values, and deviations are mapped to known cardiac abnormalities.
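
For illustration, the peak detection and parameter computation steps can be sketched as follows. This is a minimal Python sketch, not eHeart’s MATLAB implementation: the synthetic spike train stands in for a filtered ECG trace, and the threshold fraction is an assumed tuning value.

```python
import math

def detect_r_peaks(signal, threshold_frac=0.6):
    """R-peak indices: local maxima above a fraction of the global maximum."""
    threshold = threshold_frac * max(signal)
    peaks = []
    for i in range(1, len(signal) - 1):
        if signal[i] > threshold and signal[i] >= signal[i - 1] and signal[i] > signal[i + 1]:
            peaks.append(i)
    return peaks

def heart_rate_bpm(peaks, fs):
    """Mean heart rate from R-R intervals (sample indices -> beats per minute)."""
    rr = [(b - a) / fs for a, b in zip(peaks, peaks[1:])]
    return 60.0 / (sum(rr) / len(rr))

fs = 256                                  # samples/second, as in the abstract
t = [k / fs for k in range(10 * fs)]      # 10 seconds of samples
# Stand-in for a filtered ECG trace: one sharp "R wave" every second
sig = [math.exp(-((ti % 1.0 - 0.5) ** 2) / 0.0005) for ti in t]
peaks = detect_r_peaks(sig)
print(len(peaks), round(heart_rate_bpm(peaks, fs)))   # 10 peaks at 60 bpm
```

A real pipeline would precede this with noise filtering and would locate the remaining P, Q, S, and T deflections relative to each detected R peak; the R-R interval computed here is the basis of the heart-rate parameter.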

eHeart is developed using MATLAB, a mathematical computing environment. Normal and known-abnormal ECG data, drawn from simulation as well as real-life medical databases, are used to test and validate the project.

Looking forward, such capabilities will soon become ubiquitous in everyone’s portable devices, taking medical care to the next level with personal medical alerts, remote monitoring, and instant feedback.

Friday, December 31, 2010

2010 Abstract - Asynchronous Sensor Cloud Movement Tracking and Prediction - Applying Statistical Filtering Methods

With the advent of affordable and dependable location-aware remote sensors, it is now possible to track ‘clouds’ of data points, e.g. oil spills, migrating animal herds, chemical contaminants, wildfires, biohazard clouds, and meteorological movements. Many sensors are dispersed throughout the ‘cloud’, and the position data transmitted periodically by the sensors is received and processed by a central tracking system. Advanced software algorithms process this data in real time to track the ‘cloud’ and predict its future position.

Due to inherent measurement errors in all sensors, the aggregate cloud data is prone to inaccuracies. In addition, the sensors report asynchronously. Classical statistical filtering methods are applied in this project in order to reduce the error.

A computer software program consisting of a Sensor Cloud Simulator (SCS) and a Tracking and Prediction System (TPS) is developed to run this experiment:

SCS simulates the behavior of a real-world sensor cloud, i.e. position, movement, and related errors. TPS receives the periodic sensor reports and correlates them to track the cloud and predict its movement. Statistical filtering methods, e.g. alpha-beta filters and Kalman filters, reduce the prediction error.
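
The filtering step can be illustrated with a one-dimensional alpha-beta filter. This is a minimal sketch, not the TPS code: the constant-velocity model, the gain values, and the alternating report error are all illustrative assumptions.

```python
def alpha_beta_filter(measurements, dt=1.0, alpha=0.5, beta=0.1):
    """Smooth noisy position reports with a constant-velocity alpha-beta filter."""
    x, v = measurements[0], 0.0           # initial position/velocity estimate
    estimates = [x]
    for z in measurements[1:]:
        x_pred = x + v * dt               # predict position one step ahead
        r = z - x_pred                    # innovation: report minus prediction
        x = x_pred + alpha * r            # blend prediction with the report
        v = v + (beta / dt) * r           # nudge the velocity estimate
        estimates.append(x)
    return estimates

# Sensor drifting at 2 units/step; reports carry an alternating +/-1 error
truth = [2.0 * k for k in range(60)]
reports = [p + (1.0 if k % 2 == 0 else -1.0) for k, p in enumerate(truth)]
est = alpha_beta_filter(reports)
worst = max(abs(e - p) for e, p in zip(est[-10:], truth[-10:]))
print(worst < 0.5)   # filtered error well below the raw +/-1 report error
```

In the TPS setting one such filter would run per tracked parameter, and the per-sensor estimates would then be aggregated into the cloud position.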

The prediction error variance of the aggregate sensor cloud parameters is computed by correlating TPS data with SCS data and is analyzed for correlations and trends. Numerous trials make it evident that statistical filtering methods effectively reduce the prediction error for sensor cloud data.

The Kalman filter consistently proved the most effective, even when applied to intermittent groups of data.
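
A scalar constant-velocity Kalman filter illustrates how such tracking can coast through missed reports. This sketch is not the project’s implementation: the noise parameters and the dropped-report scenario are illustrative assumptions, and the process-noise term is simplified to a diagonal addition.

```python
def kalman_1d(reports, dt=1.0, q=0.01, r=1.0):
    """1-D constant-velocity Kalman filter; None entries are missed reports."""
    x = [reports[0], 0.0]                     # state: [position, velocity]
    P = [[1.0, 0.0], [0.0, 1.0]]              # state covariance
    out = [x[0]]
    for z in reports[1:]:
        # Predict: x = F x, P = F P F^T + Q, with F = [[1, dt], [0, 1]]
        x = [x[0] + dt * x[1], x[1]]
        P = [[P[0][0] + dt * (P[0][1] + P[1][0]) + dt * dt * P[1][1] + q,
              P[0][1] + dt * P[1][1]],
             [P[1][0] + dt * P[1][1],
              P[1][1] + q]]
        if z is not None:                     # update only when a report arrives
            s = P[0][0] + r                   # innovation variance (H = [1, 0])
            k0, k1 = P[0][0] / s, P[1][0] / s # Kalman gain
            y = z - x[0]                      # innovation
            x = [x[0] + k0 * y, x[1] + k1 * y]
            P = [[(1 - k0) * P[0][0], (1 - k0) * P[0][1]],
                 [P[1][0] - k1 * P[0][0], P[1][1] - k1 * P[0][1]]]
        out.append(x[0])
    return out

truth = [2.0 * k for k in range(40)]
# Intermittent reports: every 4th report is lost; the filter coasts on its
# velocity estimate through each gap (reports here are otherwise noise-free)
reports = [None if k % 4 == 3 else p for k, p in enumerate(truth)]
est = kalman_1d(reports)
print(abs(est[-1] - truth[-1]) < 0.1)   # tracks closely despite the gaps
```

The ability to skip the update step while continuing to predict is what makes the Kalman filter a natural fit for asynchronous, intermittent sensor reports.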

2009 Abstract - Picture Imperfect! Impact of Noise on the Efficiency of Digital Image Compression

Image compression has become ubiquitous in today’s technology-rich lifestyle as more and more documents and images are created, stored, and transmitted digitally. The latter two actions are directly impacted by the size of the compressed images. Hence, studying the factors that influence compressed image size will help improve the efficiency of the compression process, resulting in substantial savings of money and time, not to mention the ecological benefits.

Redundancy of image data is the foundational principle behind image compression. In other words, the actual ‘content’ of the image, referred to as ‘signal’, is relatively sparse and/or undergoes less frequent, slower transitions. In contrast, imperfections in the image, such as specks, dust, and scratches, referred to as ‘noise’, are generally distributed across the whole image and are sharper than the content data, i.e. they undergo faster transitions. Hence, noise defeats the principle behind image compression and contributes to inefficiency in terms of size.

This project studies the impact of noise on compressed image size by superimposing synthetic noise on a set of control images and subjecting them to the image compression process. The characteristics of the noise, namely density and dimensions, are manipulated, and the compressed image sizes are analyzed for trends and quantitative relationships.
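
The experiment can be mimicked in miniature with a general-purpose lossless compressor: here zlib stands in for the image codec, a flat byte array stands in for a control image, and noise density is the manipulated characteristic. All of these stand-ins are illustrative; the project itself works on real images and a real image compression process.

```python
import random
import zlib

def make_image(width=256, height=256, value=128):
    """A flat grayscale 'image': maximal redundancy, ideal for compression."""
    return bytearray([value] * (width * height))

def add_speck_noise(img, density, rng):
    """Flip a fraction of pixels to random values (synthetic specks/dust)."""
    noisy = bytearray(img)
    for i in range(len(noisy)):
        if rng.random() < density:
            noisy[i] = rng.randrange(256)
    return noisy

rng = random.Random(42)
clean = make_image()
sizes = []
for density in (0.0, 0.01, 0.05, 0.20):
    noisy = add_speck_noise(clean, density, rng)
    sizes.append(len(zlib.compress(bytes(noisy))))
print(sizes[0] < sizes[1] < sizes[2] < sizes[3])  # size grows with noise density
```

Even 1% noise, invisible at a glance on a 256×256 image, measurably inflates the compressed size, because the random pixels break the long redundant runs the compressor exploits.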

Even at noise levels not discernible to the naked eye, the loss of compression efficiency can be substantial. These findings highlight the importance and potential benefits of developing image clean-up technology.

2008 Abstract - MPTCS: A Model Predictive Temperature Control System for Smart Homes

In a world that is frantically going eco-friendly, any reduction in energy consumption is helpful, especially in homes. Heating systems account for a major portion of home energy use, so improving their efficiency benefits the global environment, not just the consumer. Efficiency can be increased in two areas: improving the basic energy transfer efficiency of the heater/cooler, or optimizing its operating schedule (on/off) to meet user needs efficiently. This project explores the latter.

The ‘Smart Home’ is a growing concept for improving both energy savings and functionality in homes. While this technology targets many areas of the home, improving heating systems is a top priority due to their high energy consumption. Some implementations involve on-site measurements and extensive customization that are time-consuming and/or expensive.

MPTCS is a computer-based temperature control system for smart homes that provides optimized operation of heating/cooling systems. The software takes into consideration multiple factors, such as the effective rates of the heating systems, the thermal coefficients of the building, the external temperature, and the weather forecast. It incorporates Model Predictive Control (MPC) techniques to achieve its optimization goals. MPTCS uses System Identification methods to accumulate the necessary model coefficients on its own and hence does not require on-site measurements. The occupancy schedule and personal temperature preferences are the only customer inputs the system needs. While meeting these needs, the system minimizes heating/cooling times, utilizing external temperatures and weather forecast data throughout. An embedded implementation of MPTCS is feasible when no ‘Smart Home’ computer is present.
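
The MPC idea can be sketched with a toy first-order thermal model: choose, by brute force over a short horizon, the on/off heater schedule that keeps the room above the setpoint with the fewest ‘on’ steps, given an outside-temperature forecast. The coefficients A and B are invented stand-ins for the values MPTCS would learn via system identification, and a real MPC solver would not enumerate all 2^horizon schedules.

```python
from itertools import product

# Illustrative first-order thermal model: A is the heat-loss coefficient,
# B the heater's effective rate. Both are made-up stand-in values.
A, B = 0.1, 2.0

def step(T, outside, u):
    """One time step of the room model: drift toward outside, plus heater input."""
    return T + A * (outside - T) + B * u

def mpc_plan(T, forecast, setpoint, horizon=6):
    """Pick the on/off heater sequence over the horizon that keeps the room
    at or above the setpoint with the fewest 'on' steps."""
    best = None
    for plan in product((0, 1), repeat=horizon):
        temp, ok = T, True
        for u, outside in zip(plan, forecast):
            temp = step(temp, outside, u)
            if temp < setpoint:               # comfort constraint violated
                ok = False
                break
        if ok and (best is None or sum(plan) < sum(best)):
            best = plan
    return best

forecast = [5.0] * 6                          # flat cold-weather forecast
plan = mpc_plan(T=20.0, forecast=forecast, setpoint=19.0)
print(plan)   # a cheapest feasible schedule (fewest heater-on steps)
```

At each control interval the system would re-plan with fresh forecast data and apply only the first step of the plan, which is the receding-horizon pattern at the heart of MPC.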

Once you have MPTCS, you’ll never again come home to a house that is too hot or too cold at the end of an exhausting day!