|Title: Proposing a New Method for Query Processing Adaption in Data Base|
|Author(s): Mohammad-Reza Feizi-Derakhshi, Hasan Asil, Amir Asil|
|Pages: 1-4||Paper ID: 191091-IJECS-IJENS||Published: October, 2009|
Abstract: This paper proposes a multi-agent system that combines two technologies, query processing optimization and software agents, offering personalized queries and adaptation to changing requirements. The system uses a new algorithm based on modeling users' long-term requirements, together with a genetic algorithm (GA) for gathering data on users' queries. Execution results show that, in comparison with classic algorithms, the presented algorithm adapts more effectively.
|Keywords: Query Processing – Genetic Algorithm – Adapting System – Multi agents systems.|
|Full Text (.pdf) | 252 KB|
|Title: Baseline Wandering Removal from Human Electrocardiogram Signal using Projection Pursuit Gradient Ascent Algorithm|
|Author(s): Zahoor-uddin, Farooq Alam Orakzai|
|Pages: 5-7||Paper ID: 192091-IJECS-IJENS||Published: October, 2009|
Abstract: Baseline noise removal from the electrocardiogram (ECG) signal is a blind source separation problem. Various noises affect the measured ECG signal; the major ones are baseline noise, electrode contact noise, muscle noise, and instrument noise. Baseline noise distorts the low-frequency segment of the ECG signal, the S-T segment, which is very important because it carries information related to heart attacks. Various algorithms have been applied to remove this noise from the noisy ECG signal. We have applied the projection pursuit gradient ascent algorithm to remove this noise from the measured ECG signal; the algorithm separates independent signals from a mixture of signals. Efficient removal of baseline noise may reveal information that has so far been hidden from doctors and may save a patient's life. Results for different baseline noise signals were analyzed. Different signals from the MIT-BIH database were also analyzed for error in terms of the standard deviation and mean of the error signal. Finally, we carried out a comparative study against the Kalman filter, cubic spline, and moving average algorithms and showed that projection pursuit is the most efficient.
|Keywords: Baseline Noise, Cubic spline, Electrocardiogram, heart attack, Kalman filter, projection pursuit.|
|Full Text (.pdf) | 157 KB|
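As an illustration of the projection pursuit idea this abstract rests on, the sketch below runs gradient ascent on the kurtosis of a one-dimensional projection of a whitened two-channel mixture. It is a minimal pure-Python toy, not the authors' implementation: the mixing matrix, step size, and synthetic sources (a spiky Laplacian standing in for baseline artifact, plus a Gaussian background) are all hypothetical.

```python
import math, random

def corr(u, v):
    """Pearson correlation between two equal-length sequences."""
    n = len(u)
    mu, mv = sum(u) / n, sum(v) / n
    du = [a - mu for a in u]
    dv = [a - mv for a in v]
    num = sum(a * b for a, b in zip(du, dv))
    return num / math.sqrt(sum(a * a for a in du) * sum(b * b for b in dv))

def whiten(x1, x2):
    """Center the two channels and decorrelate them to unit variance
    using the analytic eigendecomposition of the 2x2 covariance matrix."""
    n = len(x1)
    m1, m2 = sum(x1) / n, sum(x2) / n
    x1 = [v - m1 for v in x1]
    x2 = [v - m2 for v in x2]
    a = sum(v * v for v in x1) / n
    b = sum(u * v for u, v in zip(x1, x2)) / n
    c = sum(v * v for v in x2) / n
    root = math.sqrt((a - c) ** 2 + 4 * b * b)
    out = []
    for lam in ((a + c + root) / 2, (a + c - root) / 2):
        e = (b, lam - a)                       # eigenvector of [[a, b], [b, c]]
        s = math.hypot(e[0], e[1]) * math.sqrt(lam)
        out.append([(e[0] * u + e[1] * v) / s for u, v in zip(x1, x2)])
    return out

def pursuit(z1, z2, lr=0.02, iters=400):
    """Gradient ascent on kurtosis(y) for y = cos(t)*z1 + sin(t)*z2. On whitened
    data var(y) = 1 for every t, so kurtosis is E[y^4] - 3 and its derivative
    in t is 4*E[y^3 * dy/dt]."""
    n = len(z1)
    def project(t):
        c, s = math.cos(t), math.sin(t)
        return [c * u + s * v for u, v in zip(z1, z2)]
    def kurt(t):
        return sum(a ** 4 for a in project(t)) / n - 3.0
    t = max((k * math.pi / 8 for k in range(8)), key=kurt)  # avoid flat starts
    for _ in range(iters):
        c, s = math.cos(t), math.sin(t)
        y = project(t)
        dy = [-s * u + c * v for u, v in zip(z1, z2)]
        t += lr * 4 * sum(a ** 3 * d for a, d in zip(y, dy)) / n
    return project(t)

random.seed(0)
n = 3000
u = [random.random() - 0.5 for _ in range(n)]
lap = [-math.copysign(math.log(1 - 2 * abs(x)), x) for x in u]  # spiky source
gau = [random.gauss(0, 1) for _ in range(n)]                    # smooth source
x1 = [0.80 * a + 0.60 * g for a, g in zip(lap, gau)]            # observed channels
x2 = [0.45 * a + 0.90 * g for a, g in zip(lap, gau)]

z1, z2 = whiten(x1, x2)
y = pursuit(z1, z2)
print(round(abs(corr(y, lap)), 2))
```

The coarse scan over starting angles simply avoids beginning the ascent in a flat region of the kurtosis landscape; the recovered projection should correlate strongly with the spiky source (up to sign).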
|Title: Leveraged Assortment of Microsoft Technologies: Marching Onwards in the Coming Battle Field of Grid Computing|
|Author(s): Muhammad Mujahid Iqbal|
|Pages: 8-13||Paper ID: 190391-IJECS-IJENS||Published: October, 2009|
Abstract: The availability of powerful microprocessors and high-speed networks as commodity components has made high-performance computing on distributed systems (wide-area cluster computing) more widely adoptable. In this environment, since resources are usually distributed geographically at various levels, integrating and coordinating them and presenting them to the user as a single resource is a great challenge; doing so forms a distributed grid. Grids are a reactive strategy to the high price of supercomputers, which prevents researchers from undertaking projects such as weather forecasting, nuclear simulations, bioinformatics, and other applications that require high-performance computing. So far, various grid-enabled paradigms have been developed to carry out large-scale problems in science, engineering, and commerce. Given the rapid emergence of grid computing and the increasing popularity of the software development platform based on Microsoft's .NET Framework, a Windows-based grid is particularly important from the software industry's perspective. To leverage the combined power of the .NET Framework, SQL Server, and Windows, a phase-shifted grid paradigm is adapted that can harness the power of Microsoft technologies.
|Keywords: .NET Framework, Middleware.|
|Full Text (.pdf) | 157 KB|
|Title: FPGA Based Implementation of Baseline JPEG Decoder|
|Author(s): Jahanzeb Ahmad, Mansoor Ebrahim|
|Pages: 14-19||Paper ID: 1916091-IJECS-IJENS||Published: October, 2009|
Abstract: The JPEG standard (ISO/IEC 10918-1, ITU-T Recommendation T.81) defines compression techniques for image data. As a consequence, it allows image data to be stored and transferred with considerably reduced demand for storage space and bandwidth. Of the four processes provided in the JPEG standard, only one, the baseline process, is widely used. In this paper an FPGA-based high-speed, low-complexity, and low-memory implementation of a JPEG decoder is presented. The pipelined implementation of the system allows multiple image blocks to be decompressed simultaneously. The hardware decoder is designed to operate at 100 MHz on an Altera Cyclone II or Xilinx Spartan-3E FPGA or equivalent. The decoder is capable of decoding baseline JPEG color and grayscale images, and can also downscale the image by a factor of 8. The decoder is designed to meet industrial needs; the JFIF, DCF, and EXIF standards are implemented in the design.
|Keywords: JPEG Decoder|
|Full Text (.pdf) | 248 KB|
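The core of any baseline JPEG decoder is the 8x8 inverse DCT defined in ISO/IEC 10918-1. Its floating-point reference form can be sketched as below; a real FPGA pipeline would use a fast fixed-point factorization, so this is only a behavioral model for checking round-trip correctness.

```python
import math

def idct2(F):
    """Reconstruct an 8x8 pixel block from DCT coefficients F,
    per the floating-point inverse DCT in ISO/IEC 10918-1."""
    def C(k): return 1 / math.sqrt(2) if k == 0 else 1.0
    out = [[0.0] * 8 for _ in range(8)]
    for x in range(8):
        for y in range(8):
            s = 0.0
            for u in range(8):
                for v in range(8):
                    s += (C(u) * C(v) * F[u][v]
                          * math.cos((2 * x + 1) * u * math.pi / 16)
                          * math.cos((2 * y + 1) * v * math.pi / 16))
            out[x][y] = s / 4
    return out

def dct2(f):
    """Forward transform, used here only to round-trip a test block."""
    def C(k): return 1 / math.sqrt(2) if k == 0 else 1.0
    out = [[0.0] * 8 for _ in range(8)]
    for u in range(8):
        for v in range(8):
            s = 0.0
            for x in range(8):
                for y in range(8):
                    s += (f[x][y]
                          * math.cos((2 * x + 1) * u * math.pi / 16)
                          * math.cos((2 * y + 1) * v * math.pi / 16))
            out[u][v] = C(u) * C(v) * s / 4
    return out

block = [[(x * 8 + y) % 23 for y in range(8)] for x in range(8)]  # arbitrary pixels
err = max(abs(a - b) for ra, rb in zip(idct2(dct2(block)), block)
          for a, b in zip(ra, rb))
print(err < 1e-9)
```

Because the forward and inverse transforms above are exact orthonormal pairs, the round-trip error is limited only by floating-point precision.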
|Title: CT Compensation of Numerical Distance Relaying Algorithm|
|Author(s): Abdullah Assuhaimi Mohd Zin, Nur ‘Ain Maiza Ismail, Zaniah Muda, Mohamad Jalalian|
|Pages: 20-25||Paper ID: 1917091-IJECS-IJENS||Published: October, 2009|
Abstract: In this paper a prototype algorithm for a numerical distance relay is developed in order to prevent mal-operation of relays when Current Transformer (CT) saturation occurs. Saturation of the CT causes errors in the reproduction of the fundamental harmonic of the current. CTs designed never to saturate would be bulky and expensive units; therefore, most protective CTs in service saturate during severe transients. The distorted secondary current due to CT saturation is detected and compensated by the algorithm in order to obtain correct operation of the distance relay in the saturation region. A third-difference function and an Auto-Regressive (AR) model are employed in developing the saturation detection and compensation algorithm. The algorithm is implemented in C++, and its performance is evaluated through simulation of case studies in the Alternative Transient Program (ATP) simulator. With these steps, a numerical distance relaying algorithm with CT saturation compensation is successfully developed.
|Keywords: Numerical Distance Relay; CT saturation; CT compensation; ATP|
|Full Text (.pdf) | 248 KB|
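The third-difference detector mentioned in this abstract can be sketched as follows. The waveform, sampling rate, clipping level, and threshold are all hypothetical, and the AR-model compensation stage is omitted; the point is only that a third difference stays small on a smooth sinusoid and spikes where saturation distorts it.

```python
import math

def third_difference(i):
    """d3[n] = i[n] - 3*i[n-1] + 3*i[n-2] - i[n-3]: near zero on a smooth
    sinusoid, large at the waveform break where CT saturation sets in."""
    return [i[n] - 3 * i[n - 1] + 3 * i[n - 2] - i[n - 3]
            for n in range(3, len(i))]

def detect_saturation(samples, threshold):
    """Sample indices whose third difference exceeds the threshold."""
    return [m + 3 for m, v in enumerate(third_difference(samples))
            if abs(v) > threshold]

# Hypothetical secondary current: a clean sinusoid at 20 samples per cycle,
# clipped at 0.6 from sample 30 onward to imitate saturation distortion.
clean = [math.sin(math.pi * n / 10) for n in range(80)]
distorted = [min(v, 0.6) if n >= 30 else v for n, v in enumerate(clean)]

hits = detect_saturation(distorted, threshold=0.05)
print(hits[0])
```

At 20 samples per cycle the third difference of the clean sinusoid has amplitude (2 sin(pi/20))^3, about 0.031, so a threshold of 0.05 fires only at the clipping corners.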
|Title: Performance Comparison of Wavelet Packet Modulation and OFDM Over Multipath Wireless Channel with Narrowband Interference|
|Author(s): Usman Khan, Sobia Baig, M. Junaid Mughal|
|Pages: 26-29||Paper ID: 1944091-IJECS-IJENS||Published: October, 2009|
Abstract: Orthogonal Frequency Division Multiplexing (OFDM) suffers from high side lobes in the transmitted signal, which increase its sensitivity to inter-carrier interference (ICI) and narrowband interference (NBI). In the literature, Wavelet Packet Modulation (WPM) has been proposed as an alternative to OFDM. WPM offers much lower side lobes in the transmitted signal, which reduce its sensitivity to ICI and NBI. This paper presents a performance comparison of OFDM and WPM over a multipath wireless channel with NBI. WPM utilizes a time-domain minimum mean square error (MMSE) equalizer for equalization of the multipath wireless channel. It is shown that WPM employing a time-domain MMSE equalizer exhibits higher immunity to NBI than an OFDM system over a multipath wireless channel.
|Keywords: Wavelet Packet Modulation, narrowband interference (NBI), MMSE equalization.|
|Full Text (.pdf) | 240 KB|
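A time-domain MMSE equalizer of the kind this abstract mentions can be sketched directly from its closed form w = (H^T H + s2*I)^-1 H^T e_d, where H is the channel convolution matrix. The two-tap channel, tap count, and noise variance below are illustrative, not taken from the paper.

```python
def solve(A, b):
    """Gaussian elimination with partial pivoting for a small linear system."""
    n = len(b)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for k in range(n):
        p = max(range(k, n), key=lambda r: abs(M[r][k]))
        M[k], M[p] = M[p], M[k]
        for r in range(k + 1, n):
            f = M[r][k] / M[k][k]
            M[r] = [x - f * y for x, y in zip(M[r], M[k])]
    x = [0.0] * n
    for k in range(n - 1, -1, -1):
        x[k] = (M[k][n] - sum(M[k][j] * x[j] for j in range(k + 1, n))) / M[k][k]
    return x

def mmse_equalizer(h, taps, noise_var, delay):
    """Time-domain MMSE taps w = (H^T H + s2*I)^-1 H^T e_delay, with H the
    (len(h)+taps-1) x taps channel convolution matrix."""
    rows = len(h) + taps - 1
    H = [[h[i - j] if 0 <= i - j < len(h) else 0.0 for j in range(taps)]
         for i in range(rows)]
    R = [[sum(H[i][a] * H[i][b] for i in range(rows))
          + (noise_var if a == b else 0.0)
          for b in range(taps)] for a in range(taps)]
    p = [H[delay][a] for a in range(taps)]
    return solve(R, p)

# Hypothetical two-tap multipath channel; numbers are illustrative only.
h = [1.0, 0.4]
w = mmse_equalizer(h, taps=3, noise_var=0.001, delay=0)
# Combined channel-plus-equalizer response: should approximate a unit impulse.
combined = [sum(h[k] * w[i - k] for k in range(len(h)) if 0 <= i - k < len(w))
            for i in range(len(h) + len(w) - 1)]
print([round(c, 3) for c in combined])
```

A minimum-phase channel like this one equalizes well at delay 0; dispersive channels generally need a larger decision delay.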
|Title: Cost Effective Dual Band Short Backfire Antenna|
|Author(s): M. Javid Asad, M. Zafrullah|
|Pages: 30-36||Paper ID: 1949091-IJECS-IJENS||Published: October, 2009|
Abstract: Short backfire antennas have so far been developed with single-impedance-bandwidth characteristics. In this paper, a new cross-dipole excitation structure for the short backfire antenna is presented that achieves dual-impedance-bandwidth characteristics. The excitation structure consists of H-shaped and straight cross dipoles mounted on a slotted coaxial line. The antenna is built on an elliptical ground plane with a dielectric rim, and a cost-effective design is proposed to obtain a metallic rim from the dielectric rim. The pertinent features of this short backfire antenna are (i) dual impedance bandwidth, (ii) reconfigurable circular polarization, and (iii) cost-effective fabrication. It is demonstrated that the antenna can achieve voltage standing wave ratio (VSWR) bandwidths of 21% and 20.36% for 2:1 VSWR, an axial ratio of 1.8 dB with an axial-ratio (≤ 3 dB) bandwidth of 1.94%, and a gain of 12 dBi. The antenna structure is described, and simulation and experimental results are presented.
|Keywords: Dual impedance band, circularly polarized, short backfire antenna.|
|Full Text (.pdf) | 284 KB|
|Title: GUI Based Mamdani Fuzzy Inference System Modeling To Predict Surface Roughness in Laser Machining|
|Author(s): Sivarao, Peter Brevern, N. S. M. El-Tayeb, V. C. Vengkatesh|
|Pages: 37-43||Paper ID: 1957091-IJECS-IJENS||Published: October, 2009|
Abstract: The world of manufacturing has shifted to the era of space-age machining. The purpose of this investigation is to develop a fuzzy-based Graphical User Interface (GUI) for modeling laser machining conditions. The developed fuzzy-based GUI is expected to overcome major problems faced by most manufacturing industries today: the increased number of controllable parameters and the lack of expertise to operate the machine. Investigations were carried out by screening for the significant parameters before the explicit GUI was designed. Next, the GUI for fuzzy-based modeling was developed using GUIDE and the Fuzzy Logic Toolbox in MATLAB. The fuzzy variables were also analyzed before their significance was finalized. The developed GUI has been programmed to interact with the fuzzy variables in order to model laser cut quality for two different thicknesses, 2.5 and 5.0 mm. The models were then compared for statistical validity using Root Mean Square Error (RMSE). A few models with the best optimized variables were taken as prediction models, and their respective outputs were analyzed and compared on the basis of percentage error over 128 data sets to validate the models. The best developed model was then recommended to the pressure-vessel manufacturing industry to further reduce production cost and improve the cut quality of its end products.
|Keywords: Mamdani Fuzzy modeling, Laser Cutting, laser cut quality evaluation, GUI based modeling.|
|Full Text (.pdf) | 690 KB|
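A Mamdani inference step of the kind behind such a GUI can be sketched with one input and one output. The rule base, membership functions, and universes below are invented for illustration; the paper's models use MATLAB's Fuzzy Logic Toolbox with the actual laser parameters.

```python
def tri(x, a, b, c):
    """Triangular membership; degenerate ends (a == b or b == c) act as shoulders."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def mamdani_roughness(power):
    """Two toy rules: IF power LOW THEN roughness HIGH;
                      IF power HIGH THEN roughness LOW.
    Min implication, max aggregation, centroid defuzzification over [0, 10]."""
    w_low  = tri(power, 0.0, 0.0, 6.0)    # firing strength of rule 1
    w_high = tri(power, 4.0, 10.0, 10.0)  # firing strength of rule 2
    num = den = 0.0
    for i in range(101):                  # discretized output universe
        r = i * 0.1
        mu = max(min(w_low,  tri(r, 4.0, 10.0, 10.0)),  # clipped HIGH set
                 min(w_high, tri(r, 0.0, 0.0, 6.0)))    # clipped LOW set
        num += r * mu
        den += mu
    return num / den if den else 5.0

rough_low_power  = mamdani_roughness(2.0)
rough_high_power = mamdani_roughness(9.0)
print(round(rough_low_power, 2), round(rough_high_power, 2))
```

With this invented rule base, low laser power fires the "roughness HIGH" rule and yields the larger defuzzified roughness, as expected.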
|Title: Development of Tree-bank Based Probabilistic Grammar for Urdu Language|
|Author(s): Qaiser Abbas, Nayyara Karamat, Sadia Niazi|
|Pages: 44-47||Paper ID: 1959091-IJECS-IJENS||Published: October, 2009|
Abstract: The process includes a hand-tagged corpus, tree annotation on paper for a large corpus, the NU-FAST Treebank in bracketed form, extraction of a CFG from the NU-FAST Treebank, evaluation of a PCFG from the CFG, and then a PDCG from the PCFG for inspection and testing through a Prolog parser.
|Keywords: Urdu Language, Probabilistic Grammar|
|Full Text (.pdf) | 231 KB|
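The PCFG-estimation step of this pipeline, relative-frequency counting of productions extracted from bracketed trees, can be sketched as below. The three toy trees with placeholder romanized-Urdu words are invented; the NU-FAST Treebank itself is far larger and its tag set may differ.

```python
from collections import Counter

def extract_rules(tree):
    """Read one bracketed tree and return its CFG productions (lhs, rhs-tuple),
    including lexical rules like ('VBD', ('gaya',))."""
    tokens = tree.replace('(', ' ( ').replace(')', ' ) ').split()
    rules = []
    def parse(i):                       # i points at '('
        label = tokens[i + 1]
        i += 2
        rhs = []
        while tokens[i] != ')':
            if tokens[i] == '(':
                child, i = parse(i)
                rhs.append(child)
            else:
                rhs.append(tokens[i])   # terminal (a word)
                i += 1
        rules.append((label, tuple(rhs)))
        return label, i + 1
    parse(0)
    return rules

# A toy three-tree "treebank" with placeholder romanized-Urdu words.
treebank = [
    "(S (NP (PRP woh)) (VP (VBD gaya)))",
    "(S (NP (NN larka)) (VP (VBD gaya)))",
    "(S (NP (NN larka)) (VP (VBD soya)))",
]
rule_counts, lhs_counts = Counter(), Counter()
for t in treebank:
    for lhs, rhs in extract_rules(t):
        rule_counts[(lhs, rhs)] += 1
        lhs_counts[lhs] += 1
# Relative-frequency estimate: P(lhs -> rhs) = count(lhs -> rhs) / count(lhs)
pcfg = {(l, r): c / lhs_counts[l] for (l, r), c in rule_counts.items()}
print(round(pcfg[('NP', ('NN',))], 3))
```

By construction the rule probabilities for each left-hand side sum to one, which is the defining property of a PCFG.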
|Title: An Evolutionary Artifact Rejection Method For Brain Computer Interface Using ICA|
|Author(s): A. Asadi Ghanbari, M. R. Nazari Kousarrizi, M. Teshnehlab, M. Aliyari|
|Pages: 48-53||Paper ID: 1962091-IJECS-IJENS||Published: October, 2009|
Abstract: It is widely accepted in the brain-computer interface (BCI) research community that neurological phenomena are the only source of control in any BCI system. Artifacts are undesirable signals that can interfere with neurological phenomena; they may change the characteristics of neurological phenomena or even be mistakenly used as the source of control in BCI systems. Independent component analysis (ICA) is a method that blindly separates mixtures of independent source signals, forcing the components to be independent. It has been widely applied to remove artifacts from EEG signals. Preliminary studies have shown that ICA increases the strength of motor-related signal components in the Mu rhythms and is thus useful for removing artifacts in BCI systems. The genetic algorithm (GA) is a type of randomized search strategy. The applicability of GAs to the optimum feature subset selection problem is obvious, and there has been considerable interest in this area over the last decade. In this paper, genetic algorithms are applied to optimum independent component selection, choosing the subset of ICs that contains the neurological phenomena best suited for a BCI system. We introduce a novel automatic artifact removal method based on ICA and a genetic algorithm.
|Keywords: Artifact; Brain computer interface; Genetic Algorithm and Independent Component Analysis.|
|Full Text (.pdf) | 433 KB|
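The removal step can be sketched as below, under two loud assumptions: the ICA unmixing is taken as already done (the true sources and mixing matrix are used directly), and the paper's GA-driven component selection is replaced by a plain kurtosis threshold. The signals and the threshold value are hypothetical.

```python
import math

def kurtosis(x):
    """Excess kurtosis; rhythmic EEG components score low, spiky artifacts high."""
    n = len(x)
    m = sum(x) / n
    var = sum((v - m) ** 2 for v in x) / n
    return sum((v - m) ** 4 for v in x) / (n * var * var) - 3.0

n = 1000
neural = [math.sin(2 * math.pi * t / 25) for t in range(n)]  # mu-like rhythm
blink  = [5.0 if t % 200 < 5 else 0.0 for t in range(n)]     # spiky ocular artifact
A = [[1.0, 0.8], [0.6, 1.0]]                                 # hypothetical mixing matrix
mixed = [A[0][0] * neural[t] + A[0][1] * blink[t] for t in range(n)]  # scalp channel 1

# Assume the unmixing step recovered the components exactly; the selection step
# (a GA in the paper, a plain threshold here) keeps only low-kurtosis components
# and reconstructs the channel from them.
components = [neural, blink]
keep_idx = [j for j, s in enumerate(components) if abs(kurtosis(s)) < 2.0]
cleaned = [sum(A[0][j] * components[j][t] for j in keep_idx) for t in range(n)]
print(keep_idx, round(max(cleaned), 3), round(max(mixed), 3))
```

The sinusoid's excess kurtosis is about -1.5 and the sparse blink train's is far above the threshold, so only the rhythmic component survives and the reconstructed channel loses the artifact spikes.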
|Title: Fault Detection of The Tennessee Eastman Process Using Improved PCA and Neural Classifier|
|Author(s): Mostafa Noruzi Nashalji, Mahdi Aliyari Shoorehdeli, Mohammad Teshnehlab|
|Pages: 54-59||Paper ID: 1967091-IJECS-IJENS||Published: October, 2009|
Abstract: This paper describes a hybrid multivariate method: Principal Component Analysis (PCA) improved by a Genetic Algorithm (GA). The method determines the main principal components, which a neural classifier then uses to detect faults during the operation of an industrial process. The technique is applied to simulated data collected from the Tennessee Eastman chemical plant simulator, which was designed to simulate a wide variety of faults occurring in a chemical plant based on a facility at Eastman Chemical.
|Keywords: Artificial neural network, Fault detection, Genetic algorithm, Multilayer perceptron, Principal component analysis, Tennessee Eastman process.|
|Full Text (.pdf) | 271 KB|
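The PCA stage of such a fault detector can be sketched in two dimensions, flagging a fault when the squared prediction error (the residual off the principal component) exceeds an empirical control limit. This omits the paper's GA-based component selection and neural classifier, and all data below are synthetic.

```python
import math, random

def first_pc(data):
    """Mean and leading principal component of 2-D data (analytic 2x2 eigenproblem)."""
    n = len(data)
    m1 = sum(p[0] for p in data) / n
    m2 = sum(p[1] for p in data) / n
    a = sum((p[0] - m1) ** 2 for p in data) / n
    b = sum((p[0] - m1) * (p[1] - m2) for p in data) / n
    c = sum((p[1] - m2) ** 2 for p in data) / n
    l1 = (a + c + math.sqrt((a - c) ** 2 + 4 * b * b)) / 2
    v = (b, l1 - a)
    norm = math.hypot(v[0], v[1])
    return (m1, m2), (v[0] / norm, v[1] / norm)

def spe(point, mean, pc):
    """Squared prediction error: squared residual after projecting onto the PC."""
    d1, d2 = point[0] - mean[0], point[1] - mean[1]
    proj = d1 * pc[0] + d2 * pc[1]
    return (d1 - proj * pc[0]) ** 2 + (d2 - proj * pc[1]) ** 2

random.seed(2)
# Hypothetical "normal operation" data: two correlated process variables.
xs = [random.gauss(0, 1) for _ in range(500)]
normal = [(x, 0.5 * x + random.gauss(0, 0.05)) for x in xs]
mean, pc = first_pc(normal)
scores = sorted(spe(p, mean, pc) for p in normal)
limit = scores[int(0.99 * len(scores))]   # empirical 99% control limit

fault = (0.5, 2.0)                        # breaks the learned correlation
print(spe(fault, mean, pc) > limit)
```

A fault that breaks the correlation structure produces a residual orders of magnitude above the limit fitted on normal data, which is the detection principle the PCA stage relies on.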
|Title: Machining Quality Predictions: Comparative Analysis of Neural Network and Fuzzy Logic|
|Author(s): Sivarao, Castillo and Taufik|
|Pages: 60-64||Paper ID: 1958091-IJECS-IJENS||Published: October, 2009|
Abstract: Surface finish is an important objective function in manufacturing engineering. It is a characteristic that can influence the performance of mechanical parts and is also proportional to production cost. It is likewise a consideration in designing mechanical elements and is frequently presented as a quality and precision indicator of manufacturing processes. Various failures, some of them catastrophic and costly, have been attributed to the surface finish of components and have remained unexplained. The quality of surface roughness is therefore an essential feature of the drilling operation, since most hole applications are assembly work, especially those involving relative movement and tight tolerances; hence, high-standard quality control needs to be introduced. The aim of this experimental and analytical research is to identify the parameters that enable the prediction of surface roughness in drilling. Two expert systems were used to find the best-fit model for predicting surface roughness for this specific drill job. The prediction accuracy was then compared to determine which model gives better results, so that it can be recommended for machine learning and future work. From the findings, the Sugeno fuzzy model gives values closer to the measured ones than the ANN model. Thus, the work conditions and fuzzy environment are selected for prediction of surface roughness in drilling.
|Keywords: Sugeno Fuzzy, deep drilling, neural network, fuzzy logic, surface finish.|
|Full Text (.pdf) | 417 KB|