856 results for System analysis - Data processing
Abstract:
Accurately and reliably identifying the actual number of clusters present within a dataset of gene expression profiles, when no additional information on cluster structure is available, is a problem addressed by few algorithms. GeneMCL transforms microarray analysis data into a graph consisting of nodes connected by edges, where the nodes represent genes and the edges represent the similarity in expression of those genes, as given by a proximity measurement. This measurement is taken to be the Pearson correlation coefficient combined with a local non-linear rescaling step. The resulting graph is input to the Markov Cluster (MCL) algorithm, an elegant, deterministic, non-specific and scalable method that models stochastic flow through the graph. The algorithm is inherently affected by any cluster structure present, and rapidly decomposes a graph into cohesive clusters. The potential of the GeneMCL algorithm is demonstrated with a 5730-gene subset (IGS) of the Van't Veer breast cancer database, for which the clusterings are shown to reflect underlying biological mechanisms.
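For readers unfamiliar with MCL, the core iteration alternates expansion (matrix squaring, which spreads flow along longer paths) with inflation (element-wise powering and renormalisation, which strengthens strong flows and prunes weak ones). Below is a minimal sketch, assuming a dense numpy similarity matrix built from the rescaled Pearson coefficients; the inflation value and attractor-based cluster read-out are illustrative, not GeneMCL's exact implementation.

```python
import numpy as np

def mcl(similarity, inflation=2.0, max_iter=100, tol=1e-6):
    """Minimal Markov Cluster (MCL) iteration on a gene-similarity graph."""
    n = similarity.shape[0]
    M = similarity + np.eye(n)        # self-loops stabilise the iteration
    M = M / M.sum(axis=0)             # column-stochastic flow matrix
    for _ in range(max_iter):
        last = M
        M = M @ M                     # expansion: flow spreads through the graph
        M = M ** inflation            # inflation: strong flows are favoured...
        M = M / M.sum(axis=0)         # ...weak ones decay on renormalisation
        if np.abs(M - last).max() < tol:
            break
    # each surviving (attractor) row lists the members of one cluster
    return [sorted(c) for c in
            {frozenset(np.flatnonzero(r > 1e-9)) for r in M if r.max() > 1e-9}]
```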
Abstract:
We present a kinetic model for transformations between different self-assembled lipid structures. The model shows how data on the rates of phase transitions between mesophases of different geometries can be used to provide information on the mechanisms of the transformations and the transition states involved. This can be used, for example, to gain an insight into intermediate structures in cell membrane fission or fusion. In cases where the monolayer curvature changes on going from the initial to the final mesophase, we consider the phase transition to be driven primarily by the change in the relaxed curvature with pressure or temperature, which alters the relative curvature elastic energies of the two mesophase structures. Using this model, we have analyzed previously published kinetic data on the inter-conversion of inverse bicontinuous cubic phases in the 1-monoolein-30 wt% water system. The data are for a transition between QII(G) and QII(D) phases, and our analysis indicates that the transition state more closely resembles the QII(D) than the QII(G) phase. Using estimated values for the monolayer mean curvatures of the QII(G) and QII(D) phases of -0.123 nm⁻¹ and -0.133 nm⁻¹, respectively, gives values for the monolayer mean curvature of the transition state of between -0.131 nm⁻¹ and -0.132 nm⁻¹. Furthermore, we estimate that several thousand molecules undergo the phase transition cooperatively within one "cooperative unit", equivalent to 1-2 unit cells of QII(G) or 4-10 unit cells of QII(D).
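As background, the curvature elastic energy in such models is presumably of the standard Helfrich form, and the curvatures quoted above place the transition state roughly 80% of the way from QII(G) to QII(D); a sketch of that arithmetic (κ_m is the monolayer bending modulus, H₀ the relaxed curvature):

```latex
g_c = 2\kappa_m\,(H - H_0)^2, \qquad
\frac{H^{\ddagger} - H_{G}}{H_{D} - H_{G}}
  = \frac{-0.131 - (-0.123)}{-0.133 - (-0.123)} \approx 0.8
```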
Abstract:
This article introduces a quantitative approach to e-commerce system evaluation based on the theory of process simulation. The general concept of e-commerce system simulation is motivated by some limitations of e-commerce system development: the huge initial investment of time and money, and the long period from business planning through system development, testing and operation to actual returns. In other words, currently used system analysis and development methods cannot tell investors how good their e-commerce system could be, what return on investment they could expect, and which areas of the initial business plan they should improve. In order to examine the value and potential effects of an e-commerce business plan, a quantitative evaluation approach is necessary, and the authors of this article believe that process simulation is an appropriate option. The overall objective of this article is to apply the theory of process simulation to e-commerce system evaluation, which the authors achieve through an experimental study of a business plan for online construction and demolition waste exchange. The methodologies adopted include literature review, system analysis and development, simulation modelling and analysis, and case study. The results include the concept of e-commerce system simulation, a comprehensive review of simulation methods adopted in e-commerce system evaluation, and a real case study applying simulation to e-commerce system evaluation. The authors hope that the adoption and implementation of the process simulation approach can effectively support business decision-making and improve the efficiency of e-commerce systems.
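To make "process simulation" concrete, the sketch below models a toy online waste-exchange as a discrete-event process using the simpy library; the arrival rate, matching capacity and deal-completion time are invented for illustration and are not the paper's model or data.

```python
import random
import simpy

def trader(env, marketplace, stats):
    """A hypothetical trader posting a listing and completing an exchange."""
    with marketplace.request() as slot:                  # contend for matching capacity
        yield slot
        yield env.timeout(random.expovariate(1 / 5.0))   # minutes to close a deal
        stats["deals"] += 1

def arrivals(env, marketplace, stats, rate_per_min=2.0):
    while True:
        yield env.timeout(random.expovariate(rate_per_min))  # next trader arrives
        env.process(trader(env, marketplace, stats))

stats = {"deals": 0}
env = simpy.Environment()
marketplace = simpy.Resource(env, capacity=10)           # illustrative system capacity
env.process(arrivals(env, marketplace, stats))
env.run(until=8 * 60)                                    # one simulated working day
print("completed exchanges:", stats["deals"])
```

Measures such as throughput or waiting time from runs like this are the kind of quantitative evidence the authors argue a business plan should be evaluated on before any system is built.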
Abstract:
Emergency vehicles use high-amplitude sirens to warn pedestrians and other road users of their presence. Unfortunately, the siren noise enters the vehicle and corrupts the intelligibility of two-way radio voice communications from the emergency vehicle to a control room. Often the siren has to be turned off to enable the control room to hear what is being said, which subsequently endangers people's lives. A digital signal processing (DSP) based system for the cancellation of siren noise embedded within speech is presented. The system has been tested with the least mean square (LMS), normalised least mean square (NLMS) and affine projection algorithm (APA) using recordings of three common types of siren (two-tone, wail and yelp) from actual test vehicles. It was found that the APA with a projection order of 2 gives comparably improved cancellation over the LMS and NLMS with only a moderate increase in algorithm complexity and code size. This siren noise cancellation system using the APA therefore improves on the cancellation achieved by previous systems. The removal of the siren noise improves the response time for the emergency vehicle, and thus the system can contribute to saving lives. The system also allows voice communication to take place even when the siren is on, so the vehicle poses less risk of danger when moving at high speed in heavy traffic.
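Of the algorithms tested, NLMS is the simplest to sketch. Below is a minimal adaptive canceller in that style, assuming a separate siren-only reference signal is available; the tap count and step size are illustrative.

```python
import numpy as np

def nlms_cancel(reference, primary, n_taps=64, mu=0.5, eps=1e-6):
    """NLMS adaptive noise canceller sketch.

    reference: siren-only signal (e.g. tapped from the siren drive)
    primary:   microphone signal = speech + siren picked up in the cab
    Returns the error signal, which is the speech estimate.
    """
    w = np.zeros(n_taps)
    speech = np.zeros(len(primary))
    for n in range(n_taps, len(primary)):
        x = reference[n - n_taps:n][::-1]   # most recent reference samples first
        y = w @ x                           # estimate of the siren at the mic
        e = primary[n] - y                  # residual = speech estimate
        w += (mu / (eps + x @ x)) * e * x   # normalised LMS weight update
        speech[n] = e
    return speech
```

The APA generalises this update by projecting over the last few input vectors (projection order 2 in the paper), which speeds convergence on strongly correlated siren tones at a modest cost in complexity.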
Abstract:
Mounted on the sides of two widely separated spacecraft, the two Heliospheric Imager (HI) instruments onboard NASA’s STEREO mission view, for the first time, the space between the Sun and Earth. These instruments are wide-angle visible-light imagers that incorporate sufficient baffling to eliminate scattered light to the extent that the passage of solar coronal mass ejections (CMEs) through the heliosphere can be detected. Each HI instrument comprises two cameras, HI-1 and HI-2, which have 20° and 70° fields of view and are off-pointed from the Sun direction by 14.0° and 53.7°, respectively, with their optical axes aligned in the ecliptic plane. This arrangement provides coverage over solar elongation angles from 4.0° to 88.7° at the viewpoints of the two spacecraft, thereby allowing the observation of Earth-directed CMEs along the Sun–Earth line to the vicinity of the Earth and beyond. Given the two separated platforms, this also presents the first opportunity to view the structure and evolution of CMEs in three dimensions. The STEREO spacecraft were launched from Cape Canaveral Air Force Base in late October 2006, and the HI instruments have been performing scientific observations since early 2007. The design, development, manufacture, and calibration of these unique instruments are reviewed in this paper. Mission operations, including the initial commissioning phase and the science operations phase, are described. Data processing and analysis procedures are briefly discussed, and ground-test results and in-orbit observations are used to demonstrate that the performance of the instruments meets the original scientific requirements.
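The quoted elongation coverage follows directly from each camera's off-point angle plus or minus half its field of view:

```latex
\varepsilon_{\mathrm{HI1}} = 14.0^{\circ} \pm 10^{\circ} \;\Rightarrow\; [4.0^{\circ},\,24.0^{\circ}],
\qquad
\varepsilon_{\mathrm{HI2}} = 53.7^{\circ} \pm 35^{\circ} \;\Rightarrow\; [18.7^{\circ},\,88.7^{\circ}]
```

so the two cameras together span 4.0°-88.7°, with an overlap between 18.7° and 24.0°.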
Abstract:
Metabolic stable isotope labeling is increasingly employed for accurate protein (and metabolite) quantitation using mass spectrometry (MS). It provides sample-specific isotopologues that can be used to facilitate comparative analysis of two or more samples. Stable Isotope Labeling by Amino acids in Cell culture (SILAC) has been used for almost a decade in proteomic research and analytical software solutions have been established that provide an easy and integrated workflow for elucidating sample abundance ratios for most MS data formats. While SILAC is a discrete labeling method using specific amino acids, global metabolic stable isotope labeling using isotopes such as ¹⁵N labels the entire element content of the sample, i.e. for ¹⁵N the entire peptide backbone in addition to all nitrogen-containing side chains. Although global metabolic labeling can deliver advantages with regard to isotope incorporation and costs, the requirements for data analysis are more demanding because, for instance for polypeptides, the mass difference introduced by the label depends on the amino acid composition. Consequently, there has been less progress on the automation of the data processing and mining steps for this type of protein quantitation. Here, we present a new integrated software solution for the quantitative analysis of protein expression in differential samples and show the benefits of high-resolution MS data in quantitative proteomic analyses.
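The composition dependence is easy to see: a fully ¹⁵N-labelled peptide is shifted by one ¹⁵N-¹⁴N mass difference per nitrogen atom, and the nitrogen count varies from residue to residue. A small sketch using standard residue nitrogen counts (the function name is ours):

```python
# Nitrogen atoms per residue: one backbone N plus any side-chain nitrogens.
N_PER_RESIDUE = {
    'A': 1, 'C': 1, 'D': 1, 'E': 1, 'F': 1, 'G': 1, 'I': 1, 'L': 1,
    'M': 1, 'P': 1, 'S': 1, 'T': 1, 'V': 1, 'Y': 1,
    'K': 2, 'N': 2, 'Q': 2, 'W': 2, 'H': 3, 'R': 4,
}
DELTA_15N = 0.997035  # mass difference 15N - 14N in daltons

def label_mass_shift(peptide):
    """Mass shift of a fully 15N-labelled peptide versus its 14N form."""
    n_atoms = sum(N_PER_RESIDUE[aa] for aa in peptide)
    return n_atoms, n_atoms * DELTA_15N

# label_mass_shift("SAMPLER") -> (10, ~9.97 Da); a SILAC label, by contrast,
# adds a fixed mass per labelled residue regardless of the rest of the sequence.
```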
Abstract:
Burst timing synchronisation is maintained in a digital data decoder during multiple burst reception in a TDMA system. The data within a multiple burst are streamed into memory storage and data corresponding to a first burst in the series of bursts are selected on the basis of a current timing estimate derived from a synchronisation burst. Selections of data corresponding to other bursts in the series of bursts are modified in accordance with updated timing estimates derived from previously processed bursts.
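A schematic of the claimed scheme, with `refine` standing in for whatever per-burst timing estimator the decoder applies (e.g. a training-sequence correlator); this is a paraphrase of the abstract, not the patented implementation:

```python
def extract_bursts(samples, burst_len, sync_offset, refine):
    """Slice successive TDMA bursts from the streamed sample memory,
    updating the timing estimate from each processed burst."""
    bursts = []
    offset = sync_offset                      # estimate from the synchronisation burst
    while offset + burst_len <= len(samples):
        burst = samples[offset:offset + burst_len]   # select with current estimate
        bursts.append(burst)
        offset += burst_len + refine(burst)   # corrected estimate for the next burst
    return bursts
```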
Abstract:
Results from an idealized three-dimensional baroclinic life-cycle model are interpreted in a potential vorticity (PV) framework to identify the physical mechanisms by which frictional processes acting in the atmospheric boundary layer modify and reduce the baroclinic development of a midlatitude storm. Considering a life cycle where the only non-conservative process acting is boundary-layer friction, the rate of change of depth-averaged PV within the boundary layer is governed by frictional generation of PV and the flux of PV into the free troposphere. Frictional generation of PV has two contributions: Ekman generation, which is directly analogous to the well-known Ekman-pumping mechanism for barotropic vortices, and baroclinic generation, which depends on the turning of the wind in the boundary layer and low-level horizontal temperature gradients. It is usually assumed, at least implicitly, that an Ekman process of negative PV generation is the mechanism whereby friction reduces the strength and growth rates of baroclinic systems. Although there is evidence for this mechanism, it is shown that baroclinic generation of PV dominates, producing positive PV anomalies downstream of the low centre, close to developing warm and cold fronts. These PV anomalies are advected by the large-scale warm conveyor belt flow upwards and polewards, fluxed into the troposphere near the warm front, and then advected westwards relative to the system. The result is a thin band of positive PV in the lower troposphere above the surface low centre. This PV is shown to be associated with a positive static stability anomaly, which Rossby edge wave theory suggests reduces the strength of the coupling between the upper- and lower-level PV anomalies, thereby reducing the rate of baroclinic development. This mechanism, which is a result of the baroclinic dynamics in the frontal regions, is in marked contrast with simple barotropic spin-down ideas. Finally we note the implications of these frictionally generated PV anomalies for cyclone forecasting.
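For reference, in standard PV theory (presumably the framework used here) friction is the only source or sink of PV once diabatic heating is excluded, entering the material PV equation as

```latex
\rho\,\frac{Dq}{Dt} = \nabla\theta \cdot \left(\nabla \times \mathbf{F}\right)
```

where q is the Ertel PV, θ the potential temperature and F the frictional force per unit mass; the Ekman and baroclinic generation terms discussed above are the two boundary-layer contributions of this expression.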
Abstract:
This research paper reports the findings from an international survey of fieldwork practitioners on their use of technology to enhance fieldwork teaching and learning. It was found that there was high information technology usage before and after time in the field, but some practitioners were also using portable devices such as smartphones and Global Positioning System (GPS) units whilst out in the field. The main pedagogic reasons cited for the use of technology were the need for efficient data processing and the development of students' technological skills. The influencing factors and barriers to the use of technology, as well as the importance of emerging technologies, are discussed.
Abstract:
The two-way relationship between Rossby wave-breaking (RWB) and the intensification of extratropical cyclones is analysed over the Euro-Atlantic sector. In particular, the timing, intensity and location of cyclone development are related to RWB occurrences. For this purpose, two potential-temperature-based indices are used to detect and classify anticyclonic and cyclonic RWB episodes from ERA-40 reanalysis data. Results show that explosive cyclogenesis over the North Atlantic (NA) is fostered by enhanced occurrence of RWB on days prior to the cyclone’s maximum intensification. Under such conditions, the eddy-driven jet stream is accelerated over the NA, thus enhancing conditions for cyclogenesis. For explosive cyclogenesis over the eastern NA, enhanced cyclonic RWB over eastern Greenland and anticyclonic RWB over the subtropical NA are observed. Typically only one of these is present in any given case, with the RWB over eastern Greenland being more frequent than its southern counterpart. This leads to an intensification of the jet over the eastern NA and an enhanced probability of windstorms reaching Western Europe. Explosive cyclones evolving under simultaneous RWB on both sides of the jet feature higher mean intensities and deepening rates than cyclones preceded by a single RWB event. Explosive developments over the western NA are typically linked to a single area of enhanced cyclonic RWB over western Greenland. Here, the eddy-driven jet is accelerated over the western NA. Enhanced occurrence of cyclonic RWB over southern Greenland and anticyclonic RWB over Europe is also observed after explosive cyclogenesis, potentially leading to the onset of Scandinavian blocking. However, only very intense developments have a considerable influence on the large-scale atmospheric flow. Non-explosive cyclones show no sign of enhanced RWB over the whole NA area. We conclude that the links between RWB and cyclogenesis over the Euro-Atlantic sector are sensitive to the cyclone’s maximum intensity, deepening rate and location.
Abstract:
Neurovascular coupling in response to stimulation of the rat barrel cortex was investigated using concurrent multichannel electrophysiology and laser Doppler flowmetry. The data were used to build a linear dynamic model relating neural activity to blood flow. Local field potential time series were subjected to current source density analysis, and the time series of a layer IV sink of the barrel cortex was used as the input to the model. The model output was the time series of the changes in regional cerebral blood flow (CBF). We show that this model can provide an excellent fit to the CBF responses for stimulus durations of up to 16 s. The structure of the model consisted of two coupled components representing vascular dilation and constriction. The complex temporal characteristics of the CBF time series were reproduced by the relatively simple balance of these two components. We show that the impulse response obtained under the 16-s duration stimulation condition generalised to provide a good prediction of the data from the shorter duration stimulation conditions. Furthermore, by optimising three out of the total of nine model parameters, the variability in the data can be well accounted for over a wide range of stimulus conditions. By establishing linearity, classic system analysis methods can be used to generate and explore a range of equivalent model structures (e.g., feed-forward or feedback) to guide the experimental investigation of the control of vascular dilation and constriction following stimulation.
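A sketch of such a linear two-component model, with the coupling simplified to a parallel difference of a fast dilating kernel and a slower constricting kernel; the parameter names and values are assumptions for illustration, not the paper's fitted values.

```python
import numpy as np

def cbf_response(neural_input, dt=0.1,
                 a_dil=1.0, tau_dil=2.0,    # dilation: fast, dominant
                 a_con=0.4, tau_con=8.0):   # constriction: slower, opposing
    """Predicted CBF change = neural input (e.g. a layer IV CSD sink time
    series) convolved with a dilation-minus-constriction impulse response."""
    t = np.arange(0.0, 30.0, dt)
    kernel = a_dil * np.exp(-t / tau_dil) - a_con * np.exp(-t / tau_con)
    return dt * np.convolve(neural_input, kernel)[:len(neural_input)]
```

Because the model is linear, the same kernel fitted on the 16-s stimuli can be convolved with any shorter stimulus train, which is exactly the generalisation test described above.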
Abstract:
This paper describes the techniques used to obtain sea surface temperature (SST) retrievals from the Geostationary Operational Environmental Satellite 12 (GOES-12) at the National Oceanic and Atmospheric Administration’s Office of Satellite Data Processing and Distribution. Previous SST retrieval techniques relying on channels at 11 and 12 μm are not applicable because GOES-12 lacks the latter channel. Cloud detection is performed using a Bayesian method exploiting fast forward modeling of prior clear-sky radiances using numerical weather predictions. The basic retrieval algorithm used at nighttime is based on a linear combination of brightness temperatures at 3.9 and 11 μm. In comparison with traditional split-window SSTs (using 11- and 12-μm channels), simulations show that this combination has maximum scatter when observing drier, colder scenes, with a comparable overall performance. For daytime retrieval, the same algorithm is applied after estimating and removing the contribution to brightness temperature in the 3.9-μm channel from solar irradiance. The correction is based on radiative transfer simulations and comprises a parameterization for atmospheric scattering and a calculation of ocean surface reflected radiance. Potential use of the 13-μm channel for SST is shown in a simulation study: in conjunction with the 3.9-μm channel, it can reduce the retrieval error by 30%. Some validation results are shown, while a companion paper by Maturi et al. presents a detailed analysis of the validation results for the operational algorithms described in the present article.
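A sketch of the nighttime retrieval's form, a linear combination of the two brightness temperatures; the coefficients below are placeholders (operationally such coefficients are fitted by regression, often with a path-length term), and for daytime use the 3.9-μm temperature must first have the solar contribution removed as described above.

```python
def sst_night(bt39, bt11, sec_zenith=1.0,
              a0=-15.0, a1=0.65, a2=0.40, a3=0.1):
    """Nighttime SST (K) from 3.9- and 11-um brightness temperatures (K).
    Coefficient values are illustrative placeholders, not the operational fit."""
    return a0 + a1 * bt39 + a2 * bt11 + a3 * (sec_zenith - 1.0)
```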
Abstract:
Communication signal processing applications often involve complex-valued (CV) functional representations for signals and systems. CV artificial neural networks have been studied theoretically and applied widely in nonlinear signal and data processing [1–11]. Note that most artificial neural networks cannot be automatically extended from the real-valued (RV) domain to the CV domain, because the resulting model would in general violate the Cauchy-Riemann conditions, which means that the training algorithms become unusable. A number of analytic functions were introduced for fully CV multilayer perceptrons (MLP) [4]. A fully CV radial basis function (RBF) network was introduced in [8] for regression and classification applications. Alternatively, the problem can be avoided by using two RV artificial neural networks, one processing the real part and the other processing the imaginary part of the CV signal/system. An even more challenging problem is the inverse of a CV
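The split approach mentioned above can be sketched as follows: two real-valued regressors, one fitting the real part and one the imaginary part of the CV mapping, so the Cauchy-Riemann constraint never arises. The `real_net`/`imag_net` objects stand in for any RV model exposing fit/predict; the class is ours, for illustration only.

```python
import numpy as np

class SplitComplexModel:
    """Model a CV mapping with two RV networks (real and imaginary parts)."""
    def __init__(self, real_net, imag_net):
        self.re_net, self.im_net = real_net, imag_net

    def fit(self, x, y):                       # x, y: complex-valued arrays
        f = np.column_stack([x.real, x.imag])  # RV features from the CV input
        self.re_net.fit(f, y.real)             # one RV network per component
        self.im_net.fit(f, y.imag)
        return self

    def predict(self, x):
        f = np.column_stack([x.real, x.imag])
        return self.re_net.predict(f) + 1j * self.im_net.predict(f)
```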
Abstract:
Human ICT implants, such as RFID implants, cochlear implants, cardiac pacemakers, Deep Brain Stimulation, bionic limbs connected to the nervous system, and networked cognitive prostheses, are becoming increasingly complex. With ever-growing data processing functionalities in these implants, privacy and security become vital concerns. Electronic attacks on human ICT implants can cause significant harm, both to implant subjects and to their environment. This paper explores the vulnerabilities which human implants pose to crime victimisation in light of recent technological developments, and analyses how the law can deal with emerging challenges of what may well become the next generation of cybercrime: attacks targeted at technology implanted in the human body. After a state-of-the-art description of relevant types of human implants and a discussion of how these implants challenge existing perceptions of the human body, we describe how various modes of attacks, such as sniffing, hacking, data interference, and denial of service, can be committed against implants. Subsequently, we analyse how these attacks can be assessed under current substantive and procedural criminal law, drawing on examples from UK and Dutch law. The possibilities and limitations of cybercrime provisions (e.g., unlawful access, system interference) and bodily integrity provisions (e.g., battery, assault, causing bodily harm) to deal with human-implant attacks are analysed. Based on this assessment, the paper concludes that attacks on human implants are not only a new generation in the evolution of cybercrime, but also raise fundamental questions on how criminal law conceives of attacks. Traditional distinctions between physical and non-physical modes of attack, between human bodies and things, and between the exterior and interior of the body need to be re-interpreted in light of developments in human implants. As the human body and technology become increasingly intertwined, cybercrime legislation and body-integrity crime legislation will also become intertwined, posing a new puzzle that legislators and practitioners will sooner or later have to solve.
Abstract:
SOA (Service Oriented Architecture), workflow, the Semantic Web, and Grid computing are key enabling information technologies in the development of increasingly sophisticated e-Science infrastructures and application platforms. While the emergence of Cloud computing as a new computing paradigm has provided new directions and opportunities for e-Science infrastructure development, it also presents some challenges. Scientific research is increasingly finding it difficult to handle "big data" using traditional data processing techniques. Such challenges demonstrate the need for a comprehensive analysis of how the above-mentioned informatics techniques can be used to develop appropriate e-Science infrastructures and platforms in the context of Cloud computing. This survey paper describes recent research advances in applying informatics techniques to facilitate scientific research, particularly from the Cloud computing perspective. Our particular contributions include identifying associated research challenges and opportunities, presenting lessons learned, and describing our future vision for applying Cloud computing to e-Science. We believe our research findings can help indicate the future trend of e-Science and can inform funding and research directions on how to more appropriately employ computing technologies in scientific research. We point out open research issues in the hope of sparking new development and innovation in the e-Science field.