970 results for Pos data processing


Relevance: 80.00%

Abstract:

A method is presented for determining the time to first division of individual bacterial cells growing on agar media. Bacteria were inoculated onto agar-coated slides and viewed by phase-contrast microscopy. Digital images of the growing bacteria were captured at intervals and the time to first division estimated by calculating the "box area ratio". This is the area of the smallest rectangle that can be drawn around an object, divided by the area of the object itself. The box area ratios of cells were found to increase suddenly during growth at a time that correlated with cell division as estimated by visual inspection of the digital images. This was caused by a change in the orientation of the two daughter cells that occurred when sufficient flexibility arose at their point of attachment. This method was used successfully to generate lag time distributions for populations of Escherichia coli, Listeria monocytogenes and Pseudomonas aeruginosa, but did not work with the coccoid organism Staphylococcus aureus. This method provides an objective measure of the time to first cell division, whilst automation of the data processing allows a large number of cells to be examined per experiment. (c) 2005 Elsevier B.V. All rights reserved.
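The box area ratio described above is straightforward to compute from a segmented cell image. A minimal sketch using NumPy, assuming an axis-aligned bounding rectangle (the abstract does not specify the rectangle's orientation, so this is an assumption):

```python
import numpy as np

def box_area_ratio(mask):
    """Area of the smallest axis-aligned rectangle enclosing an object,
    divided by the area of the object itself (given as a binary mask)."""
    rows, cols = np.nonzero(mask)
    if rows.size == 0:
        raise ValueError("mask contains no object pixels")
    box_area = (rows.max() - rows.min() + 1) * (cols.max() - cols.min() + 1)
    object_area = rows.size
    return box_area / object_area

# A straight rod-shaped cell fills its bounding box (ratio ~1); two daughter
# cells at an angle leave empty corners in the box (ratio > 1).
rod = np.zeros((5, 10), dtype=int)
rod[2, 1:9] = 1            # straight "cell"
bent = np.zeros((10, 10), dtype=int)
bent[5, 1:6] = 1           # one daughter cell, horizontal
bent[1:5, 5] = 1           # the other daughter cell, vertical
print(box_area_ratio(rod))   # 1.0
print(box_area_ratio(bent))  # > 1, signalling a change in orientation
```

A sudden jump in this ratio over a time series of images is then taken as the division event.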

Relevance: 80.00%

Abstract:

External interference can severely degrade the performance of an over-the-horizon radar (OTHR), so suppression of external interference in a strong clutter environment is a prerequisite for target detection. Traditional suppression solutions usually begin with clutter suppression in either the time or the frequency domain, followed by interference detection and suppression. Building on this traditional solution, this paper proposes a method characterized by joint clutter suppression and interference detection: by analyzing the eigenvalues in a short-time moving window centred at different time positions, clutter is suppressed by discarding the three largest eigenvalues at every time position, while detection is achieved by analyzing the remaining eigenvalues at each position. Restoration is then achieved by forward-backward linear prediction using interference-free data surrounding the interference position. In the numerical computation, the eigenvalue decomposition (EVD) is replaced by the singular value decomposition (SVD), based on the equivalence of the two operations. Data processing and experimental results demonstrate the method's effectiveness, lowering the noise floor by about 10-20 dB.
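A minimal sketch of the suppression step, assuming the windowed data are arranged as a Hankel-style trajectory matrix (an assumption; the abstract does not specify the matrix construction) and the three dominant singular values are discarded:

```python
import numpy as np

def suppress_clutter(window, n_discard=3):
    """Suppress dominant clutter in a 1-D data window by zeroing the
    n_discard largest singular values of its Hankel-style trajectory
    matrix, then reconstructing by anti-diagonal averaging."""
    n = window.size
    m = n // 2
    # Trajectory matrix: columns are overlapping snapshots of the window.
    H = np.array([window[i:i + m] for i in range(n - m + 1)]).T
    U, s, Vt = np.linalg.svd(H, full_matrices=False)
    s_res = s.copy()
    s_res[:n_discard] = 0.0            # discard dominant (clutter) components
    H_res = (U * s_res) @ Vt
    # Average over anti-diagonals to recover a 1-D residual signal.
    out = np.zeros(n)
    cnt = np.zeros(n)
    for j in range(H_res.shape[1]):
        out[j:j + m] += H_res[:, j]
        cnt[j:j + m] += 1
    return out / cnt, s                # residual signal, singular values

rng = np.random.default_rng(0)
t = np.arange(256)
clutter = 50 * np.cos(2 * np.pi * 0.01 * t)   # strong narrowband clutter
noise = rng.normal(scale=1.0, size=t.size)
residual, s = suppress_clutter(clutter + noise)
print(np.std(clutter + noise), np.std(residual))  # residual power is far lower
```

In the full method, the remaining singular values at each window position would additionally be screened to flag interference, with flagged segments restored by forward-backward linear prediction.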

Relevance: 80.00%

Abstract:

Mounted on the sides of two widely separated spacecraft, the two Heliospheric Imager (HI) instruments onboard NASA’s STEREO mission view, for the first time, the space between the Sun and Earth. These instruments are wide-angle visible-light imagers that incorporate sufficient baffling to eliminate scattered light to the extent that the passage of solar coronal mass ejections (CMEs) through the heliosphere can be detected. Each HI instrument comprises two cameras, HI-1 and HI-2, which have 20° and 70° fields of view and are off-pointed from the Sun direction by 14.0° and 53.7°, respectively, with their optical axes aligned in the ecliptic plane. This arrangement provides coverage over solar elongation angles from 4.0° to 88.7° at the viewpoints of the two spacecraft, thereby allowing the observation of Earth-directed CMEs along the Sun – Earth line to the vicinity of the Earth and beyond. Given the two separated platforms, this also presents the first opportunity to view the structure and evolution of CMEs in three dimensions. The STEREO spacecraft were launched from Cape Canaveral Air Force Base in late October 2006, and the HI instruments have been performing scientific observations since early 2007. The design, development, manufacture, and calibration of these unique instruments are reviewed in this paper. Mission operations, including the initial commissioning phase and the science operations phase, are described. Data processing and analysis procedures are briefly discussed, and ground-test results and in-orbit observations are used to demonstrate that the performance of the instruments meets the original scientific requirements.
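The quoted elongation coverage follows directly from the stated off-point angles and fields of view; as a quick check:

```python
# Elongation coverage of each HI camera: off-point angle +/- half the field
# of view, using the values quoted in the abstract (degrees).
cameras = {"HI-1": (14.0, 20.0), "HI-2": (53.7, 70.0)}  # (off-point, FOV)
coverage = {name: (off - fov / 2, off + fov / 2)
            for name, (off, fov) in cameras.items()}
print(coverage)
# HI-1 spans 4.0-24.0 deg and HI-2 spans 18.7-88.7 deg; the two fields
# overlap, giving continuous coverage from 4.0 to 88.7 deg of elongation.
```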

Relevance: 80.00%

Abstract:

Metabolic stable isotope labeling is increasingly employed for accurate protein (and metabolite) quantitation using mass spectrometry (MS). It provides sample-specific isotopologues that can be used to facilitate comparative analysis of two or more samples. Stable Isotope Labeling by Amino acids in Cell culture (SILAC) has been used for almost a decade in proteomic research and analytical software solutions have been established that provide an easy and integrated workflow for elucidating sample abundance ratios for most MS data formats. While SILAC is a discrete labeling method using specific amino acids, global metabolic stable isotope labeling using isotopes such as (15)N labels the entire element content of the sample, i.e. for (15)N the entire peptide backbone in addition to all nitrogen-containing side chains. Although global metabolic labeling can deliver advantages with regard to isotope incorporation and costs, the requirements for data analysis are more demanding because, for instance for polypeptides, the mass difference introduced by the label depends on the amino acid composition. Consequently, there has been less progress on the automation of the data processing and mining steps for this type of protein quantitation. Here, we present a new integrated software solution for the quantitative analysis of protein expression in differential samples and show the benefits of high-resolution MS data in quantitative proteomic analyses.
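The composition dependence can be illustrated with a short calculation: for a fully (15)N-labeled peptide, the mass shift relative to the unlabeled form is the peptide's nitrogen count multiplied by the (15)N-(14)N mass difference (about 0.997 Da per nitrogen). The helper below is illustrative only and not part of the software described:

```python
# Nitrogen atoms per residue (one backbone N each, plus side-chain N for
# R, N, Q, K, H and W). Standard amino-acid compositions.
N_PER_RESIDUE = {
    "A": 1, "R": 4, "N": 2, "D": 1, "C": 1, "E": 1, "Q": 2, "G": 1,
    "H": 3, "I": 1, "L": 1, "K": 2, "M": 1, "F": 1, "P": 1, "S": 1,
    "T": 1, "V": 1, "W": 2, "Y": 1,
}
MASS_SHIFT_PER_N = 15.0001089 - 14.0030740  # Da, 15N minus 14N

def n15_mass_shift(peptide):
    """Nitrogen count and full-labeling mass shift for a peptide; unlike
    SILAC, the shift varies with amino-acid composition."""
    n_atoms = sum(N_PER_RESIDUE[aa] for aa in peptide.upper())
    return n_atoms, n_atoms * MASS_SHIFT_PER_N

print(n15_mass_shift("SAMPLER"))  # (10, ~9.97 Da)
print(n15_mass_shift("GAVLI"))    # (5, ~4.99 Da)
```

Two peptides of similar mass can thus carry quite different label-induced shifts, which is why (15)N quantitation software cannot assume the fixed mass offsets that SILAC tools rely on.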

Relevance: 80.00%

Abstract:

Multiple versions of information, and the problems associated with them, are well documented in both academic research and industry best practice. Many solutions have proposed a single version of the truth, with Business Intelligence (BI) being adopted by many organizations. BI, however, is largely based on the collection of data and the processing and presentation of information to meet different stakeholders' requirements. This paper reviews Enterprise Intelligence, which promises to support decision-making based on a defined strategic understanding of the organization's goals and a unified version of the truth.

Relevance: 80.00%

Abstract:

Results from an idealized three-dimensional baroclinic life-cycle model are interpreted in a potential vorticity (PV) framework to identify the physical mechanisms by which frictional processes acting in the atmospheric boundary layer modify and reduce the baroclinic development of a midlatitude storm. Considering a life cycle where the only non-conservative process acting is boundary-layer friction, the rate of change of depth-averaged PV within the boundary layer is governed by frictional generation of PV and the flux of PV into the free troposphere. Frictional generation of PV has two contributions: Ekman generation, which is directly analogous to the well-known Ekman-pumping mechanism for barotropic vortices, and baroclinic generation, which depends on the turning of the wind in the boundary layer and low-level horizontal temperature gradients. It is usually assumed, at least implicitly, that an Ekman process of negative PV generation is the mechanism whereby friction reduces the strength and growth rates of baroclinic systems. Although there is evidence for this mechanism, it is shown that baroclinic generation of PV dominates, producing positive PV anomalies downstream of the low centre, close to developing warm and cold fronts. These PV anomalies are advected by the large-scale warm conveyor belt flow upwards and polewards, fluxed into the troposphere near the warm front, and then advected westwards relative to the system. The result is a thin band of positive PV in the lower troposphere above the surface low centre. This PV is shown to be associated with a positive static stability anomaly, which Rossby edge wave theory suggests reduces the strength of the coupling between the upper- and lower-level PV anomalies, thereby reducing the rate of baroclinic development. This mechanism, which is a result of the baroclinic dynamics in the frontal regions, is in marked contrast with simple barotropic spin-down ideas. 
Finally, we note the implications of these frictionally generated PV anomalies for cyclone forecasting.
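For reference, the frictional source of Ertel PV and its split into the two contributions named above can be written as follows (a standard decomposition, not given explicitly in the abstract; here \(\mathbf{F}\) is the frictional force per unit mass, \(\theta\) potential temperature, \(\rho\) density and \(q\) the Ertel PV):

```latex
\left.\frac{Dq}{Dt}\right|_{\mathrm{fric}}
  = \frac{1}{\rho}\,\nabla\theta\cdot\left(\nabla\times\mathbf{F}\right)
  = \underbrace{\frac{1}{\rho}\,\frac{\partial\theta}{\partial z}\,
      \left(\nabla\times\mathbf{F}\right)_{z}}_{\text{Ekman generation}}
  \;+\;
    \underbrace{\frac{1}{\rho}\,\nabla_{h}\theta\cdot
      \left(\nabla\times\mathbf{F}\right)_{h}}_{\text{baroclinic generation}}
```

The first term vanishes without a vertical curl of the stress, recovering the familiar Ekman spin-down picture; the second term requires horizontal temperature gradients, which is why it acts preferentially in the frontal regions.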

Relevance: 80.00%

Abstract:

This research paper reports the findings from an international survey of fieldwork practitioners on their use of technology to enhance fieldwork teaching and learning. It was found that information technology usage was high before and after time in the field, and that some practitioners were also using portable devices such as smartphones and global positioning system (GPS) devices whilst out in the field. The main pedagogic reasons cited for the use of technology were the need for efficient data processing and the development of students' technological skills. The influencing factors and barriers to the use of technology, as well as the importance of emerging technologies, are discussed.

Relevance: 80.00%

Abstract:

This paper describes the techniques used to obtain sea surface temperature (SST) retrievals from the Geostationary Operational Environmental Satellite 12 (GOES-12) at the National Oceanic and Atmospheric Administration's Office of Satellite Data Processing and Distribution. Previous SST retrieval techniques relying on channels at 11 and 12 μm are not applicable because GOES-12 lacks the latter channel. Cloud detection is performed using a Bayesian method exploiting fast forward modeling of prior clear-sky radiances from numerical weather predictions. The basic retrieval algorithm used at nighttime is based on a linear combination of brightness temperatures at 3.9 and 11 μm. In comparison with traditional split-window SSTs (using the 11- and 12-μm channels), simulations show that this combination has maximum scatter when observing drier, colder scenes, with a comparable overall performance. For daytime retrieval, the same algorithm is applied after estimating and removing the contribution to brightness temperature in the 3.9-μm channel from solar irradiance. The correction is based on radiative transfer simulations and comprises a parameterization for atmospheric scattering and a calculation of ocean surface reflected radiance. Potential use of the 13-μm channel for SST is shown in a simulation study: in conjunction with the 3.9-μm channel, it can reduce the retrieval error by 30%. Some validation results are shown here, while a companion paper by Maturi et al. presents a detailed analysis of the validation results for the operational algorithms described in the present article.
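The nighttime algorithm is a linear combination of the two brightness temperatures. A sketch with placeholder coefficients (the operational values, which would come from regression against in situ matchups, are not given in the abstract):

```python
def nighttime_sst(t39_k, t11_k, a0=0.0, a1=1.2, a2=-0.2):
    """Nighttime SST (kelvin) as a linear combination of the 3.9- and
    11-um brightness temperatures, as described in the abstract.
    The coefficients a0, a1, a2 here are illustrative placeholders,
    not the operational GOES-12 values."""
    return a0 + a1 * t39_k + a2 * t11_k

# With a1 + a2 = 1, the estimate stays on the brightness-temperature scale.
print(nighttime_sst(288.0, 287.0))  # ~288.2 K
```

For daytime use, the same form would be applied only after the solar contribution to the 3.9-μm brightness temperature has been estimated and removed, as the abstract describes.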

Relevance: 80.00%

Abstract:

Communication signal processing applications often involve complex-valued (CV) functional representations for signals and systems. CV artificial neural networks have been studied theoretically and applied widely in nonlinear signal and data processing [1–11]. Note that most artificial neural networks cannot be automatically extended from the real-valued (RV) domain to the CV domain, because the resulting model would in general violate the Cauchy-Riemann conditions, which means that the training algorithms become unusable. A number of analytic functions were introduced for fully CV multilayer perceptrons (MLPs) [4]. A fully CV radial basis function (RBF) network was introduced in [8] for regression and classification applications. Alternatively, the problem can be avoided by using two RV artificial neural networks, one processing the real part and the other processing the imaginary part of the CV signal/system. An even more challenging problem is the inverse of a CV
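The two-RV-network workaround mentioned above can be sketched as follows, with simple least-squares fits standing in for the real-valued networks (an illustrative simplification, not any model from the cited references):

```python
import numpy as np

class SplitComplexModel:
    """Handle a complex-valued mapping with two real-valued models, one
    fitting the real part of the output and one the imaginary part, so
    the Cauchy-Riemann issue never arises. Linear least-squares fits
    stand in for the RV neural networks here."""

    def fit(self, x, y):
        # Stack real and imaginary parts of the CV input as RV features.
        X = np.column_stack([x.real, x.imag, np.ones(x.size)])
        self.w_re, *_ = np.linalg.lstsq(X, y.real, rcond=None)
        self.w_im, *_ = np.linalg.lstsq(X, y.imag, rcond=None)
        return self

    def predict(self, x):
        X = np.column_stack([x.real, x.imag, np.ones(x.size)])
        return X @ self.w_re + 1j * (X @ self.w_im)

rng = np.random.default_rng(1)
x = rng.normal(size=200) + 1j * rng.normal(size=200)
y = (2 - 1j) * x + 0.5j                # a simple CV channel to learn
model = SplitComplexModel().fit(x, y)
err = np.abs(model.predict(x) - y).max()
print(err)                             # near zero for this linear channel
```

Replacing the least-squares fits with two RV MLPs gives the split-complex scheme the text describes; a fully CV network instead models the mapping with a single set of complex weights.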

Relevance: 80.00%

Abstract:

Human ICT implants, such as RFID implants, cochlear implants, cardiac pacemakers, Deep Brain Stimulation, bionic limbs connected to the nervous system, and networked cognitive prostheses, are becoming increasingly complex. With ever-growing data processing functionalities in these implants, privacy and security become vital concerns. Electronic attacks on human ICT implants can cause significant harm, both to implant subjects and to their environment. This paper explores the vulnerabilities which human implants pose to crime victimisation in light of recent technological developments, and analyses how the law can deal with emerging challenges of what may well become the next generation of cybercrime: attacks targeted at technology implanted in the human body. After a state-of-the-art description of relevant types of human implants and a discussion of how these implants challenge existing perceptions of the human body, we describe how various modes of attack, such as sniffing, hacking, data interference, and denial of service, can be committed against implants. Subsequently, we analyse how these attacks can be assessed under current substantive and procedural criminal law, drawing on examples from UK and Dutch law. The possibilities and limitations of cybercrime provisions (e.g., unlawful access, system interference) and bodily integrity provisions (e.g., battery, assault, causing bodily harm) to deal with human-implant attacks are analysed. Based on this assessment, the paper concludes that attacks on human implants are not only a new generation in the evolution of cybercrime, but also raise fundamental questions on how criminal law conceives of attacks. Traditional distinctions between physical and non-physical modes of attack, between human bodies and things, and between the exterior and interior of the body need to be re-interpreted in light of developments in human implants.
As the human body and technology become increasingly intertwined, cybercrime legislation and body-integrity crime legislation will also become intertwined, posing a new puzzle that legislators and practitioners will sooner or later have to solve.