29 results for Input-output analysis
Abstract:
During free walking, gait is automatically adjusted to provide optimal mechanical output and minimal energy expenditure; gait parameters, such as cadence, fluctuate from one stride to the next around average values. These fluctuations have been reported to exhibit long-range correlations and fractal-like patterns, and it has been suggested that the long-range correlations disappear when participants pace themselves to the beep of a metronome. Until now, such fractal fluctuations had only been observed for the stride interval, because no technique existed to adequately analyze an extended period of free walking. The aim of the present study was to measure walking speed (WS), step frequency (SF) and step length (SL) with a high-accuracy (<1 cm) satellite positioning method (global positioning system, GPS) in order to detect long-range correlations in the stride-to-stride fluctuations. Eight participants walked for 30 min under free and constrained (metronome) conditions. Under free walking conditions, detrended fluctuation analysis (DFA) and surrogate data tests showed that the fluctuations of WS, SL and SF exhibited a fractal pattern (i.e., scaling exponent alpha with 0.5 < alpha < 1) in a large majority of participants (7/8). Under constrained (metronome) conditions, SF fluctuations became significantly anti-correlated (alpha < 0.5) in all participants, whereas the scaling exponents of SL and WS were not modified. We conclude that, when the walking pace is controlled by an auditory signal, the feedback loop between the planned movement (at the supraspinal level) and the sensory inputs induces a continual shifting of SF around its mean (persistent anti-correlation), with no effect on the fluctuation dynamics of the other parameters (SL, WS).
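For readers unfamiliar with DFA, the following minimal Python sketch (our illustration, not the study's code; series, box sizes and function names are invented) shows the core computation: integrate the mean-removed series, detrend it within boxes of varying size, and read the scaling exponent alpha off the log-log slope of the fluctuation function.

```python
import numpy as np

def dfa_alpha(x, box_sizes):
    """Estimate the DFA scaling exponent alpha of a 1-D series x.

    alpha ~ 0.5: uncorrelated noise; 0.5 < alpha < 1: persistent,
    fractal-like long-range correlations; alpha < 0.5: anti-correlated.
    """
    y = np.cumsum(x - np.mean(x))            # integrated profile
    fluctuations = []
    for n in box_sizes:
        n_boxes = len(y) // n
        f2 = []
        for i in range(n_boxes):
            seg = y[i * n:(i + 1) * n]
            t = np.arange(n)
            trend = np.polyval(np.polyfit(t, seg, 1), t)   # local linear detrending
            f2.append(np.mean((seg - trend) ** 2))
        fluctuations.append(np.sqrt(np.mean(f2)))
    # slope of log F(n) versus log n is the scaling exponent alpha
    alpha, _ = np.polyfit(np.log(box_sizes), np.log(fluctuations), 1)
    return alpha

# illustrative use on synthetic data: white noise should give alpha near 0.5
rng = np.random.default_rng(0)
print(dfa_alpha(rng.normal(size=2048), box_sizes=[16, 32, 64, 128, 256]))
```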
Abstract:
Until recently, the hard X-ray, phase-sensitive imaging technique called grating interferometry was thought to provide information only in real space. However, using an alternative approach to data analysis, we demonstrated that the angular-resolved ultra-small-angle X-ray scattering distribution can be retrieved from experimental data; thus, reciprocal-space information is accessible by grating interferometry in addition to real-space information. Naturally, the quality of the retrieved data strongly depends on the performance of the analysis procedure, which in this context involves the deconvolution of periodic and noisy data. The aim of this article is to compare several deconvolution algorithms for retrieving the ultra-small-angle X-ray scattering distribution in grating interferometry. We quantitatively compare the performance of three deconvolution procedures (Wiener, iterative Wiener and Lucy-Richardson) on realistically modeled, noisy and periodic input data. The simulations showed that the Lucy-Richardson algorithm is the most reliable and most efficient given the characteristics of the signals in this context. The availability of a reliable data analysis procedure is essential for future developments in grating interferometry.
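A minimal sketch of the Lucy-Richardson scheme, one of the three procedures compared (our 1-D illustration under a known, non-negative point spread function, not the authors' implementation):

```python
import numpy as np

def lucy_richardson(observed, psf, n_iter=50, eps=1e-12):
    """Minimal 1-D Lucy-Richardson deconvolution.

    Iteratively refines an estimate of the true signal given data blurred
    by a known point spread function (psf); both are assumed non-negative,
    as for intensity data.
    """
    psf = psf / psf.sum()
    psf_mirror = psf[::-1]
    estimate = np.full_like(observed, observed.mean(), dtype=float)
    for _ in range(n_iter):
        blurred = np.convolve(estimate, psf, mode='same')
        ratio = observed / (blurred + eps)     # eps avoids division by zero
        estimate *= np.convolve(ratio, psf_mirror, mode='same')
    return estimate
```

For the strictly periodic signals arising in grating interferometry, circular (FFT-based) convolution would replace `np.convolve`; the multiplicative update itself is unchanged.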
Abstract:
This contribution introduces Data Envelopment Analysis (DEA), a performance measurement technique. DEA helps decision makers in the following ways: (1) by calculating an efficiency score, it indicates whether a firm is efficient or has capacity for improvement; (2) by setting target values for inputs and outputs, it calculates by how much input must be decreased or output increased for the firm to become efficient; (3) by identifying the nature of returns to scale, it indicates whether a firm has to decrease or increase its scale (or size) in order to minimise average total cost; (4) by identifying a set of benchmarks, it specifies which other firms' processes should be analysed so that a firm can improve its own practices. This contribution presents the essentials of DEA, alongside a case study that builds an intuitive understanding of its application, and introduces Win4DEAP, a software package that conducts efficiency analysis based on the DEA methodology. The methodological background of DEA is presented for more demanding readers. Finally, four advanced topics of DEA are treated: adjustment to the environment, preferences, sensitivity analysis and time series data.
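To make point (1) concrete, the sketch below solves the standard input-oriented, constant-returns-to-scale DEA envelopment program with SciPy. This is the generic textbook formulation, not the Win4DEAP implementation, and the toy data are invented.

```python
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(inputs, outputs, k):
    """Input-oriented CCR (constant returns to scale) efficiency of unit k.

    inputs: (n_units, n_inputs) array; outputs: (n_units, n_outputs) array.
    Returns theta in (0, 1]; theta = 1 means unit k lies on the frontier.
    """
    n_units, n_in = inputs.shape
    n_out = outputs.shape[1]
    # decision variables: [theta, lambda_1, ..., lambda_n]
    c = np.r_[1.0, np.zeros(n_units)]          # minimize theta
    A_ub = np.zeros((n_in + n_out, 1 + n_units))
    b_ub = np.zeros(n_in + n_out)
    # inputs:  sum_j lambda_j x_ij <= theta * x_ik
    A_ub[:n_in, 0] = -inputs[k]
    A_ub[:n_in, 1:] = inputs.T
    # outputs: sum_j lambda_j y_rj >= y_rk
    A_ub[n_in:, 1:] = -outputs.T
    b_ub[n_in:] = -outputs[k]
    res = linprog(c, A_ub=A_ub, b_ub=b_ub,
                  bounds=[(None, None)] + [(0, None)] * n_units)
    return res.fun

# toy example: 3 firms, 1 input, 1 output; firm 2 is inefficient (theta = 0.8)
X = np.array([[2.0], [4.0], [5.0]])
Y = np.array([[1.0], [2.0], [2.0]])
print([round(ccr_efficiency(X, Y, k), 3) for k in range(3)])
```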
Abstract:
BACKGROUND: Finding genes that are differentially expressed between conditions is an integral part of understanding the molecular basis of phenotypic variation. In past decades, DNA microarrays were used extensively to quantify the abundance of mRNA corresponding to different genes; more recently, high-throughput sequencing of cDNA (RNA-seq) has emerged as a powerful competitor. As the cost of sequencing decreases, it is conceivable that the use of RNA-seq for differential expression analysis will increase rapidly. To exploit the possibilities and address the challenges posed by this relatively new type of data, a number of software packages have been developed specifically for differential expression analysis of RNA-seq data. RESULTS: We conducted an extensive comparison of eleven methods for differential expression analysis of RNA-seq data. All methods are freely available within the R framework and take as input a matrix of counts, i.e., the number of reads mapping to each genomic feature of interest in each of a number of samples. We evaluated the methods on both simulated data and real RNA-seq data. CONCLUSIONS: Very small sample sizes, which are still common in RNA-seq experiments, pose problems for all evaluated methods, and any results obtained under such conditions should be interpreted with caution. For larger sample sizes, the methods combining a variance-stabilizing transformation with the 'limma' method for differential expression analysis perform well under many different conditions, as does the nonparametric SAMseq method.
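As a schematic illustration of the shared input format (a count matrix with genes in rows and samples in columns), here is a deliberately naive Python analogue of a differential-expression scan. None of the eleven evaluated R packages work this way; real analyses should use them.

```python
import numpy as np
from scipy import stats

def naive_de_test(counts, group):
    """Toy differential-expression scan over an RNA-seq count matrix.

    counts: (n_genes, n_samples) integer matrix of reads per feature;
    group:  boolean array marking the samples of condition B.
    Only illustrates the counts-in, p-values-out interface; dedicated
    packages (e.g. limma, SAMseq) model counts far more carefully.
    """
    # library-size normalization to counts per million, then log2
    cpm = counts / counts.sum(axis=0, keepdims=True) * 1e6
    logcpm = np.log2(cpm + 1.0)                 # +1 stabilizes low counts
    a, b = logcpm[:, ~group], logcpm[:, group]
    _, pvals = stats.ttest_ind(a, b, axis=1)    # one test per gene
    return pvals
```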
Abstract:
Some patients infected with human immunodeficiency virus (HIV) who are experiencing antiretroviral treatment failure have persistent improvement in CD4+ T cell counts despite high plasma viremia. To explore the mechanisms responsible for this phenomenon, 2 parameters influencing the dynamics of CD4+ T cells were evaluated: death of mature CD4+ T cells and replenishment of the CD4+ T cell pool by the thymus. The improvement in CD4+ T cells observed in patients with treatment failure was not correlated with spontaneous, Fas ligand-induced, or activation-induced T cell death. In contrast, a significant correlation between the improvement in CD4+ T cell counts and thymic output, as assessed by measurement of T cell receptor excision circles, was observed. These observations suggest that increased thymic output contributes to the dissociation between CD4+ T cell counts and viremia in patients failing antiretroviral therapy and support a model in which drug-resistant HIV strains may have reduced replication rates and pathogenicity in the thymus.
Abstract:
OBJECTIVE: The measurement of cardiac output (CO) is a key element in the assessment of cardiac function. Recently, a pulse contour analysis-based device that requires no calibration became available (FloTrac/Vigileo, Edwards Lifesciences, Irvine, CA). This study was conducted to determine whether the arterial catheter site has an impact and to investigate the accuracy of this system compared with the pulmonary artery catheter using the bolus thermodilution technique (PAC). DESIGN: Prospective study. SETTING: The operating room of a university hospital. PARTICIPANTS: Twenty patients undergoing cardiac surgery. INTERVENTIONS: CO was determined in parallel with FloTrac/Vigileo systems in the radial and femoral positions (CO_rad and CO_fem) and with the PAC as the reference method. Data triplets were recorded at defined time points. The primary endpoint was the comparison of CO_rad and CO_fem; the secondary endpoint was the comparison with the PAC. MEASUREMENTS AND MAIN RESULTS: Seventy-eight simultaneous data recordings were obtained. The Bland-Altman analysis for CO_fem versus CO_rad showed a bias of 0.46 L/min, a precision of 0.85 L/min and a percentage error of 34%. The Bland-Altman analysis for CO_rad versus PAC showed a bias of -0.35 L/min, a precision of 1.88 L/min and a percentage error of 76%. The Bland-Altman analysis for CO_fem versus PAC showed a bias of 0.11 L/min, a precision of 1.8 L/min and a percentage error of 69%. CONCLUSION: The FloTrac/Vigileo system did not produce exactly the same CO data when used in the radial and femoral arteries, even though the percentage error was close to the clinically acceptable range; thus, the impact of the introduction site of the arterial catheter is not negligible. The agreement with thermodilution was low.
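For reference, the Bland-Altman quantities reported above can be computed as in the sketch below. We assume "precision" denotes the standard deviation of the paired differences and that the percentage error follows the Critchley-and-Critchley convention (1.96 SD over mean CO); this is a common reading, not the paper's stated formula.

```python
import numpy as np

def bland_altman(co_a, co_b):
    """Bland-Altman agreement statistics for two cardiac output methods.

    Returns bias (mean difference), precision (SD of the differences,
    assumed convention) and the percentage error 1.96 * SD / mean CO * 100.
    """
    co_a, co_b = np.asarray(co_a, float), np.asarray(co_b, float)
    diff = co_a - co_b
    bias = diff.mean()
    precision = diff.std(ddof=1)
    pct_error = 1.96 * precision / np.mean((co_a + co_b) / 2) * 100
    return bias, precision, pct_error
```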
Abstract:
Neural comparisons of bilateral sensory inputs are essential for visual depth perception and accurate localization of sounds in space. All animals, from single-cell prokaryotes to humans, orient themselves in response to environmental chemical stimuli, but the contribution of spatial integration of neural activity in olfaction remains unclear. We investigated this problem in Drosophila melanogaster larvae. Using high-resolution behavioral analysis, we studied the chemotaxis behavior of larvae with a single functional olfactory neuron on either the left or right side of the head, allowing us to examine unilateral or bilateral olfactory input. We developed new spectroscopic methods to create stable odorant gradients in which odor concentrations were experimentally measured. In these controlled environments, we observed that a single functional neuron provided sufficient information to permit larval chemotaxis. We found additional evidence that the overall accuracy of navigation is enhanced by the increase in the signal-to-noise ratio conferred by bilateral sensory input.
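The signal-to-noise argument in the last sentence can be illustrated with a toy simulation (ours, unrelated to the larval data): averaging two sensors with independent noise improves the SNR by roughly a factor of sqrt(2).

```python
import numpy as np

rng = np.random.default_rng(1)
signal = np.sin(np.linspace(0, 4 * np.pi, 1000))   # stand-in for an odor gradient
noise_sd = 0.5

def snr(estimate, truth):
    """Ratio of signal spread to residual-noise spread."""
    return truth.std() / (estimate - truth).std()

left = signal + rng.normal(0, noise_sd, signal.size)
right = signal + rng.normal(0, noise_sd, signal.size)
print(snr(left, signal))                  # one sensor
print(snr((left + right) / 2, signal))    # two sensors: ~sqrt(2) higher
```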
Abstract:
Measuring school efficiency is a challenging task. First, a performance measurement technique has to be selected. Within Data Envelopment Analysis (DEA), one such technique, alternative models have been developed to deal with environmental variables, and the majority of these models lead to diverging results. Second, the choice of the input and output variables to be included in the efficiency analysis is often dictated by data availability, and the choice of variables remains an issue even when data are available. As a result, the choice of technique, model and variables is ultimately, and probably unavoidably, a political judgement. Multi-criteria decision analysis methods can help decision makers select the most suitable model. The number of selection criteria should remain parsimonious and should not be oriented towards the results of the models, in order to avoid opportunistic behaviour; the criteria should also be backed by the literature or by an expert group. Once the most suitable model is identified, the principle of permanence of methods should be applied in order to avoid a change of practices over time. Within DEA, the two-stage model developed by Ray (1991) is the most convincing model allowing for an environmental adjustment: an efficiency analysis is conducted with DEA, followed by an econometric analysis to explain the efficiency scores. An environmental variable of particular interest, tested in this thesis, is whether a school operates on multiple sites. Results show that being located on more than one site has a negative influence on efficiency. A likely way to mitigate this negative influence would be to improve the use of ICT in school management and teaching; the planning of new schools should also consider the advantages of a single site, which allows a critical size in terms of pupils and teachers to be reached. The fact that underprivileged pupils perform worse than privileged pupils has been public knowledge since Coleman et al. (1966); as a result, underprivileged pupils have a negative influence on school efficiency. This is confirmed by this thesis for the first time in Switzerland. Several countries have developed priority education policies to compensate for the negative impact of disadvantaged socioeconomic status on school performance. These policies have failed. As a result, other actions need to be taken, and defining them requires identifying the social-class differences that explain why disadvantaged children underperform. Childrearing and literacy practices, health characteristics, housing stability and economic security all influence pupil achievement. Rather than allocating more resources to schools, policymakers should therefore focus on related social policies, for instance pre-school, family, health, housing and benefits policies, in order to improve the conditions of disadvantaged children.
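The second stage of the Ray (1991) two-stage model described above can be sketched as follows, with invented numbers: first-stage DEA efficiency scores are regressed on an environmental dummy marking multi-site schools, and a negative coefficient corresponds to the reported negative influence.

```python
import numpy as np

# hypothetical first-stage DEA scores and a multi-site indicator
# (1 = school operates on more than one site); data are illustrative only
scores = np.array([1.00, 0.92, 0.85, 0.78, 0.95, 0.70])
multi_site = np.array([0, 0, 1, 1, 0, 1], dtype=float)

# ordinary least squares: scores ~ intercept + multi_site
X = np.column_stack([np.ones_like(multi_site), multi_site])
beta, *_ = np.linalg.lstsq(X, scores, rcond=None)
print(f"intercept={beta[0]:.3f}, multi-site effect={beta[1]:.3f}")
# here the multi-site coefficient comes out negative, mirroring the
# thesis finding that multi-site operation lowers measured efficiency
```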
Abstract:
A recurring task in the analysis of mass genome annotation data from high-throughput technologies is the identification of peaks or clusters in a noisy signal profile. Examples of such applications are the definition of promoters on the basis of transcription start site profiles, the mapping of transcription factor binding sites based on ChIP-chip data and the identification of quantitative trait loci (QTL) from whole-genome SNP profiles. The input to such an analysis is a set of genome coordinates associated with counts or intensities; the output consists of a discrete number of peaks with respective volumes, extensions and center positions. For this purpose we have developed a flexible one-dimensional clustering tool, called MADAP, which we make available as a web server and as a standalone program. A set of parameters enables the user to customize the procedure to a specific problem. The web server, which returns results in textual and graphical form, is useful for small- to medium-scale applications, as well as for evaluation and parameter tuning in view of large-scale applications, which require a local installation. The program, written in C++, can be freely downloaded from ftp://ftp.epd.unil.ch/pub/software/unix/madap. The MADAP web server can be accessed at http://www.isrec.isb-sib.ch/madap/.
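To illustrate the input/output shape described above (coordinates plus counts in; peaks with volume, extension and center out), here is a minimal greedy 1-D clustering sketch. It is not the MADAP algorithm, and the `gap` parameter is invented.

```python
import numpy as np

def cluster_1d(positions, counts, gap=100):
    """Greedy 1-D clustering of genome coordinates with counts.

    Coordinates closer than `gap` are merged into one cluster; each
    cluster is reported with its volume (total count), extension (span)
    and count-weighted center, mirroring the output fields of MADAP.
    """
    positions = np.asarray(positions)
    counts = np.asarray(counts)
    order = np.argsort(positions)
    positions, counts = positions[order], counts[order]
    clusters, start = [], 0
    for i in range(1, len(positions) + 1):
        if i == len(positions) or positions[i] - positions[i - 1] > gap:
            pos, cnt = positions[start:i], counts[start:i]
            clusters.append({
                "center": float(np.average(pos, weights=cnt)),
                "volume": int(cnt.sum()),
                "extension": int(pos[-1] - pos[0]),
            })
            start = i
    return clusters

# e.g. two well-separated pileups of tag counts
print(cluster_1d([100, 130, 150, 5000, 5040], [3, 10, 4, 7, 2]))
```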
Abstract:
The measurement of fat balance (fat input minus fat output) involves the accurate estimation of both metabolizable fat intake and total fat oxidation. This is possible mostly under laboratory conditions and not yet in free-living conditions; in the latter situation, net fat retention or mobilization can be estimated from precise and accurate sequential body composition measurements. In the case of positive balance, the lipids stored in adipose tissue can originate from dietary (exogenous) lipids or from nonlipid precursors, mainly carbohydrates (CHO) but also ethanol, through a process known as de novo lipogenesis (DNL). Basic equations are provided in this review to facilitate the interpretation of the different subcomponents of fat balance (endogenous vs exogenous) under different nutritional circumstances. One difficulty is methodological: total DNL is difficult to measure quantitatively in humans; for example, indirect calorimetry only tracks net DNL, not total DNL. Although the numerous factors (mostly exogenous) influencing DNL have been studied, in particular the effect of CHO overfeeding, there is little information on the rate of DNL under habitual conditions of life, that is, with large day-to-day fluctuations of CHO intake, different types of ingested CHO with different glycemic indexes, alcohol combined with excess CHO intake, and so on. Three issues that are still controversial today are addressed: (1) Is the increase in fat mass induced by CHO overfeeding explained by DNL only, by decreased endogenous fat oxidation, or by both? (2) Does DNL differ between overweight or obese individuals and their lean counterparts? (3) Does DNL occur both in the liver and in adipose tissue? Recent studies have demonstrated that acute CHO overfeeding influences adipose tissue lipogenic gene expression and that CHO may stimulate DNL in skeletal muscle, at least in vitro. The role of DNL and its importance in health and disease remain to be further clarified, in particular the putative effect of DNL on the control of energy intake and energy expenditure, as well as the occurrence of DNL in tissues other than hepatocytes and adipocytes (such as myocytes).
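As a hedged restatement of the bookkeeping such equations formalize (the notation is ours, not the review's), the balance splits into exogenous and endogenous terms, and indirect calorimetry resolves only the net lipid flux:

```latex
% Illustrative notation, not the review's own symbols:
% change in fat stores = metabolizable fat intake - fat oxidation
%                        + fat newly synthesized from non-lipid precursors
\Delta F_{\mathrm{stores}} \;=\; F_{\mathrm{intake}} \;-\; F_{\mathrm{ox}} \;+\; \mathrm{DNL}_{\mathrm{total}}
% Indirect calorimetry measures only the net term, which is why it
% tracks net DNL and cannot separate total DNL from concurrent fat oxidation:
F_{\mathrm{ox}}^{\mathrm{net}} \;=\; F_{\mathrm{ox}} \;-\; \mathrm{DNL}_{\mathrm{total}}
```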
Abstract:
In vivo fetal magnetic resonance imaging provides a unique approach for the study of early human brain development [1]. In utero cerebral morphometry could potentially be used as a marker of cerebral maturation and help to distinguish between normal and abnormal development in ambiguous situations. However, this quantitative approach is a major challenge because of the movement of the fetus inside the amniotic cavity, the poor spatial resolution provided by very fast MRI sequences and the partial volume effect. Extensive efforts are made to deal with the reconstruction of high-resolution 3D fetal volumes based on several acquisitions with lower resolution [2,3,4]. Frameworks were developed for the segmentation of specific regions of the fetal brain such as the posterior fossa, brainstem or germinal matrix [5,6], or for the entire brain tissue [7,8], applying the Expectation-Maximization Markov Random Field (EM-MRF) framework. However, many of these previous works focused on the young fetus (i.e., before 24 weeks) and use anatomical atlas priors to segment the different tissues or regions. As most of the gyral development takes place after the 24th week, a comprehensive and clinically meaningful study of the fetal brain should not dismiss the third trimester of gestation. To cope with the rapidly changing appearance of the developing brain, some authors proposed a dynamic atlas [8]. In our opinion, however, this approach faces a risk of circularity: each brain is analyzed/deformed using the template of its biological age, potentially biasing the effective developmental delay. Here, we expand our previous work [9] to propose a prior-free post-processing pipeline that allows a comprehensive set of morphometric measurements devoted to clinical application. Data set & Methods: Prenatal MR imaging was performed with a 1-T system (GE Medical Systems, Milwaukee) using single-shot fast spin echo (ssFSE) sequences (TR 7000 ms, TE 180 ms, FOV 40 x 40 cm, slice thickness 5.4 mm, in-plane spatial resolution 1.09 mm). For each fetus, 6 axial volumes shifted by 1 mm were acquired under maternal sedation (about 1 min per volume). First, each volume is segmented semi-automatically using region-growing algorithms to extract the fetal brain from surrounding maternal tissues. Inhomogeneity intensity correction [10] and linear intensity normalization are then performed. Brain tissues (CSF, GM and WM) are then segmented based on the low-resolution volumes as presented in [9]. A high-resolution image with isotropic voxel size of 1.09 mm is created as proposed in [2], using B-splines for the scattered data interpolation [11]. Basal ganglia segmentation is performed using a level set implementation on the high-resolution volume [12]. The resulting white matter image is then binarized and given as input to the FreeSurfer software (http://surfer.nmr.mgh.harvard.edu) to provide topologically accurate three-dimensional reconstructions of the fetal brain according to the local intensity gradient. References: [1] Guibaud, Prenatal Diagnosis 29(4) (2009). [2] Rousseau, Acad. Rad. 13(9), 2006. [3] Jiang, IEEE TMI 2007. [4] Warfield, IADB, MICCAI 2009. [5] Claude, IEEE Trans. Bio. Eng. 51(4), 2004. [6] Habas, MICCAI 2008. [7] Bertelsen, ISMRM 2009. [8] Habas, Neuroimage 53(2), 2010. [9] Bach Cuadra, IADB, MICCAI 2009. [10] Styner, IEEE TMI 19(3) (2000). [11] Lee, IEEE Trans. Visual. and Comp. Graph. 3(3), 1997. [12] Bach Cuadra, ISMRM 2010.
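As one concrete example of a step in this pipeline, a generic linear intensity normalization across volumes can be sketched as below; this is our assumption about the form of the step, not the authors' code.

```python
import numpy as np

def linear_normalize(volume, reference):
    """Map a volume's intensities linearly so that its mean and standard
    deviation match those of a reference volume.

    One generic reading of the 'linear intensity normalization' step;
    the pipeline's actual implementation may differ.
    """
    volume = np.asarray(volume, dtype=float)
    reference = np.asarray(reference, dtype=float)
    a = reference.std() / volume.std()          # gain
    b = reference.mean() - a * volume.mean()    # offset
    return a * volume + b
```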
Abstract:
A growing body of scientific literature recurrently indicates that crime and forensic intelligence influence how crime scene investigators make decisions in their practice. This study further scrutinises this intelligence-led view of crime scene examination. It analyses results obtained from two questionnaires, with data collected from nine chiefs of Intelligence Units (IUs) and 73 Crime Scene Examiners (CSEs) working in forensic science units (FSUs) in the French-speaking part of Switzerland (six cantonal police agencies). Four salient elements emerged: (1) communication channels between IUs and FSUs actually exist across the police agencies under consideration; (2) most CSEs take into account the crime intelligence disseminated to them; (3) CSEs make a differentiated but significant use of this kind of intelligence in their daily practice; (4) this intelligence probably exerts a deep influence on the most concerned CSEs, especially in the selection of the type of material/trace to detect, collect, analyse and exploit. These results contribute to deciphering the subtle dialectic between crime intelligence and crime scene investigation, and to further expressing the multifaceted role of CSEs beyond their most recognised input to the justice system. Indeed, CSEs appear to be central, but implicit, stakeholders in intelligence-led policing.
Abstract:
Nitric oxide (NO) produced by inducible NO synthase (iNOS, NOS-2) is an important component of the macrophage-mediated immune defense against numerous pathogens. Murine macrophages produce NO after cytokine activation, whereas, under similar conditions, human macrophages produce low levels of NO or none at all. Although human macrophages can express iNOS mRNA and protein on activation, whether they possess the complete machinery necessary for NO synthesis remains controversial. To define the conditions necessary for human monocytes/macrophages to synthesize NO when expressing a functional iNOS, the human monocytic U937 cell line was engineered to synthesize this enzyme following infection with a retroviral expression vector containing human hepatic iNOS (DFGiNOS). Northern blot and Western blot analyses confirmed the expression of iNOS in transfected U937 cells at both the RNA and protein levels. NOS enzymatic activity was demonstrated in cell lysates by the conversion of L-[3H]arginine into L-[3H]citrulline, and NO production by intact cells was measured by nitrite and nitrate accumulation in culture supernatants. When expressing functional iNOS, U937 cells were capable of releasing high levels of NO. NO production was strictly dependent on supplementation of the culture medium with tetrahydrobiopterin (BH4) and was not modified by stimulation of the cells with different cytokines. These observations suggest that (1) human monocytic U937 cells contain all the cofactors necessary for NO synthesis except BH4, and (2) the failure to detect NO in cytokine-stimulated, untransfected U937 cells is due neither to the presence of a NO-scavenging molecule within these cells nor to destabilization of the iNOS protein. DFGiNOS U937 cells represent a valuable human model for studying the role of NO in immunity toward tumors and pathogens.
Abstract:
This paper presents a prototype of an interactive web-GIS tool for risk analysis of natural hazards, in particular floods and landslides, based on open-source geospatial software and technologies. The aim of the tool is to assist experts (risk managers) in analysing the impacts and consequences of a given hazard event in a considered region, providing an essential input to the decision-making process through which responsible authorities and decision makers select risk management strategies. The tool is based on the Boundless (OpenGeo Suite) framework and its client-side environment for prototype development, and it is one of the main modules of a web-based collaborative decision support platform for risk management. Within this platform, users can import the maps and information necessary to analyse areas at risk. Based on the provided information and parameters, loss scenarios (amount of damage and number of fatalities) of a hazard event are generated on the fly and visualized interactively within the web-GIS interface of the platform. The annualized risk is calculated by combining the resultant loss scenarios for different return periods of the hazard event. The application of the developed prototype is demonstrated using a regional data set from the Fella River in northeastern Italy, one of the case study sites of the Marie Curie ITN CHANGES project.
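The combination of loss scenarios across return periods mentioned above is commonly computed by integrating losses over annual exceedance probability. The sketch below uses the trapezoidal rule and invented scenario values; it is one plausible reading of the calculation, not the paper's exact formula.

```python
import numpy as np

def annualized_risk(return_periods, losses):
    """Annualized risk from loss scenarios at several return periods.

    Approximates the area under the loss versus annual-exceedance-
    probability curve (probability = 1 / return period) with the
    trapezoidal rule, a common convention assumed here.
    """
    rp = np.asarray(return_periods, dtype=float)
    loss = np.asarray(losses, dtype=float)
    order = np.argsort(rp)
    prob = 1.0 / rp[order]          # annual exceedance probability
    loss = loss[order]
    # reverse so probability increases along the integration axis
    return np.trapz(loss[::-1], prob[::-1])

# e.g. hypothetical flood losses for 30-, 100- and 300-year events
print(annualized_risk([30, 100, 300], [2.0e6, 8.0e6, 2.0e7]))
```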