967 results for Capture-recapture Data


Relevance: 30.00%
Publisher:
Abstract:

Clinical optical motion capture allows us to obtain kinematic and kinetic outcome measures that aid clinicians in diagnosing and treating different pathologies affecting healthy gait. The long-term aim for gait centres is subject-specific analyses that can predict, prevent, or reverse the effects of pathologies through gait retraining. To track the body, anatomical segment coordinate systems are commonly created by applying markers to the surface of the skin over specific, manually palpated bony anatomy. The location and placement of these markers is subjective, and precision errors of up to 25 mm have been reported [1]. Additionally, the selection of which anatomical landmarks to use in segment models can result in large angular differences; for example, angular differences in the trunk can reach 53° for the same motion depending on marker placement [2]. These errors can result in erroneous kinematic outcomes that either diminish or increase the apparent effects of a treatment or pathology compared to healthy data. Our goal was to improve the accuracy and precision of optical motion capture outcome measures. This thesis describes two separate studies. In the first study, we aimed to establish an approach that would allow us to independently quantify the error among trunk models; using this approach, we determined whether there was a best model for accurately tracking trunk motion. In the second study, we designed a device to improve precision for test-retest protocols that would also reduce the set-up time for motion capture experiments. Our method of comparing a kinematically derived centre-of-mass velocity to a kinetically derived one successfully quantified the error among trunk models. Our findings indicate that models that use lateral shoulder markers and limit the translational degrees of freedom of the trunk through shared pelvic markers result in the least error for the tasks we studied. We also successfully reduced intra- and inter-operator anatomical marker placement errors using a marker alignment device. The improved accuracy and precision resulting from the methods established in this thesis may lead to increased sensitivity to changes in kinematics, and ultimately to more consistent treatment outcomes.
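As a minimal sketch of the kind of comparison described above, and not the thesis's actual implementation, the snippet below contrasts a centre-of-mass velocity obtained by differentiating mass-weighted segment positions with one obtained by integrating force-plate data; the array shapes, segment masses, sampling interval, vertical axis, and function names are all assumed for illustration.

```python
import numpy as np

def com_velocity_kinematic(segment_positions, segment_masses, dt):
    """Differentiate the mass-weighted mean of segment positions (kinematic estimate).

    segment_positions: array of shape (n_segments, n_frames, 3), metres
    segment_masses:    array of shape (n_segments,), kilograms
    dt:                sampling interval in seconds
    """
    weights = segment_masses / segment_masses.sum()
    com = np.tensordot(weights, segment_positions, axes=1)   # (n_frames, 3)
    return np.gradient(com, dt, axis=0)                      # (n_frames, 3)

def com_velocity_kinetic(ground_reaction_force, body_mass, dt, v0=np.zeros(3)):
    """Integrate Newton's second law, a = (F_grf + F_gravity) / m (kinetic estimate).

    Assumes the third component is vertical and the initial velocity v0 is known.
    """
    gravity = np.array([0.0, 0.0, -9.81]) * body_mass        # gravitational force
    accel = (ground_reaction_force + gravity) / body_mass     # (n_frames, 3)
    return v0 + np.cumsum(accel, axis=0) * dt

def rms_velocity_error(v_kinematic, v_kinetic):
    """Root-mean-square difference between the two velocity estimates."""
    return np.sqrt(np.mean(np.sum((v_kinematic - v_kinetic) ** 2, axis=1)))
```

The RMS difference between the two estimates is one plausible way to score competing trunk models, under the assumption that the force-plate-derived velocity serves as the reference.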

Relevance: 30.00%
Publisher:
Abstract:

We report on the experimental characterisation of laser-driven ion beams using a Thomson Parabola Spectrometer (TPS) equipped with trapezoidally shaped electric plates, as proposed by Gwynne et al. [Rev. Sci. Instrum. 85, 033304 (2014)]. While a pair of extended (30 cm long) electric plates was able to produce a significant increase in the separation between neighbouring ion species at high energies, deploying a trapezoidal design circumvented the spectral clipping at the low-energy end of the ion spectra. The shape of the electric plates was chosen carefully, considering the range of detectable ion energies and species for the given spectrometer configuration. Analytical tracing of the ion parabolas matches the experimental data closely, which suggests a minimal effect of fringe fields on the ions escaping close to the wedged edge of the electrode. The analytical formulae were derived including the relativistic correction required for the high-energy ions characterised with such a spectrometer.
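The analytical formulae themselves are not reproduced in this abstract, so the following is only a generic sketch of relativistic, small-angle ion tracing in a Thomson parabola geometry; the field strengths, plate and drift lengths, and the neglect of fringe fields and of the trapezoidal plate shape are all simplifying assumptions.

```python
import numpy as np

C = 299_792_458.0          # speed of light, m/s
Q_E = 1.602_176_634e-19    # elementary charge, C
AMU = 1.660_539_066e-27    # atomic mass unit, kg

def deflections(kinetic_energy_MeV, charge_state, mass_amu,
                E_field=1e6, B_field=0.6,
                L_E=0.30, D_E=0.10, L_B=0.05, D_B=0.35):
    """Small-angle electric and magnetic deflections (metres) at the detector.

    Relativistic momentum is used; field regions are treated as uniform and
    fringe fields are neglected.  All geometry and field values are placeholders.
    """
    q = charge_state * Q_E
    m = mass_amu * AMU
    T = kinetic_energy_MeV * 1e6 * Q_E          # kinetic energy in joules
    E_tot = T + m * C**2
    p = np.sqrt(E_tot**2 - (m * C**2)**2) / C   # relativistic momentum
    v = p * C**2 / E_tot                        # ion speed

    theta_B = q * B_field * L_B / p             # magnetic deflection angle
    theta_E = q * E_field * L_E / (p * v)       # electric deflection angle
    x = theta_B * (L_B / 2 + D_B)               # magnetic (momentum) axis
    y = theta_E * (L_E / 2 + D_E)               # electric (energy) axis
    return x, y

# Trace a parabola for protons between 5 and 60 MeV:
energies = np.linspace(5, 60, 50)
trace = [deflections(T, charge_state=1, mass_amu=1.007) for T in energies]
```

In the non-relativistic limit the electric deflection scales as the square of the magnetic deflection for a fixed charge-to-mass ratio, which is the parabola traced by each ion species on the detector.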

Relevance: 30.00%
Publisher:
Abstract:

Microturbines are among the most successfully commercialized distributed energy resources, especially when they are used for combined heat and power generation. However, the interrelated thermal and electrical system dynamic behaviors have not been fully investigated. This is technically challenging due to the complex thermo-fluid-mechanical energy conversion processes, which introduce multiple time-scale dynamics and strong nonlinearity into the analysis. To tackle this problem, this paper proposes a simplified model that can predict the coupled thermal and electric output dynamics of microturbines. Considering the time-scale difference of the various dynamic processes occurring within microturbines, the electromechanical subsystem is treated as a fast quasi-linear process, while the thermo-mechanical subsystem is treated as a slow process with high nonlinearity. A three-stage subspace identification method is used to capture the dominant dynamics and predict the electric power output. For the thermo-mechanical process, a radial basis function model trained by particle swarm optimization is employed to handle the strong nonlinear characteristics. Experimental tests on a Capstone C30 microturbine show that the proposed modeling method captures the system dynamics well and produces good predictions of the coupled thermal and electric outputs in various operating modes.
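As an illustration of the radial basis function component mentioned above, and not the identified Capstone C30 model, the sketch below evaluates a Gaussian RBF network mapping an assumed two-dimensional operating-point input to a thermal output; the centres, widths, and weights are placeholders that would in practice be tuned against measurements, for example by particle swarm optimization.

```python
import numpy as np

def rbf_predict(x, centers, widths, weights, bias=0.0):
    """Gaussian radial basis function network: y = bias + sum_i w_i * phi_i(x).

    x:       input vector, shape (n_inputs,)
    centers: shape (n_basis, n_inputs)
    widths:  shape (n_basis,)  -- Gaussian spreads
    weights: shape (n_basis,)  -- output-layer weights
    """
    sq_dist = np.sum((centers - x) ** 2, axis=1)
    phi = np.exp(-sq_dist / (2.0 * widths ** 2))
    return bias + phi @ weights

# Placeholder parameters (in practice fitted to thermal-output measurements,
# e.g. by minimizing prediction error with particle swarm optimization):
centers = np.array([[0.2, 0.3], [0.5, 0.6], [0.8, 0.9]])
widths = np.array([0.3, 0.3, 0.3])
weights = np.array([1.2, -0.4, 0.7])

thermal_output = rbf_predict(np.array([0.45, 0.55]), centers, widths, weights)
```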


Relevance: 30.00%
Publisher:
Abstract:

Advances in communication, navigation, and imaging technologies are expected to fundamentally change the methods currently used to collect data. Electronic data interchange strategies will also minimize data handling and automatically update files at the point of capture. This report summarizes the outcome of using a multi-camera platform as a method to collect roadway inventory data. It defines basic system requirements as expressed by the users who applied these techniques and examines how the application of the technology met those needs. A sign inventory case study was used to determine the advantages of creating and maintaining such a database, which also provides the capability to monitor performance criteria for a Safety Management System. The project found that at least 75 percent of the data elements needed for a sign inventory can be gathered by viewing a high-resolution image.

Relevance: 30.00%
Publisher:
Abstract:

Developing a theoretical framework for pervasive information environments is an enormous goal. This paper aims to provide a small step towards that goal. The following pages report on our initial investigations to devise a framework that will continue to support locative, experiential, and evaluative data from ‘user feedback’ in an increasingly pervasive information environment. We outline this framework by developing a methodology capable of moving from the rapid deployment of software and hardware technologies towards the goal of a realistic immersive experience of pervasive information. We propose various technical solutions and address a range of problems, such as information capture through a novel model of sensing, processing, visualization, and cognition.

Relevance: 30.00%
Publisher:
Abstract:

This dissertation contains four essays that share a common purpose: developing new methodologies to exploit the potential of high-frequency data for the measurement, modeling, and forecasting of financial asset volatility and correlations. The first two chapters provide useful tools for univariate applications, while the last two chapters develop multivariate methodologies. In chapter 1, we introduce a new class of univariate volatility models named FloGARCH models. FloGARCH models provide a parsimonious joint model for low-frequency returns and realized measures, and are sufficiently flexible to capture long memory as well as asymmetries related to leverage effects. We analyze the performance of the models in a realistic numerical study and on a data set composed of 65 equities. Using more than 10 years of high-frequency transactions, we document significant statistical gains related to the FloGARCH models in terms of in-sample fit, out-of-sample fit, and forecasting accuracy compared to classical and Realized GARCH models. In chapter 2, using 12 years of high-frequency transactions for 55 U.S. stocks, we argue that combining low-frequency exogenous economic indicators with high-frequency financial data improves the ability of conditionally heteroskedastic models to forecast the volatility of returns, their full multi-step-ahead conditional distribution, and the multi-period Value-at-Risk. Using a refined version of the Realized LGARCH model that allows for a time-varying intercept and is implemented with realized kernels, we document that nominal corporate profits and term spreads have strong long-run predictive ability and generate accurate risk-measure forecasts over long horizons. The results are based on several loss functions and tests, including the Model Confidence Set. Chapter 3 is joint work with David Veredas. We study the class of disentangled realized estimators for the integrated covariance matrix of Brownian semimartingales with finite-activity jumps. These estimators separate correlations and volatilities. We analyze different combinations of quantile- and median-based realized volatilities, and four estimators of realized correlations with three synchronization schemes. Their finite-sample properties are studied under four data-generating processes, with and without microstructure noise, and under synchronous and asynchronous trading. The main finding is that the pre-averaged version of the disentangled estimators based on Gaussian ranks (for the correlations) and median deviations (for the volatilities) provides a precise, computationally efficient, and easy alternative for measuring integrated covariances from noisy and asynchronous prices. Along these lines, a minimum-variance portfolio application shows the superiority of this disentangled realized estimator across numerous performance metrics. Chapter 4 is co-authored with Niels S. Hansen, Asger Lunde, and Kasper V. Olesen, all affiliated with CREATES at Aarhus University. We propose to use the Realized Beta GARCH model to exploit the potential of high-frequency data in commodity markets. The model produces high-quality forecasts of pairwise correlations between commodities, which can be used to construct a composite covariance matrix. We evaluate the quality of this matrix in a portfolio context and compare it to models used in the industry. We demonstrate significant economic gains in a realistic setting including short-selling constraints and transaction costs.
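The FloGARCH specification is not given in the abstract, so the sketch below only illustrates the general idea of combining daily returns with a realized measure, using a log-linear recursion in the spirit of Realized GARCH; the parameter values are purely illustrative and the measurement equation is omitted.

```python
import numpy as np

def realized_garch_filter(returns, realized_measures,
                          omega=-0.2, beta=0.6, gamma=0.4):
    """Filter conditional variances from returns and a realized measure.

    Log-linear recursion in the spirit of Realized GARCH:
        log h_t = omega + beta * log h_{t-1} + gamma * log x_{t-1}
    where x_t is a realized measure (e.g. realized variance or a realized kernel).
    Parameter values are illustrative, not estimated.
    """
    n = len(returns)
    log_h = np.empty(n)
    log_h[0] = np.log(np.var(returns))          # crude initialization
    for t in range(1, n):
        log_h[t] = omega + beta * log_h[t - 1] + gamma * np.log(realized_measures[t - 1])
    return np.exp(log_h)

def realized_variance(intraday_returns):
    """Daily realized variance: sum of squared intraday returns."""
    return np.sum(np.asarray(intraday_returns) ** 2)
```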

Relevance: 30.00%
Publisher:
Abstract:

Power system engineers face a double challenge: to operate electric power systems within narrow stability and security margins, and to maintain high reliability. There is an acute need to better understand the dynamic nature of power systems in order to be prepared for critical situations as they arise. Innovative measurement tools, such as phasor measurement units, can capture not only the slow variation of voltages and currents but also the underlying oscillations in a power system. Such access to dynamic data provides strong motivation and a useful tool for exploring dynamic-data-driven applications in power systems. To fulfill this goal, this dissertation focuses on three areas: developing accurate dynamic load models and updating variable parameters based on measurement data, applying advanced nonlinear filtering concepts and technologies to real-time identification of power system models, and addressing computational issues by implementing the balanced truncation method. With more realistic system models, timely updated parameters, and consideration of stochastic influences, we can obtain an accurate portrait of the ongoing phenomena in an electrical power system and thereby further improve state estimation, stability analysis, and real-time operation.
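As a generic illustration of measurement-driven parameter updating, rather than the specific nonlinear filters developed in the dissertation, the sketch below tracks a slowly varying load-model parameter with a scalar Kalman filter, treating the parameter as a random walk observed through noisy phasor-derived measurements; the signal names and noise variances are assumed.

```python
import numpy as np

def track_parameter(measurements, regressors, q=1e-4, r=1e-2, theta0=1.0, p0=1.0):
    """Scalar Kalman filter for y_k = H_k * theta_k + noise, theta a random walk.

    measurements: observed quantity (e.g. active-power deviation), shape (n,)
    regressors:   known multiplier H_k (e.g. a voltage-dependence term), shape (n,)
    q, r:         process and measurement noise variances (assumed values)
    """
    theta, p = theta0, p0
    estimates = np.empty(len(measurements))
    for k, (y, h) in enumerate(zip(measurements, regressors)):
        p = p + q                          # predict: random-walk parameter
        s = h * p * h + r                  # innovation variance
        gain = p * h / s                   # Kalman gain
        theta = theta + gain * (y - h * theta)
        p = (1.0 - gain * h) * p
        estimates[k] = theta
    return estimates
```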

Relevance: 30.00%
Publisher:
Abstract:

The mainstay of Big Data is prediction, in that it allows practitioners, researchers, and policy analysts to predict trends based upon the analysis of large and varied sources of data, ranging from changing social and political opinions to patterns in crime and consumer behaviour. Big Data has therefore shifted the criterion of success in science from causal explanation to predictive modelling and simulation. Nineteenth-century science sought to capture phenomena and to show their appearances through causal mechanisms, while twentieth-century science attempted to save the appearances and relinquish causal explanations. Now, twenty-first-century science in the form of Big Data is concerned with the prediction of appearances and nothing more. However, this pulls social science back towards a rule- or law-governed model of science and away from a consideration of the internal nature of rules in relation to various practices. In effect, Big Data offers us no more than a world of surface appearances and, in doing so, erases any context-specific conceptual sensitivity.

Relevance: 30.00%
Publisher:
Abstract:

The present research, undertaken in a mangrove swamp in northeastern Brazil (Mamanguape River Estuary), examined the factors that led to the overwhelming acceptance of the tangle-netting technique by crab harvesters to the detriment of the now illegal tamping technique. These two techniques are the only ones currently used at our study site and in many other areas in Brazil, despite being prohibited by law. Data were collected through direct observations to determine capture efficiency, productivity, daily production, selectivity, and harvesting effort, and through interviews with crab harvesters focusing on their perceptions of the capture techniques, the condition of crab stocks, and the sales price of a dozen crabs. Our results indicated that the two capture techniques did not differ significantly in efficiency or productivity, but daily production rates differed significantly, being greater with tangle-netting. Tangle-netting permits a greater harvesting effort (6 h 34 min) than tamping (4 h 19 min). Tangle-netting is also less selective than tamping, as indicated by the larger number of smaller specimens captured, including females. This results in a lower average sales price for a dozen crabs caught by tangle-netting (US$ 0.95) compared to tamping (US$ 1.02). The greater daily production of crab harvesters using tangle-netting nevertheless increases their net gain, explaining their preference for this method. Given that tangle-netting results in greater harvesting pressure but lower selectivity than tamping, it may be less sustainable. All of the interviewed crab harvesters with more than 20 years of experience (n = 34) stated that they perceived stocks of U. cordatus, as well as average crab sizes, to have declined over the last 20 years. It is now important to examine the structure of the local U. cordatus population and to assess its fishery in order to evaluate whether the illegal but prominent tangle-netting and tamping capture techniques are sustainable. We further suggest improving the dialogue between decision makers and fishermen, which barely exists to date, to initiate a discussion about possible ways of resolving the fishermen's current situation of illegality. This will be key to achieving effective, sustainable co-management of this important mangrove forest resource.

Relevance: 30.00%
Publisher:
Abstract:

Cancer and cardiovascular diseases are the leading causes of death worldwide. Caused by systemic genetic and molecular disruptions in cells, these disorders are the manifestation of profound disturbances of normal cellular homeostasis. People suffering from, or at high risk for, these disorders need early diagnosis and personalized therapeutic intervention. Successful implementation of such clinical measures can significantly improve global health. However, the development of effective therapies is hindered by the challenges of identifying the genetic and molecular determinants of disease onset; where therapies already exist, the main challenge is to identify the molecular determinants that drive resistance to them. Owing to progress in sequencing technologies, access to large genome-wide biological data now extends far beyond a few experimental labs to the global research community. This unprecedented availability of data has revolutionized the capabilities of computational researchers, enabling them to collaboratively address long-standing problems from many different perspectives. This thesis tackles these two public-health challenges using data-driven approaches. Numerous association studies have been proposed to identify genomic variants that determine disease; however, their clinical utility remains limited by their inability to distinguish causal variants from merely associated ones. In this thesis, we first propose a simple scheme that improves association studies in a supervised fashion and demonstrate its applicability in identifying genomic regulatory variants associated with hypertension. Next, we propose a coupled Bayesian regression approach, eQTeL, which leverages epigenetic data to estimate regulatory and gene-interaction potential and identifies combinations of regulatory genomic variants that explain gene expression variance. On human heart data, eQTeL not only explains a significantly greater proportion of expression variance in samples but also predicts gene expression more accurately than other methods. Using simulations, we demonstrate that eQTeL accurately detects causal regulatory SNPs, particularly those with small effect sizes. Using various functional data, we show that SNPs detected by eQTeL are enriched for allele-specific protein binding and histone modifications, potentially disrupt the binding of core cardiac transcription factors, and are spatially proximal to their targets. eQTeL SNPs capture a substantial proportion of the genetic determinants of expression variance, and we estimate that 58% of these SNPs are putatively causal. The challenge of identifying the molecular determinants of cancer resistance has so far been addressed only through labor-intensive and costly experimental studies, which are infeasible for experimental drugs. Here we take a fundamentally different, data-driven approach to understand the evolving landscape of emerging resistance. We introduce a novel class of genetic interactions in cancer termed synthetic rescues (SRs), which denote a functional interaction between two genes in which a change in the activity of one vulnerable gene (which may be a target of a cancer drug) is lethal, but subsequently altered activity of its partner rescuer gene restores cell viability. We then describe a comprehensive computational framework, termed INCISOR, for identifying the SRs underlying cancer resistance. Applying INCISOR to mine The Cancer Genome Atlas (TCGA), a large collection of cancer patient data, we identified the first pan-cancer SR networks, composed of interactions common to many cancer types. We experimentally test and validate a subset of these interactions involving the master regulator gene mTOR. We find that rescuer genes become increasingly activated as breast cancer progresses, testifying to pervasive ongoing rescue processes. We show that SRs can be used to successfully predict patients' survival and response to the majority of current cancer drugs and, importantly, to predict the emergence of drug resistance from the initial tumor biopsy. Our analysis suggests a potential new strategy for enhancing the effectiveness of existing cancer therapies by targeting their rescuer genes to counteract resistance. The thesis provides statistical frameworks that can harness ever-increasing high-throughput genomic data to address challenges in determining the molecular underpinnings of hypertension, cardiovascular disease, and cancer resistance. We discover novel mechanistic insights that will advance progress in early disease prevention and personalized therapeutics. Our analyses shed light on the fundamental biology of gene regulation and interaction, and open up exciting avenues for translational applications in risk prediction and therapeutics.
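eQTeL and INCISOR are not specified in enough detail here to reproduce, so the sketch below only illustrates the underlying idea of explaining expression variance with a sparse set of candidate regulatory variants, using an off-the-shelf Lasso as a stand-in for the coupled Bayesian regression and omitting the epigenetic priors; the simulated genotypes, effect sizes, and regularization strength are all assumed.

```python
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)

# Simulated genotypes for 500 individuals at 200 candidate regulatory SNPs
# (coded 0/1/2 copies of the alternate allele) and one gene's expression.
genotypes = rng.integers(0, 3, size=(500, 200)).astype(float)
causal_effects = np.zeros(200)
causal_effects[[5, 42, 117]] = [0.8, -0.5, 0.6]            # three simulated causal SNPs
expression = genotypes @ causal_effects + rng.normal(scale=1.0, size=500)

# Sparse regression selects a small set of SNPs explaining expression variance.
model = Lasso(alpha=0.05).fit(genotypes, expression)
selected = np.flatnonzero(model.coef_)
explained = model.score(genotypes, expression)             # R^2 of the fit
print(f"selected SNPs: {selected}, variance explained: {explained:.2f}")
```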

Relevance: 30.00%
Publisher:
Abstract:

Background: increasing numbers of patients are surviving critical illness, but survival may be associated with a constellation of physical and psychological sequelae that can cause ongoing disability and reduced health-related quality of life. Limited evidence currently exists to guide the optimum structure, timing, and content of rehabilitation programmes. There is a need both to develop and to evaluate interventions to support and expedite recovery during the post-ICU discharge period. This paper describes the construct development for a complex rehabilitation intervention intended to promote physical recovery following critical illness. The intervention is currently being evaluated in a randomised trial (ISRCTN09412438; funder: Chief Scientist Office, Scotland). Methods: the intervention was developed using the Medical Research Council (MRC) framework for developing complex healthcare interventions. We ensured representation from a wide variety of stakeholders, including content experts from multiple specialties, methodologists, and patient representatives. The intervention construct was initially based on literature review, local observational and audit work, qualitative studies with ICU survivors, and brainstorming activities. Iterative refinement was aided by the publication of a National Institute for Health and Care Excellence guideline (No. 83), publicly available patient stories (Healthtalkonline), a stakeholder event in collaboration with the James Lind Alliance, and local piloting. Modelling and further work involved a feasibility trial and the development of a novel generic rehabilitation assistant (GRA) role. Several rounds of external peer review during successive funding applications also contributed to development. Results: the final construct for the complex intervention involved a dedicated GRA trained to pre-defined competencies across multiple rehabilitation domains (physiotherapy, dietetics, occupational therapy, and speech/language therapy), with specific training in post-critical illness issues. The intervention ran from ICU discharge to 3 months post-discharge, including inpatient and post-hospital discharge elements. Clear strategies to provide information to patients and families were included. A detailed taxonomy was developed to define and describe the processes undertaken and to capture them during the trial. The detailed process-measure description, together with a range of patient, health-service, and economic outcomes, was successfully mapped onto the modified CONSORT recommendations for reporting non-pharmacologic trial interventions. Conclusions: the MRC complex intervention framework was an effective guide to developing a novel post-ICU rehabilitation intervention. Combining a clearly defined new healthcare role with a detailed taxonomy of process and activity enabled the intervention to be clearly described for the purposes of trial delivery and reporting. These data will be useful when interpreting the results of the randomised trial, will increase internal and external trial validity, and will help others implement the intervention if it proves clinically and cost effective.

Relevance: 30.00%
Publisher:
Abstract:

Datacenters have emerged as the dominant form of computing infrastructure over the last two decades. The tremendous increase in data-analysis requirements has led to a proportional increase in power consumption, and datacenters are now one of the fastest-growing electricity consumers in the United States. Another rising concern is the loss of throughput due to network congestion: scheduling models that do not explicitly account for data placement may transfer large amounts of data over the network, causing unacceptable delays. In this dissertation, we study different scheduling models that are inspired by the dual objectives of minimizing energy costs and network congestion in a datacenter. As datacenters are equipped to handle peak workloads, the average server utilization in most datacenters is very low. As a result, one can achieve huge energy savings by selectively shutting down machines when demand is low. We introduce the network-aware machine activation problem of finding a schedule that simultaneously minimizes the number of machines required and the congestion incurred in the network. Our model significantly generalizes well-studied combinatorial optimization problems such as hard-capacitated hypergraph covering and is thus strongly NP-hard; we therefore focus on finding good approximation algorithms. Data-parallel computation frameworks such as MapReduce have popularized the design of applications that require a large amount of communication between different machines, and efficient scheduling of these communication demands is essential to guarantee efficient execution of the different applications. In the second part of the thesis, we study the approximability of the co-flow scheduling problem, recently introduced to capture these application-level demands. Finally, we also study the question, "In what order should one process jobs?" Often, precedence constraints specify a partial order over the set of jobs, and the objective is to find suitable schedules that satisfy the partial order. However, in the presence of hard deadline constraints, it may be impossible to find a schedule that satisfies all precedence constraints. In this thesis we formalize different variants of job scheduling with soft precedence constraints and conduct the first systematic study of these problems.
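The dissertation's approximation algorithms are not described in this abstract; the toy sketch below only conveys the flavour of the network-aware machine activation trade-off, using a greedy heuristic that balances covered demand against an assumed per-machine link load. The instance data, scoring rule, and congestion weight are illustrative.

```python
def activate_machines(job_demand, machines, congestion_weight=0.5):
    """Greedy heuristic: activate machines until total capacity covers demand,
    preferring machines with high usable capacity and low estimated link load.

    machines: list of (name, capacity, link_load) tuples -- all values assumed.
    Returns the list of activated machine names.
    """
    remaining = job_demand
    activated = []
    candidates = list(machines)
    while remaining > 0 and candidates:
        # Score = usable capacity discounted by the congestion the machine adds.
        def score(m):
            _, capacity, link_load = m
            return min(capacity, remaining) - congestion_weight * link_load
        best = max(candidates, key=score)
        candidates.remove(best)
        name, capacity, _ = best
        activated.append(name)
        remaining -= capacity
    return activated

# Toy instance: 10 units of demand, four machines with (capacity, link load).
machines = [("m1", 4, 1.0), ("m2", 3, 0.2), ("m3", 5, 2.5), ("m4", 6, 0.8)]
print(activate_machines(10, machines))
```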

Relevance: 30.00%
Publisher:
Abstract:

Over recent years, technological improvements have enabled internet users to retrieve and analyze data regarding Internet searches, and such data have been used in several fields of study. Some authors have used search engine query data to forecast economic variables, to detect influenza outbreak areas, or to demonstrate that it is possible to capture certain patterns in stock market indexes. In this paper, an investment strategy is presented using Google Trends' weekly query data for the constituents of major global stock market indexes. The results suggest that it is indeed possible to achieve higher Info Sharpe ratios, especially for the major European stock market indexes, in comparison with a buy-and-hold strategy for the period considered.
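The paper's exact trading rule is not described in this abstract, so the following is only a hedged sketch of a common query-volume strategy of this kind; the lookback window, the short-on-rising-attention rule, and the data series names are assumptions.

```python
import numpy as np
import pandas as pd

def trend_strategy_returns(weekly_queries, weekly_index_returns, lookback=3):
    """Toy Google Trends rule: go short the index in the following week when
    query volume rises above its recent moving average, long otherwise.

    weekly_queries:       pd.Series of search-volume values (assumed data)
    weekly_index_returns: pd.Series of index returns aligned to the same weeks
    """
    signal = np.where(weekly_queries > weekly_queries.rolling(lookback).mean(), -1.0, 1.0)
    # Trade in the week after the signal is observed.
    position = pd.Series(signal, index=weekly_queries.index).shift(1).fillna(0.0)
    return position * weekly_index_returns

def annualized_sharpe(weekly_returns, periods_per_year=52):
    """Annualized Sharpe ratio of a weekly return series (risk-free rate ignored)."""
    r = weekly_returns.dropna()
    return np.sqrt(periods_per_year) * r.mean() / r.std()
```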