913 results for Cumulative Distribution Function
Abstract:
This work was supported by a grant from the UK Economic and Social Research Council (ES/L010437/1).
Abstract:
The dynamical evolution of dislocations in plastically deformed metals is controlled both by deterministic factors arising from applied loads and by stochastic effects due to fluctuations of internal stress. Such stochastic dislocation processes, and the associated spatially inhomogeneous modes, lead to randomness in the observed deformation structure. Previous studies have analyzed the role of randomness in this textural evolution, but none of these models considered the impact of a finite decay time of the stochastic perturbations on the overall dynamics of the system (all previous models assumed instantaneous relaxation, which is unphysical). The present article bridges this knowledge gap by introducing colored noise, in the form of an Ornstein-Uhlenbeck noise, into the analysis of a class of linear and nonlinear Wiener and Ornstein-Uhlenbeck processes onto which these structural dislocation dynamics can be mapped. Based on an analysis of the relevant Fokker-Planck model, our results show that linear Wiener processes remain unaffected by the second time scale in the problem, but all nonlinear processes, of both Wiener and Ornstein-Uhlenbeck type, scale as a function of the noise decay time τ. The results are expected to have ramifications for existing experimental observations and to inspire new numerical and laboratory tests to gain further insight into the competition between deterministic and random effects in modeling plastically deformed samples.
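The finite decay time τ that distinguishes colored noise from white noise can be made concrete with a short numerical sketch. The following is a generic Euler-Maruyama integration of an Ornstein-Uhlenbeck process, not the authors' model; all parameter names and values are illustrative:

```python
import numpy as np

def simulate_ou(tau, sigma, dt=1e-3, n_steps=10_000, x0=0.0, seed=0):
    """Simulate Ornstein-Uhlenbeck (colored) noise by Euler-Maruyama:
        dx = -(x / tau) dt + sigma * sqrt(2 / tau) dW.
    The stationary variance is sigma**2 and the autocorrelation
    decays as exp(-|t| / tau), so tau is the noise decay time."""
    rng = np.random.default_rng(seed)
    dW = rng.normal(0.0, np.sqrt(dt), size=n_steps - 1)  # Wiener increments
    x = np.empty(n_steps)
    x[0] = x0
    coef = sigma * np.sqrt(2.0 / tau)
    for i in range(1, n_steps):
        x[i] = x[i - 1] - (x[i - 1] / tau) * dt + coef * dW[i - 1]
    return x
```

In the white-noise (instantaneous relaxation) limit τ → 0 this process loses its memory, which is the second time scale the abstract refers to.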
Abstract:
Conventional reliability models for parallel systems are not applicable to the analysis of parallel systems with load transfer and sharing. In this short communication, first, the dependent failures of parallel systems are analyzed, and a reliability model of the load-sharing parallel system is presented based on Miner cumulative damage theory and the total probability formula. Second, the parallel system reliability is calculated by Monte Carlo simulation when the component life follows the Weibull distribution. The results show that the proposed model can analyze and evaluate the reliability of parallel systems in the presence of load transfer.
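The Monte Carlo step can be sketched for the simplest case, a two-component load-sharing parallel system. This is a minimal illustration under assumed mechanics (Miner-style linear damage: after the first failure, the survivor consumes its remaining life faster by an assumed acceleration factor), not the paper's model; all parameters are illustrative:

```python
import math
import random

def weibull_life(shape, scale, rng):
    # Inverse-CDF sampling of a Weibull-distributed component life
    return scale * (-math.log(1.0 - rng.random())) ** (1.0 / shape)

def system_survives(t_mission, shape, scale, accel, rng):
    """Two-component load-sharing parallel system. While both components
    work, each carries half the load; after one fails, the load transfers
    and the survivor's remaining life is consumed `accel` times faster
    (a Miner-style linear damage-accumulation assumption)."""
    t1 = weibull_life(shape, scale, rng)
    t2 = weibull_life(shape, scale, rng)
    t_first, t_second = min(t1, t2), max(t1, t2)
    if t_first >= t_mission:
        return True  # neither component failed within the mission
    # load transfer: survivor's remaining life shrinks by factor `accel`
    t_fail = t_first + (t_second - t_first) / accel
    return t_fail >= t_mission

def reliability(t_mission, shape=2.0, scale=1000.0, accel=2.0,
                n=20_000, seed=1):
    """Monte Carlo estimate of mission reliability."""
    rng = random.Random(seed)
    ok = sum(system_survives(t_mission, shape, scale, accel, rng)
             for _ in range(n))
    return ok / n
```

Setting `accel=1.0` recovers the conventional independent-parallel model, so the gap between the two estimates quantifies the effect of load transfer.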
Abstract:
Human use of the oceans is increasingly in conflict with conservation of endangered species. Methods for managing the spatial and temporal placement of industries such as military, fishing, transportation and offshore energy have historically been post hoc; i.e., the time and place of human activity is often already determined before environmental impacts are assessed. In this dissertation, I build robust species distribution models in two case study areas, the US Atlantic (Best et al. 2012) and British Columbia (Best et al. 2015), predicting presence and abundance, respectively, from scientific surveys. These models are then applied to novel decision frameworks for preemptively suggesting optimal placement of human activities in space and time to minimize ecological impacts: siting offshore wind energy development, and routing ships to minimize the risk of striking whales. Both decision frameworks relate the tradeoff between conservation risk and industry profit with synchronized variable and map views as online spatial decision support systems.
For siting offshore wind energy development (OWED) in the U.S. Atlantic (chapter 4), bird density maps are combined across species with weights for OWED sensitivity to collision and displacement, and 10 km² sites are compared against OWED profitability based on average annual wind speed at 90 m hub height and distance to the transmission grid. A spatial decision support system enables toggling between the map and tradeoff plot views by site. A selected site can be inspected for sensitivity to cetaceans throughout the year, so as to identify the months that minimize episodic impacts of pre-operational activities such as seismic airgun surveying and pile driving.
Routing ships to avoid whale strikes (chapter 5) can be similarly viewed as a tradeoff, but is a different problem spatially. A cumulative cost surface is generated from density surface maps and the conservation status of cetaceans, and then applied as a resistance surface to calculate least-cost routes between start and end locations, i.e., ports and entrance locations to study areas. Varying a multiplier on the cost surface enables calculation of multiple routes with different costs to cetacean conservation versus cost to the transportation industry, measured as distance. As in the siting chapter, a spatial decision support system enables toggling between the map and tradeoff plot views of proposed routes. The user can also input arbitrary start and end locations to calculate the tradeoff on the fly.
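The least-cost routing step is a standard shortest-path search over a cost grid. The sketch below is a generic Dijkstra search on a 4-neighbour grid, not the dissertation's implementation; grid values and coordinates are illustrative:

```python
import heapq

def least_cost_route(cost, start, end):
    """Dijkstra least-cost path on a grid of per-cell traversal costs
    (e.g. a cumulative conservation-cost surface used as resistance).
    4-neighbour moves; entering a cell pays that cell's cost.
    Returns (path as list of (row, col), total cost)."""
    rows, cols = len(cost), len(cost[0])
    dist = {start: cost[start[0]][start[1]]}
    prev = {}
    pq = [(dist[start], start)]
    while pq:
        d, (r, c) = heapq.heappop(pq)
        if (r, c) == end:
            break
        if d > dist.get((r, c), float("inf")):
            continue  # stale queue entry
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols:
                nd = d + cost[nr][nc]
                if nd < dist.get((nr, nc), float("inf")):
                    dist[(nr, nc)] = nd
                    prev[(nr, nc)] = (r, c)
                    heapq.heappush(pq, (nd, (nr, nc)))
    # walk predecessors back from the end point
    path, node = [], end
    while node != start:
        path.append(node)
        node = prev[node]
    path.append(start)
    return path[::-1], dist[end]
```

Scaling the conservation component of `cost` by a multiplier and re-running the search is what traces out the conservation-versus-distance tradeoff curve described above.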
Essential to the input of these decision frameworks are distributions of the species. The two preceding chapters comprise species distribution models from two case study areas, U.S. Atlantic (chapter 2) and British Columbia (chapter 3), predicting presence and density, respectively. Although density is preferred to estimate potential biological removal, per Marine Mammal Protection Act requirements in the U.S., all the necessary parameters, especially distance and angle of observation, are less readily available across publicly mined datasets.
In the case of predicting cetacean presence in the U.S. Atlantic (chapter 2), I extracted datasets from the online OBIS-SEAMAP geo-database and integrated scientific surveys conducted by ship (n=36) and aircraft (n=16), weighting a Generalized Additive Model by minutes surveyed within space-time grid cells to harmonize effort between the two survey platforms. For each of 16 cetacean species guilds, I predicted the probability of occurrence from static environmental variables (water depth, distance to shore, distance to the continental shelf break) and time-varying conditions (monthly sea-surface temperature). To generate maps of presence vs. absence, Receiver Operating Characteristic (ROC) curves were used to define the optimal threshold that minimizes false positive and false negative error rates. I integrated model outputs, including tables (species in guilds, input surveys) and plots (fit of environmental variables, ROC curve), into an online spatial decision support system, allowing for easy navigation of models by taxon, region, season, and data provider.
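The thresholding step can be sketched in a few lines. This is a generic illustration of scanning the ROC operating points for the threshold that minimises the sum of false positive and false negative rates (equivalently, maximises Youden's J); it is not the dissertation's code, and the names are illustrative:

```python
import numpy as np

def optimal_threshold(y_true, scores):
    """Choose the presence/absence cutoff on predicted occurrence
    probabilities that maximises Youden's J = TPR - FPR, i.e. the
    point on the ROC curve minimising the sum of false positive
    and false negative error rates."""
    pos = y_true == 1
    neg = ~pos
    best_t, best_j = None, -np.inf
    for t in np.unique(scores):          # each candidate ROC operating point
        pred = scores >= t
        tpr = float(np.mean(pred[pos]))  # sensitivity
        fpr = float(np.mean(pred[neg]))  # 1 - specificity
        if tpr - fpr > best_j:
            best_j, best_t = tpr - fpr, t
    return best_t
```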
For predicting cetacean density within the inner waters of British Columbia (chapter 3), I calculated density from systematic, line-transect marine mammal surveys over multiple years and seasons (summer 2004, 2005, 2008, and spring/autumn 2007) conducted by Raincoast Conservation Foundation. Abundance estimates were calculated using two different methods: Conventional Distance Sampling (CDS) and Density Surface Modelling (DSM). CDS generates a single density estimate for each stratum, whereas DSM explicitly models spatial variation and offers potential for greater precision by incorporating environmental predictors. Although DSM yields a more relevant product for the purposes of marine spatial planning, CDS has proven to be useful in cases where there are fewer observations available for seasonal and inter-annual comparison, particularly for the scarcely observed elephant seal. Abundance estimates are provided on a stratum-specific basis. Steller sea lions and harbour seals are further differentiated by ‘hauled out’ and ‘in water’. This analysis updates previous estimates (Williams & Thomas 2007) by including additional years of effort, providing greater spatial precision with the DSM method over CDS, novel reporting for spring and autumn seasons (rather than summer alone), and providing new abundance estimates for Steller sea lion and northern elephant seal. In addition to providing a baseline of marine mammal abundance and distribution, against which future changes can be compared, this information offers the opportunity to assess the risks posed to marine mammals by existing and emerging threats, such as fisheries bycatch, ship strikes, and increased oil spill and ocean noise issues associated with increases of container ship and oil tanker traffic in British Columbia’s continental shelf waters.
Starting with marine animal observations at specific coordinates and times, I combine these data with environmental data, often satellite-derived, to produce seascape predictions generalizable in space and time. These habitat-based models enable prediction of encounter rates and, in the case of density surface models, abundance, which can then be applied to management scenarios. Specific human activities, OWED and shipping, are then compared within a tradeoff decision support framework with interchangeable map and tradeoff plot views. These products make complex processes transparent, enabling conservation, industry and other stakeholders to explore scenarios towards optimal marine spatial management, fundamental to the tenets of marine spatial planning, ecosystem-based management and dynamic ocean management.
Abstract:
This paper reports the findings from a study of the learning of English intonation by Spanish speakers within the discourse mode of L2 oral presentation. The purpose of this experiment is, first, to compare four prosodic parameters before and after an L2 discourse intonation training programme and, second, to confirm whether subjects, after the aforementioned training, are able to match the form of these four prosodic parameters to the discourse-pragmatic function of dominance and control. The instructions and tasks were designed to create the oral and written corpora, and Brazil's Pronunciation for Advanced Learners of English was adapted to the pedagogical aims of the present study. The learners' pre- and post-tasks were acoustically analysed, and a pre-/post-questionnaire design was applied to interpret the acoustic analysis. Results indicate that most of the subjects acquired a wider choice of the four prosodic parameters, partly due to the prosodically annotated transcripts developed throughout the L2 discourse intonation course. Conversely, qualitative and quantitative data reveal that most subjects failed to match the forms to their appropriate pragmatic functions to express dominance and control in an L2 oral presentation.
Computer-based tools for assessing micro-longitudinal patterns of cognitive function in older adults
Abstract:
Patterns of cognitive change over micro-longitudinal timescales (i.e., ranging from hours to days) are associated with a wide range of age-related health and functional outcomes. However, practical issues with conducting high-frequency assessments make investigations of micro-longitudinal cognition costly and burdensome to run. One way of addressing this is to develop cognitive assessments that can be performed by older adults, in their own homes, without a researcher being present. Here, we address the question of whether reliable and valid cognitive data can be collected over micro-longitudinal timescales using unsupervised cognitive tests. In study 1, 48 older adults completed two touchscreen cognitive tests, on three occasions, in controlled conditions, alongside a battery of standard tests of cognitive functions. In study 2, 40 older adults completed the same two computerized tasks on multiple occasions, over three separate week-long periods, in their own homes, without a researcher present. Here, the tasks were incorporated into a wider touchscreen system (Novel Assessment of Nutrition and Ageing (NANA)) developed to assess multiple domains of health and behavior. Standard tests of cognitive function were also administered prior to participants using the NANA system. Performance on the two "NANA" cognitive tasks showed convergent validity with, and similar levels of reliability to, the standard cognitive battery in both studies. Completion and accuracy rates were also very high. These results show that reliable and valid cognitive data can be collected from older adults using unsupervised computerized tests, thus affording new opportunities for the investigation of cognitive function.
Abstract:
The challenge of detecting a change in the distribution of data is a sequential decision problem that is relevant to many engineering solutions, including quality control and machine and process monitoring. This dissertation develops techniques for exact solution of change-detection problems with discrete time and discrete observations. Change-detection problems are classified as Bayes or minimax based on the availability of information on the change-time distribution. A Bayes optimal solution uses prior information about the distribution of the change time to minimize the expected cost, whereas a minimax optimal solution minimizes the cost under the worst-case change-time distribution. Both types of problems are addressed. The most important result of the dissertation is the development of a polynomial-time algorithm for the solution of important classes of Markov Bayes change-detection problems. Existing techniques for epsilon-exact solution of partially observable Markov decision processes have complexity exponential in the number of observation symbols. A new algorithm, called constellation induction, exploits the concavity and Lipschitz continuity of the value function, and has complexity polynomial in the number of observation symbols. It is shown that change-detection problems with a geometric change-time distribution and identically- and independently-distributed observations before and after the change are solvable in polynomial time. Also, change-detection problems on hidden Markov models with a fixed number of recurrent states are solvable in polynomial time. A detailed implementation and analysis of the constellation-induction algorithm are provided. Exact solution methods are also established for several types of minimax change-detection problems. Finite-horizon problems with arbitrary observation distributions are modeled as extensive-form games and solved using linear programs. 
Infinite-horizon problems with linear penalty for detection delay and identically- and independently-distributed observations can be solved in polynomial time via epsilon-optimal parameterization of a cumulative-sum procedure. Finally, the properties of policies for change-detection problems are described and analyzed. Simple classes of formal languages are shown to be sufficient for epsilon-exact solution of change-detection problems, and methods for finding minimally sized policy representations are described.
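The cumulative-sum procedure mentioned above has a compact standard form. The sketch below is the textbook one-sided CUSUM recursion (Page's test) for detecting an upward mean shift, not the dissertation's ε-optimal parameterization; the drift allowance and threshold are illustrative tuning parameters:

```python
def cusum_alarm(observations, target_mean, drift, threshold):
    """One-sided CUSUM: accumulate positive deviations from target_mean
    (less a per-step drift allowance) and raise an alarm at the first
    index where the statistic exceeds threshold. Returns the alarm
    index, or None if no change is detected."""
    s = 0.0
    for i, x in enumerate(observations):
        # reset-at-zero recursion: s_k = max(0, s_{k-1} + x_k - mu0 - drift)
        s = max(0.0, s + (x - target_mean - drift))
        if s > threshold:
            return i
    return None
```

Raising `threshold` lowers the false-alarm rate at the price of a longer detection delay, which is exactly the delay-versus-false-alarm tradeoff the minimax formulations penalize.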
Abstract:
The composition and condition of membrane lipids, the morphology of erythrocytes, and hemoglobin distribution were explored with the help of laser interference microscopy (LIM) and Raman spectroscopy. It is shown that patients with cardiovascular diseases (CVD) have significant changes in the composition of their phospholipids and the fatty acids of membrane lipids. Furthermore, the microviscosity of the membranes and the morphology of the erythrocytes are altered, causing disordered oxygen transport by hemoglobin. Basic therapy carried out with the use of antiaggregants, statins, antianginals, beta-blockers, and calcium antagonists does not help to recover the morphofunctional properties of erythrocytes. Based on the results, the authors assume that, for the relief of the ischemic crisis and further therapeutic treatment, it is necessary to include, in addition to cardiovascular disease medicines, medication that increases the ability of erythrocytes' hemoglobin to transport oxygen to the tissues. We assume that the use of LIM and Raman spectroscopy is advisable for early diagnosis of changes in the structure and functional state of erythrocytes when cardiovascular diseases develop.
Abstract:
PURPOSE: To evaluate quality of life in Portuguese patients with Systemic Lupus Erythematosus (SLE) and its correlation with disease activity and cumulative damage. METHODS: We included consecutive SLE patients fulfilling the 1997 ACR Classification Criteria for SLE, followed at the Rheumatology Department of the University Hospital of Coimbra, Portugal, at the time of their visit to the outpatient clinic. Quality of life was evaluated using the patient self-assessment questionnaire Medical Outcomes Survey Short Form-36 (SF-36) (validated Portuguese version). The consulting rheumatologist completed the SLE-associated indices for cumulative damage (Systemic Lupus International Collaborating Clinics Damage Index: SLICC/ACR-DI) and disease activity (Systemic Lupus Erythematosus Disease Activity Index: SLEDAI 2000). Correlations between SLEDAI and SLICC scores and the SF-36 were tested with the Spearman coefficient. The significance level was set at 0.05. RESULTS: The study included 133 SLE patients (90.2% female; mean age, 40.7 years; mean disease duration, 8.7 years). Most patients presented low disease activity (mean SLEDAI = 4.23) and limited cumulative damage (mean SLICC = 0.76). Despite that, SF-36 mean scores were below 70% in all eight domains of the index. Physical function domains showed lower scores than mental function domains. QoL in this group of patients was significantly impaired compared with the reference Portuguese population (p < 0.05 in all domains). There was no correlation between clinical activity or cumulative damage and quality of life. CONCLUSION: QoL is significantly compromised in this group of SLE patients, but is not related to disease activity or damage. These findings suggest that disease activity, cumulative damage and QoL are independent outcome measures and should all be used to assess the full impact of disease in SLE patients.
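The Spearman coefficient used here is simply the Pearson correlation computed on ranks, with ties sharing their average rank. A minimal self-contained sketch (generic, with illustrative data, not the study's analysis code):

```python
def avg_ranks(values):
    """1-based ranks; tied values share the mean of their rank positions."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # mean of rank positions i+1 .. j+1
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def spearman(x, y):
    """Spearman rank correlation: Pearson correlation of the rank vectors."""
    rx, ry = avg_ranks(x), avg_ranks(y)
    n = len(rx)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)
```

A coefficient near zero, as reported between SLEDAI/SLICC and SF-36 scores, indicates no monotone association between the paired measures.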
Abstract:
In this study, the conodont multielement apparatus of the Late Devonian (Famennian) Icriodus alternatus is described, reconstructed from clustered finds and isolated elements. This apparatus is markedly different from classical ozarkodinid apparatuses and requires further consideration of its functional morphology. Since bedding-plane assemblages of Icriodus alternatus are as yet unknown, a spatial reconstruction of this apparatus and a feeding mechanism are proposed, based on the oropharyngeal apparatus of recent lampreys. Though the extant representatives of petromyzontoids are not close phylogenetic relatives of the extinct conodonts, there are intriguing analogies in the morphology of the tooth types and their presumed spatial distribution within the oral cavity of both taxa.
Abstract:
This dissertation proposes statistical methods to formulate, estimate and apply complex transportation models. Two main problems are part of the analyses conducted and presented in this dissertation. The first method addresses an econometric problem: the joint estimation of models that contain both discrete and continuous decision variables. The use of ordered models along with a regression is proposed, and their effectiveness is evaluated against unordered models. Procedures to calculate and optimize the log-likelihood functions of both discrete-continuous approaches are derived, and the difficulties associated with the estimation of unordered models are explained. Numerical approximation methods based on the Genz algorithm are implemented in order to solve the multidimensional integral associated with the unordered modeling structure. The problems deriving from the lack of smoothness of the probit model around the maximum of the log-likelihood function, which makes optimization and the calculation of standard deviations very difficult, are carefully analyzed. A methodology to perform out-of-sample validation in the context of a joint model is proposed. Comprehensive numerical experiments have been conducted on both simulated and real data. In particular, the discrete-continuous models are estimated and applied to vehicle ownership and use on data extracted from the 2009 National Household Travel Survey. The second part of this work offers a comprehensive statistical analysis of free-flow speed distributions; the method is applied to data collected on a sample of roads in Italy. A linear mixed model that includes speed quantiles among its predictors is estimated. Results show that there is no road effect in the analysis of free-flow speeds, which is particularly important for model transferability.
A very general framework to predict random effects with few observations and incomplete access to model covariates is formulated and applied to predict the distribution of free-flow speed quantiles. The speed distribution of most road sections is successfully predicted; jack-knife estimates are calculated and used to explain why some sections are poorly predicted. Ultimately, this work contributes to the literature on transportation modeling by proposing econometric model formulations for discrete-continuous variables, more efficient methods for the calculation of multivariate normal probabilities, and random effects models for free-flow speed estimation that take the survey design into account. All methods are rigorously validated on both real and simulated data.
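The jack-knife estimates mentioned above follow a standard leave-one-out recipe; the sketch below is the generic textbook jackknife for bias correction and standard error of an arbitrary statistic, not the dissertation's code:

```python
def jackknife(stat, data):
    """Leave-one-out jackknife: returns (bias-corrected estimate,
    standard error) of statistic `stat` evaluated on list `data`."""
    n = len(data)
    full = stat(data)
    # recompute the statistic with each observation deleted in turn
    loo = [stat(data[:i] + data[i + 1:]) for i in range(n)]
    mean_loo = sum(loo) / n
    bias = (n - 1) * (mean_loo - full)
    se = ((n - 1) / n * sum((v - mean_loo) ** 2 for v in loo)) ** 0.5
    return full - bias, se
```

Sections whose leave-one-out estimates swing widely get large jackknife standard errors, which is one way to flag poorly predicted road sections.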
Abstract:
Trees, hedgerows and woods are the current configuration of the tree network in several ecological regions of the world. In the Trás-os-Montes region, in Northeast Portugal, they are a traditional component of the Terra Fria landscape and occur in several forms: scattered trees, fencerows, small woodlots, and riparian buffer strips, among others. The extensive livestock systems of this region are based on a set of grazing circuits across the landscape. In this practice, flocks interact with these structures, using them for different functions, which in turn influences the itineraries. Our purpose is to focus on the woody features of the landscape, their configuration, abundance and spatial distribution, in order to examine how the grazing systems depend on the occurrence of these formations, and in particular how flock behaviour relates to them. Using spatial data, the investigation compares the tree network within the agricultural matrix to the grazed territory crossed by flocks, and highlights the value of spatial data in interpreting the issue by suggesting different parameters that may influence the circuits. It thus becomes possible to recognize the pressure exerted by the occurrence of woody structures on the grazing circuits. We believe that the role of these woody features in supporting the traditional silvopastoral systems has been strong enough to change their distribution pattern.
Abstract:
This paper reports on continuing research into the modelling of an order picking process within a Crossdocking distribution centre using Simulation Optimisation. The aim of this project is to optimise a discrete-event simulation model and to understand the factors that affect finding its optimal performance. Our initial investigation revealed that the precision of the selected simulation output performance measure and the number of replications required to evaluate the optimisation objective function through simulation influence the ability of the optimisation technique. We experimented with Common Random Numbers in order to improve the precision of our simulation output performance measure, and intended to use the number of replications utilised for this purpose as the initial number of replications for the optimisation of our Crossdocking distribution centre simulation model. Our results demonstrate that we can improve the precision of the selected simulation output performance measure using Common Random Numbers at various levels of replication. Furthermore, after optimising the Crossdocking distribution centre simulation model, we achieve optimal performance using fewer simulation runs for the model that uses Common Random Numbers than for the model that does not.
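The variance-reduction idea behind Common Random Numbers is that two system configurations compared on the same random stream share their noise, so the noise largely cancels in the difference. A toy sketch (a generic illustration with an arbitrary "simulation output", not the paper's crossdocking model):

```python
import random
import statistics

def sample_output(mean, rng, n=100):
    """Toy simulation output: average of n exponential 'service times'."""
    return sum(rng.expovariate(1.0 / mean) for _ in range(n)) / n

def diff_estimates(use_crn, reps=200):
    """Estimate the output difference between two configurations
    (illustrative means 1.0 vs 1.1) over independent replications.
    With CRN, both configurations reuse the same seed per replication,
    so the underlying random draws are identical and their noise
    cancels in the difference."""
    diffs = []
    for r in range(reps):
        rng_a = random.Random(r)
        rng_b = random.Random(r) if use_crn else random.Random(10_000 + r)
        diffs.append(sample_output(1.1, rng_b) - sample_output(1.0, rng_a))
    return diffs
```

Comparing `statistics.variance(diff_estimates(True))` with `statistics.variance(diff_estimates(False))` shows the sharper difference estimate CRN buys at a fixed replication count, which is why fewer runs suffice during optimisation.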