970 results for optimal machining parameters
Abstract:
Designing an efficient sampling strategy is of crucial importance for habitat suitability modelling. This paper compares four such strategies, namely 'random', 'regular', 'proportional-stratified' and 'equal-stratified', to investigate (1) how they affect prediction accuracy and (2) how sensitive they are to sample size. In order to compare them, a virtual species approach (Ecol. Model. 145 (2001) 111) in a real landscape, based on reliable data, was chosen. The distribution of the virtual species was sampled 300 times using each of the four strategies at four sample sizes. The sampled data were then fed into a GLM to make two types of prediction: (1) habitat suitability and (2) presence/absence. Comparing the predictions to the known distribution of the virtual species allows model accuracy to be assessed. Habitat suitability predictions were assessed by Pearson's correlation coefficient and presence/absence predictions by Cohen's kappa agreement coefficient. The results show the 'regular' and 'equal-stratified' sampling strategies to be the most accurate and most robust. We propose the following characteristics to improve sample design: (1) increase sample size, (2) prefer systematic to random sampling and (3) include environmental information in the design.
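As a hedged illustration of the virtual-species workflow summarised above (sample a landscape whose true suitability is known, fit a GLM, and score predictions with Pearson's r and Cohen's kappa), a minimal Python sketch follows. The landscape, the response curve and the simplified one-dimensional 'regular' sampling are invented for illustration; this is not the study's data or code.

```python
# Minimal sketch of the evaluation loop: sample a virtual species whose true
# suitability is known, fit a logistic GLM, and score both prediction types.
# The landscape and response curve below are illustrative assumptions.
import numpy as np
from scipy.stats import pearsonr
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import cohen_kappa_score

rng = np.random.default_rng(0)

env = rng.uniform(0, 1, size=10_000)                  # environmental predictor per cell
true_suitability = 1 / (1 + np.exp(-(6 * env - 3)))   # known habitat suitability
presence = rng.binomial(1, true_suitability)          # virtual species occurrences

def sample_cells(strategy, n):
    """Pick cells at random or on a regular (systematic) grid; illustrative only."""
    if strategy == "random":
        return rng.choice(env.size, size=n, replace=False)
    return np.linspace(0, env.size - 1, n).astype(int)

for strategy in ("random", "regular"):
    idx = sample_cells(strategy, n=200)
    glm = LogisticRegression().fit(env[idx, None], presence[idx])
    suit_hat = glm.predict_proba(env[:, None])[:, 1]   # habitat suitability prediction
    pa_hat = (suit_hat > 0.5).astype(int)              # presence/absence prediction
    r, _ = pearsonr(suit_hat, true_suitability)
    kappa = cohen_kappa_score(presence, pa_hat)
    print(f"{strategy:8s}  Pearson r = {r:.3f}   Cohen's kappa = {kappa:.3f}")
```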
Abstract:
In this thesis we study several questions related to transaction data measured at an individual level. The questions are addressed in the three essays that constitute this thesis. In the first essay we use tick-by-tick data to estimate non-parametrically the jump process of 37 large stocks traded on the Paris Stock Exchange, and of the CAC 40 index. We separate total daily returns into three components (continuous trading, trading jumps, and overnight) and characterize each of them. We estimate, at the individual and index levels, the contribution of each return component to total daily variability. For the index, the contribution of jumps is smaller and is compensated by the larger contribution of overnight returns. We test formally that individual stocks jump more frequently than the index, and that they do not respond independently to the arrival of news. Finally, we find that daily jumps are larger when their arrival rates are larger. At the contemporaneous level there is a strong negative correlation between jump frequency and trading activity measures. The second essay studies the general properties of the trade- and volume-duration processes for two stocks traded on the Paris Stock Exchange, one very illiquid and one relatively liquid. We estimate an autoregressive gamma (ARG) process, introduced by Gouriéroux and Jasiak, whose conditional distribution belongs to the family of non-central gamma distributions (up to a scale factor). We also evaluate the ability of the model to fit the data, using the Diebold, Gunther and Tay (1998) test and the model's capacity to reproduce the moments, the empirical serial correlation function and the partial serial correlation function of the observed data. We establish that the model describes the trade duration process of the illiquid stock correctly, but has difficulty fitting the trade duration process of the liquid stock, which exhibits long-memory characteristics. When the model is fitted to volume durations, it fits the data successfully. In the third essay we study the economic relevance of optimal liquidation strategies by calibrating a recent and realistic microstructure model with data from the Paris Stock Exchange. We distinguish between parameters that are constant throughout the day and time-varying ones. An optimization problem incorporating this realistic microstructure model is presented and solved. Our model endogenizes the number of trades required before the position is liquidated. A comparative statics exercise demonstrates the realism of our model. We find that a sell decision taken in the morning will be liquidated by the early afternoon. If price impacts increase over the day, the liquidation will take place more rapidly.
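For the duration model used in the second essay, a minimal, hedged sketch of simulating an ARG(1) (autoregressive gamma) process is shown below, using the standard Poisson-mixture representation of the non-central gamma conditional distribution. The parameter values are invented and the sketch is not the thesis's estimation code.

```python
# Sketch of an ARG(1) path via its Poisson mixture: given x_{t-1}, draw
# Z ~ Poisson(beta * x_{t-1}) and then x_t ~ c * Gamma(delta + Z).
# The implied first-order autocorrelation is rho = beta * c.
import numpy as np

rng = np.random.default_rng(1)

def simulate_arg1(n, delta, beta, c, x0=1.0):
    x = np.empty(n)
    x[0] = x0
    for t in range(1, n):
        z = rng.poisson(beta * x[t - 1])      # mixing variable
        x[t] = c * rng.gamma(delta + z)       # conditional gamma draw (scale folded into c)
    return x

delta, beta, c = 1.2, 4.0, 0.2                # illustrative parameters, rho = 0.8
durations = simulate_arg1(5_000, delta, beta, c)
print("mean duration:", durations.mean())
print("lag-1 autocorrelation:", np.corrcoef(durations[:-1], durations[1:])[0, 1])
```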
Abstract:
The research performed a sustainability assessment of supply chains of the anchoveta (Engraulis ringens) in Peru. The corresponding fisheries land 6.5 million t per year, of which <2% is rendered into products for direct human consumption (DHC) and 98% is reduced into feed ingredients (fishmeal and fish oil, FMFO) for export. Several industries compete for the anchoveta resources, generating local and global impacts. The need to understand these dynamics, in order to derive sustainability-improving management and policy recommendations, motivated the development of a sustainability assessment framework: 1) characterisation and modelling of the systems under study (with Life Cycle Assessment and other tools), including local aquaculture; 2) calculation of sustainability indicators (i.e. energy efficiency, nutritional value, socio-economic performance); and 3) sustainability comparison of supply chains, and definition and comparison of alternative exploitation scenarios. Future exploitation scenarios were defined by combining an ecosystem model and a material flow model: continuation of the status quo (Scenario 1), a shift towards an increased proportion of DHC production (Scenario 2), and a radical reduction of the anchoveta harvest in order for other fish stocks to recover and be exploited for DHC (Scenario 3). Scenario 2 was identified as the most sustainable. Management and policy recommendations include: improving controls for compliance with management measures, sanitary conditions for DHC, and landing infrastructure for small- and medium-scale (SMS) fisheries; developing a national refrigerated distribution chain; and assigning flexible tolerances for discards from different DHC processes.
Abstract:
Over the last century, numerous techniques have been developed to analyze the movement of humans while walking and running. The combined use of kinematic and kinetic methods, mainly based on high-speed video analysis and force plates, has permitted a comprehensive description of the locomotion process in terms of energetics and biomechanics. While the different phases of a single gait cycle are well understood, there is increasing interest in how the neuro-motor system controls gait from stride to stride. Indeed, it has been observed that neurodegenerative diseases and aging can impact gait stability and the steadiness of gait parameters. From both clinical and fundamental research perspectives, there is therefore a need to develop techniques that accurately track gait parameters stride by stride over long periods with minimal constraints on patients. In this context, high-accuracy satellite positioning can provide an alternative tool to monitor outdoor walking. Indeed, high-end GPS receivers provide centimeter-accuracy positioning at 5-20 Hz sampling rates: this allows the stride-by-stride assessment of a number of basic gait parameters--such as walking speed, step length and step frequency--that can be tracked over several thousand consecutive strides in free-living conditions. Furthermore, long-range correlations and fractal-like patterns were observed in these time series. Compared with other classical methods, GPS seems to be a promising technology in the field of gait variability analysis. However, its relatively high complexity and cost--combined with a usability that requires further improvement--remain obstacles to the full development of GPS technology in human applications.
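As a rough, hedged sketch of how such basic gait parameters could be derived from a high-rate GPS track (not the authors' processing chain), one could estimate instantaneous speed from position differences and detect steps as oscillations of that speed signal. The sampling rate, the peak-detection settings and the relation step length = speed / step frequency are illustrative assumptions.

```python
# Illustrative extraction of walking speed, step frequency and step length from a
# 2-D GPS position track sampled at fs Hz.  Peak detection on the speed signal is a
# simple heuristic stand-in for a proper step-detection algorithm.
import numpy as np
from scipy.signal import find_peaks

def gait_parameters(east, north, fs=10.0):
    """east/north positions in metres; returns mean speed, step frequencies, step lengths."""
    speed = np.hypot(np.diff(east), np.diff(north)) * fs   # instantaneous speed [m/s]
    peaks, _ = find_peaks(speed, distance=int(0.3 * fs))   # at least 0.3 s between steps
    step_durations = np.diff(peaks) / fs                   # [s]
    step_freq = 1.0 / step_durations                       # [Hz]
    step_length = speed.mean() / step_freq                 # [m], from v = f * L
    return speed.mean(), step_freq, step_length
```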
Abstract:
With increased activity and reduced financial and human resources, there is a need for automation in clinical bacteriology. Initial processing of clinical samples includes repetitive and tedious steps. These tasks are suitable for automation, and several instruments are now available on the market, including the WASP (Copan), Previ-Isola (BioMerieux), Innova (Becton-Dickinson) and Inoqula (KIESTRA) systems. These new instruments allow efficient and accurate inoculation of samples, covering four main steps: (i) selecting the appropriate Petri dish; (ii) inoculating the sample; (iii) spreading the inoculum on agar plates to obtain, upon incubation, well-separated bacterial colonies; and (iv) accurately labelling and sorting each inoculated medium. The challenge for clinical bacteriologists is to determine which automated system is ideal for their own laboratory. Indeed, different solutions will be preferred according to the number and variety of samples, and to the types of sample that will be processed with the automated system. The final choice is difficult, because audits proposed by manufacturers risk being biased towards their own company's solution, and because these automated systems may not be easily tested on site prior to the final decision, owing to the complexity of computer connections between the laboratory information system and the instrument. This article therefore summarizes the main parameters that need to be taken into account when choosing the optimal system, and provides some clues to help clinical bacteriologists make their choice.
Abstract:
We study optimal public health care rationing and private sector price responses. Consumers differ in their wealth and illness severity (defined as treatment cost). Due to a limited budget, some consumers must be rationed. Rationed consumers may purchase from a monopolistic private market. We consider two information regimes. In the first, the public supplier rations consumers according to their wealth information (means testing). In equilibrium, the public supplier must ration both rich and poor consumers. Rationing some poor consumers induces a price reduction in the private market. In the second information regime, the public supplier rations consumers according to their wealth and cost information. In equilibrium, consumers are allocated the good if and only if their costs are below a threshold (cost effectiveness). Rationing based on cost results in higher equilibrium consumer surplus than rationing based on wealth.
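A loose numerical illustration of the cost-based regime (not the paper's equilibrium model, which also includes the private market's price response): with a fixed public budget, treating consumers in order of increasing treatment cost until the budget is exhausted implicitly defines a cost threshold below which the good is allocated. All numbers below are invented.

```python
# Toy illustration of cost-effectiveness rationing: allocate the public good to the
# cheapest-to-treat consumers until the budget binds; the largest treated cost is the
# implied threshold.  Costs and budget are made-up numbers, not the paper's data.
import numpy as np

rng = np.random.default_rng(2)
costs = rng.uniform(100, 5_000, size=1_000)   # treatment cost per consumer
budget = 1_000_000.0                          # public health care budget

order = np.argsort(costs)                     # cheapest consumers first
treated = order[np.cumsum(costs[order]) <= budget]
threshold = costs[treated].max()              # cost threshold implied by the budget

print(f"treated {treated.size} of {costs.size} consumers; cost threshold ~= {threshold:.0f}")
```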
Abstract:
BACKGROUND: Colonoscopy is generally performed with the patient sedated and receiving analgesics. However, the benefit of the most frequently used combination, intravenous midazolam and pethidine, for patient tolerance and pain, and its cardiorespiratory risk, have not been fully defined. METHODS: In this double-blind prospective study, 150 outpatients undergoing routine colonoscopy were randomly assigned to receive (1) low-dose midazolam (35 micrograms/kg) and pethidine (700 micrograms/kg in 48 patients, 500 micrograms/kg in 102 patients), (2) midazolam and placebo pethidine, or (3) pethidine and placebo midazolam. RESULTS: Tolerance (visual analog scale, 0 to 100 points: 0 = excellent; 100 = unbearable) did not improve significantly more in group 1 than in group 2 (7 points; 95% confidence interval [-2 to 17]) or group 3 (2 points; 95% confidence interval [-7 to 12]). Similarly, pain was not significantly improved in group 1 compared with the other groups. Male gender (p < 0.001) and shorter procedure duration (p = 0.004), but not amnesia, were associated with better patient tolerance and less pain. Patient satisfaction was similar in all groups. Oxygen desaturation and hypotension occurred in 33% and 11% of patients, respectively, with similar frequency in all three groups. CONCLUSIONS: In this study, the combination of low-dose midazolam and pethidine does not improve patient tolerance or lessen pain during colonoscopy compared with either drug given alone. With low-dose midazolam, oxygen desaturation and hypotension do not occur more often when both drugs are combined. For the individual patient, sedation and analgesia should be based on the endoscopist's clinical judgement.
Abstract:
An autoregulation-oriented strategy has been proposed to guide neurocritical therapy toward the optimal cerebral perfusion pressure (CPPOPT). The influence of ventilation changes is, however, unclear. We sought to find out whether short-term moderate hypocapnia (HC) shifts the CPPOPT or affects its detection. Thirty patients with traumatic brain injury (TBI), who required sedation and mechanical ventilation, were studied during 20 min of normocapnia (5.1±0.4 kPa) and 30 min of moderate HC (4.4±3.0 kPa). Monitoring included bilateral transcranial Doppler of the middle cerebral arteries (MCA), invasive arterial blood pressure (ABP), and intracranial pressure (ICP). The Mx autoregulatory index provided a measure of the CPP responsiveness of MCA flow velocity. CPPOPT was assessed as the CPP at which autoregulation (Mx) was working with maximal efficiency. During normocapnia, CPPOPT (left: 80.65±6.18; right: 79.11±5.84 mm Hg) was detectable in 12 of 30 patients. Moderate HC did not shift this CPPOPT but enabled its detection in another 17 patients (CPPOPT left: 83.94±14.82; right: 85.28±14.73 mm Hg). The detection of CPPOPT was achieved through a significantly improved Mx autoregulatory index and an increase in mean CPP. It appears that short-term moderate HC facilitated the detection of an optimal CPP, and may therefore usefully support CPP-guided therapy in patients with TBI.
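A hedged sketch of the Mx / CPPOPT computation described above: Mx is taken as the moving correlation between CPP and MCA flow velocity over short epochs, Mx values are binned by CPP, and CPPOPT is read off as the CPP bin with the best (lowest) average Mx. The window length, the bin width, and using a plain argmin rather than a fitted U-shaped curve are simplifying assumptions, not the study's exact procedure.

```python
# Sketch of computing the Mx autoregulatory index and locating CPPopt.
# cpp and fv are simultaneously sampled cerebral perfusion pressure and
# MCA flow velocity signals; window and bin settings are illustrative.
import numpy as np

def mx_index(cpp, fv, win=300):
    """Correlation between CPP and flow velocity over consecutive non-overlapping windows."""
    mean_cpp, mx = [], []
    for start in range(0, len(cpp) - win + 1, win):
        c, v = cpp[start:start + win], fv[start:start + win]
        mean_cpp.append(c.mean())
        mx.append(np.corrcoef(c, v)[0, 1])
    return np.array(mean_cpp), np.array(mx)

def cpp_opt(mean_cpp, mx, bin_width=5.0):
    """CPP bin (mmHg) at which autoregulation is most efficient, i.e. Mx is lowest."""
    bins = np.floor(mean_cpp / bin_width) * bin_width
    centres = np.unique(bins)
    mean_mx = np.array([mx[bins == b].mean() for b in centres])
    return centres[np.argmin(mean_mx)] + bin_width / 2
```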
Abstract:
Development of the mathematical models needed to optimally control the microgrid installed at the laboratories of the Institut de Recerca en Energia de Catalunya. The algorithms will first be implemented to simulate the microgrid's behaviour, and will then be programmed directly on the microgrid components to verify their correct operation.
Abstract:
OBJECTIVE: To evaluate the power of various parameters of the vestibulo-ocular reflex (VOR) in detecting unilateral peripheral vestibular dysfunction and in characterizing certain inner ear pathologies. STUDY DESIGN: Prospective study of consecutive ambulatory patients presenting with acute onset of peripheral vertigo and spontaneous nystagmus. SETTING: Tertiary referral center. PATIENTS: Seventy-four patients (40 females, 34 males) and 22 normal subjects (11 females, 11 males) were included in the study. Patients were classified into three main diagnoses: vestibular neuritis (n = 40), viral labyrinthitis (n = 22), and Meniere's disease (n = 12). METHODS: VOR function was evaluated by standard caloric and impulse rotary (velocity step) tests. A mathematical model of vestibular function was used to characterize the VOR response to rotational stimulation. The diagnostic value of the different VOR parameters was assessed by uni- and multivariable logistic regression. RESULTS: In univariable analysis, caloric asymmetry emerged as the most powerful VOR parameter for identifying a unilateral vestibular deficit, with a boundary limit set at 20%. In multivariable analysis, the combination of caloric asymmetry and rotational time constant asymmetry significantly improved the discriminatory power over caloric testing alone (p<0.0001) and produced a detection score with a correct classification rate of 92.4%. In discriminating labyrinthine diseases, a different combination of VOR parameters was obtained for each diagnosis (p<0.003), supporting the view that VOR characteristics differ between the three inner ear disorders. However, the clinical usefulness of these characteristics in separating the pathologies was limited. CONCLUSION: We propose a powerful logistic model combining the indices of caloric and time constant asymmetry to detect a peripheral vestibular loss, with an accuracy of 92.4%. Based on vestibular data alone, discrimination between the different inner ear diseases is statistically possible, which supports different pathophysiologic changes in labyrinthine pathologies.
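As a hedged sketch (synthetic stand-in data, not the study's measurements), the two-parameter detection score can be illustrated by fitting a logistic model on caloric asymmetry and rotational time-constant asymmetry and reading off the correct-classification rate:

```python
# Logistic detection score combining caloric asymmetry and time-constant asymmetry.
# The patient/control counts match the study, but the feature values are synthetic.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(3)
n_patients, n_controls = 74, 22

X = np.vstack([
    np.column_stack([rng.normal(45, 20, n_patients),    # caloric asymmetry (%)
                     rng.normal(35, 20, n_patients)]),  # time-constant asymmetry (%)
    np.column_stack([rng.normal(8, 6, n_controls),
                     rng.normal(10, 8, n_controls)]),
])
y = np.r_[np.ones(n_patients), np.zeros(n_controls)]    # 1 = unilateral vestibular deficit

model = LogisticRegression().fit(X, y)
detection_score = model.predict_proba(X)[:, 1]          # per-subject detection score
print("correct classification rate:", (model.predict(X) == y).mean())
```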
Abstract:
uvby H-beta photometry has been obtained for a sample of 93 selected main-sequence A stars. The purpose was to determine accurate effective temperatures, surface gravities, and absolute magnitudes for an individual determination of ages and parallaxes, to be included in a more extensive work analyzing the kinematic properties of A V stars. Several calibrations and methods to determine the above-mentioned parameters have been reviewed, allowing the design of a new algorithm for their determination. The results obtained with this procedure were tested in a previous paper using uvby H-beta data from the Hauck and Mermilliod catalogue and comparing the resulting temperatures, surface gravities and absolute magnitudes with empirical determinations of these parameters.
Abstract:
In the traditional actuarial risk model, if the surplus is negative, the company is ruined and has to go out of business. In this paper we distinguish between ruin (negative surplus) and bankruptcy (going out of business), where the probability of bankruptcy is a function of the level of negative surplus. The idea for this notion of bankruptcy comes from the observation that in some industries, companies can continue doing business even though they are technically ruined. Assuming that dividends can only be paid with a certain probability at each point of time, we derive closed-form formulas for the expected discounted dividends until bankruptcy under a barrier strategy. Subsequently, the optimal barrier is determined, and several explicit identities for the optimal value are found. The surplus process of the company is modeled by a Wiener process (Brownian motion).
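A hedged Monte-Carlo sketch of the barrier-dividend problem described above (not the paper's closed-form solution): the surplus follows a Brownian motion with drift, any excess above the barrier is paid out as a dividend, and while the surplus is negative the company goes bankrupt at a rate that grows with the deficit. Paying dividends continuously at the barrier, and the chosen bankruptcy-rate function and parameter values, are simplifying assumptions.

```python
# Monte-Carlo estimate of expected discounted dividends until bankruptcy under a
# barrier strategy, and a crude scan for the optimal barrier.  All parameters are
# illustrative; the paper derives closed-form expressions instead.
import numpy as np

rng = np.random.default_rng(4)

def discounted_dividends(b, x0=5.0, mu=1.0, sigma=2.0, delta=0.05,
                         bankr_rate=0.5, dt=0.02, horizon=50.0):
    """One simulated path of discounted dividends until bankruptcy, barrier b."""
    x, t, total = x0, 0.0, 0.0
    while t < horizon:
        x += mu * dt + sigma * np.sqrt(dt) * rng.standard_normal()
        if x > b:                                   # barrier strategy: pay out the excess
            total += np.exp(-delta * t) * (x - b)
            x = b
        if x < 0 and rng.random() < bankr_rate * (-x) * dt:
            break                                   # bankruptcy only while surplus < 0
        t += dt
    return total

barriers = np.linspace(1.0, 15.0, 15)
values = [np.mean([discounted_dividends(b) for _ in range(300)]) for b in barriers]
print("approximately optimal barrier:", barriers[int(np.argmax(values))])
```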
Abstract:
Summary: Genetic parameters of racing performance measures describing breeding traits of trotters
Abstract:
Summary: Genetic parameters of growth traits in pigs estimated using a third-order polynomial function