951 results for Worst-case execution-time
Abstract:
Amyotrophic Lateral Sclerosis (ALS) is a neurodegenerative disease characterized by progressive muscle weakness that leads the patient to death, usually due to respiratory complications. As the disease progresses, the patient will require noninvasive ventilation (NIV) and constant monitoring. This paper presents a distributed architecture for homecare monitoring of nocturnal NIV in patients with ALS. The implementation of this architecture used single-board computers and mobile devices placed in patients' homes to display alert messages for caregivers, and a web server for remote monitoring by the healthcare staff. The architecture used software based on fuzzy logic and computer vision to capture data from a mechanical ventilator screen and generate alert messages with instructions for caregivers. The monitoring was performed on 29 patients for 7 continuous hours daily over 5 days, generating a total of 126,000 samples for each monitored variable at a sampling rate of one sample per second. The system was evaluated with respect to its character recognition hit rate and the correction of recognition errors through an error detection and correction algorithm. Furthermore, a healthcare team evaluated the time intervals at which the alert messages were generated and the correctness of those messages. The system showed an average hit rate of 98.72%, and 98.39% in the worst case. Regarding the messages generated, the system also agreed 100% with the overall assessment, with disagreement in only 2 cases from one of the physician evaluators.
Abstract:
For water depths greater than 60 m, floating wind turbines will become the most economical option for generating offshore wind energy. Tension-mooring-stabilised units are one type of platform being considered by the offshore wind energy industry. The complex mooring arrangement used by this type of platform means that the dynamics are greatly affected by offsets in the positioning of the anchors. This paper examines the issue of tendon anchor position tolerances. The dynamic effects of three positional tolerances are analysed in the survival state using the time-domain code FASTLink. The severe impact of worst-case anchor positional offsets on platform and turbine survivability is shown. The worst anchor misposition combinations are highlighted and should be strictly avoided. Novel methods to mitigate this issue are presented.
Stable carbon isotope ratios of carbon dioxide from EDC and Berkner Island ice cores for 40-50 ka BP
Abstract:
The stable carbon isotopic signature of carbon dioxide (d13CO2) measured in the air occlusions of polar ice provides important constraints on the carbon cycle in past climates. In order to exploit this information for previous glacial periods, one must use deep, clathrated ice, where the occluded air is preserved not in bubbles but in the form of air hydrates. Therefore, it must be established whether the original atmospheric d13CO2 signature can be reconstructed from clathrated ice. We present a comparative study using coeval bubbly ice from Berkner Island and ice from the bubble-clathrate transformation zone (BCTZ) of EPICA Dome C (EDC). In the EDC samples the gas is partitioned into clathrates and remaining bubbles, as shown by erroneously low and scattered CO2 concentration values, presenting a worst-case test for d13CO2 reconstructions. Even so, the reconstructed atmospheric d13CO2 values show only slightly larger scatter. The difference from the data for coeval bubbly ice is statistically significant. However, the 0.16 per mil magnitude of the offset is small for practical purposes, especially in light of uncertainty from non-uniform corrections for diffusion-related fractionation that could contribute to the discrepancy. Our results are promising for palaeo-atmospheric studies of d13CO2 using a ball-mill dry extraction technique below the BCTZ of ice cores, where gas is not subject to fractionation into microfractures and between clathrate and bubble reservoirs.
Abstract:
This paper describes a fast integer sorting algorithm, herein referred to as Bit-index sort, which does not use comparisons and is intended to sort partial permutations. Experimental results exhibit linear-order complexity in execution time. Bit-index sort uses a bit-array to classify input sequences of distinct integers, and exploits built-in bit functions in C compilers, supported by machine hardware, to retrieve the ordered output sequence. Results show that Bit-index sort outperforms the quicksort and counting sort algorithms when compared on execution time. A parallel approach to Bit-index sort using two simultaneous threads is also included, which obtains further speedups of up to 1.6x compared to the sequential case.
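As an illustration only (the paper's reference implementation is not reproduced here), a minimal C sketch of the two passes described above, classification into a bit-array followed by ordered retrieval via a compiler built-in, could look as follows; the function name, the use of __builtin_ctzll, and the assumption of distinct keys below a known range are all assumptions for this sketch:

    #include <stdint.h>
    #include <stddef.h>
    #include <string.h>

    /* Hypothetical sketch of a bit-index sort for distinct integers in [0, range). */
    void bit_index_sort(const uint32_t *in, size_t n, uint32_t range, uint32_t *out)
    {
        size_t words = (range + 63) / 64;
        uint64_t bits[words];                     /* one bit per possible key (C99 VLA) */
        memset(bits, 0, words * sizeof(uint64_t));

        /* Classification pass: mark each input key in the bit-array. */
        for (size_t i = 0; i < n; i++)
            bits[in[i] >> 6] |= 1ULL << (in[i] & 63);

        /* Retrieval pass: scan the words and emit set bits in increasing order,
         * using the compiler's built-in count-trailing-zeros function. */
        size_t k = 0;
        for (size_t w = 0; w < words; w++) {
            uint64_t word = bits[w];
            while (word) {
                unsigned b = (unsigned)__builtin_ctzll(word);
                out[k++] = (uint32_t)(w * 64 + b);
                word &= word - 1;                 /* clear the lowest set bit */
            }
        }
    }

Both passes touch each key or bit-word once, which is consistent with the linear execution-time behaviour reported above.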
Abstract:
A series of related research studies over 15 years assessed the effects of prawn trawling on sessile megabenthos in the Great Barrier Reef, to support management for sustainable use in the World Heritage Area. These large-scale studies estimated impacts on benthos (particularly removal rates per trawl pass), monitored subsequent recovery rates, measured the natural dynamics of tagged megabenthos, mapped the regional distribution of seabed habitats and benthic species, and integrated these results in a dynamic modelling framework together with spatio-temporal fishery effort data and simulated management. Typical impact rates were between 5 and 25% per trawl, recovery times ranged from several years to several decades, and most sessile megabenthos were naturally distributed in areas where little or no trawling occurred and so had low exposure to trawling. The model simulated trawl impact and recovery on the mapped species distributions, and estimated the regional-scale cumulative changes due to trawling as a time series of status for megabenthos species. The regional status of these taxa at the time of greatest depletion ranged from ∼77% relative to pre-trawl abundance for the worst-case species, which had slow recovery and moderate exposure to trawling, to ∼97% for the least affected taxon. The model also evaluated the expected outcomes for sessile megabenthos in response to major management interventions implemented between 1999 and 2006, including closures, effort reductions, and protected areas. As a result of these interventions, all taxa were predicted to recover (by 2-14% at 2025), with the most affected species having relatively greater recovery. Effort reductions made the biggest positive contributions to benthos status for all taxa, with closures making smaller contributions for some taxa. The results demonstrated that management actions have arrested and reversed previous unsustainable trends for all taxa assessed, and have led to a prawn trawl fishery with improved environmental sustainability.
Abstract:
Reconfigurable HW can be used to build a hardware multitasking system where tasks are assigned to the reconfigurable HW at run-time according to the requirements of the running applications. Normally, execution in this kind of system is controlled by an embedded processor. In these systems tasks are frequently represented as subtask graphs, where a subtask is the basic scheduling unit that can be assigned to a reconfigurable HW unit. In order to control the execution of these tasks, the processor must manage complex data structures at run-time, such as graphs or linked lists, which may generate significant execution-time penalties. In addition, HW/SW communications are frequently a system bottleneck. Hence, it is highly desirable to find a way to reduce the run-time SW computations and the HW/SW communications. To this end we have developed a HW execution manager that controls the execution of subtask graphs over a set of reconfigurable units. This manager receives as input a subtask graph coupled with a subtask schedule, and guarantees its proper execution. In addition, it includes support to reduce the execution-time overhead due to reconfigurations. With this HW support, the execution of task graphs can be managed efficiently, generating only very small run-time penalties.
Abstract:
Over the last century, mathematical optimization has become a prominent tool for decision making. Its systematic application in practical fields such as economics, logistics or defense led to the development of algorithmic methods with ever-increasing efficiency. Indeed, for a variety of real-world problems, finding an optimal decision among a set of (implicitly or explicitly) predefined alternatives has become conceivable in reasonable time. In the last decades, however, the research community has paid more and more attention to the role of uncertainty in the optimization process. In particular, one may question the notion of optimality, and even feasibility, when studying decision problems with unknown or imprecise input parameters. This concern is even more critical in a world becoming more and more complex, by which we mean interconnected, where each individual variation inside a system inevitably causes other variations in the system itself. In this dissertation, we study a class of optimization problems which suffer from imprecise input data and feature a two-stage decision process, i.e., where decisions are made in a sequential order, called stages, and where unknown parameters are revealed throughout the stages. Applications of such problems abound in practical fields such as, e.g., facility location with uncertain demands, transportation with uncertain costs, or scheduling under uncertain processing times. The uncertainty is dealt with from a robust optimization (RO) viewpoint (also known as the "worst-case perspective"), and we present original contributions to the RO literature on both the theoretical and the practical side.
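For context, a generic two-stage robust optimization problem of this kind can be written in a standard textbook form (not necessarily the exact formulation used in the dissertation) as

    \min_{x \in X} \left( c^{\top} x + \max_{\xi \in \Xi} \; \min_{y \in Y(x,\xi)} q(\xi)^{\top} y \right),

where x collects the first-stage (here-and-now) decisions, \Xi is the uncertainty set over which the unknown parameters \xi range, and y collects the second-stage (wait-and-see) decisions taken after \xi is revealed.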
Abstract:
This paper presents an Adaptive Maximum Entropy (AME) approach for modeling biological species. The Maximum Entropy algorithm (MaxEnt) is one of the most widely used methods for modeling the geographical distribution of biological species. The approach presented here is an alternative to the classical algorithm. Instead of using the same feature set throughout training, the AME approach tries to insert or remove a single feature at each iteration. The aim is to reach convergence faster without affecting the performance of the generated models. The preliminary experiments performed well, showing improvements in both accuracy and execution time. Comparisons with other algorithms are beyond the scope of this paper. Several important research directions are proposed as future work.
Abstract:
A two-dimensional numerical simulator is developed to predict the nonlinear, convective-reactive oxygen mass exchange in a cross-flow hollow-fiber blood oxygenator. The simulator also calculates the carbon dioxide mass exchange, as hemoglobin affinity to oxygen is affected by the local pH value, which depends mostly on the local carbon dioxide content in blood. Blood pH inside the oxygenator is calculated by the simultaneous solution of an equation that takes into account the blood buffering capacity and the classical Henderson-Hasselbalch equation. The modeling of the mass transfer conductance in the blood comprises a global factor, which is a function of the Reynolds number, and a local factor, which takes into account the amount of oxygen reacted with hemoglobin. The simulator is calibrated against experimental data for an in-line fiber bundle. The results are: (i) the calibration process allows the precise determination of the mass transfer conductance for both oxygen and carbon dioxide; (ii) very alkaline pH values occur in the blood path at the gas inlet side of the fiber bundle; (iii) the parametric analysis of the effect of the blood base excess (BE) shows that V(CO2) is similar for blood metabolic alkalosis, metabolic acidosis, or normal BE at a similar blood inlet P(CO2), although metabolic alkalosis is the worst case, as the pH in the vicinity of the gas inlet is the most alkaline; (iv) the parametric analysis of the effect of the gas flow to blood flow ratio (Q(G)/Q(B)) shows that the variation of V(CO2) with the gas flow is almost linear up to Q(G)/Q(B) = 2.0. V(O2) is not affected by the gas flow: increasing the gas flow up to eightfold increases V(O2) by only 1%. The mass exchange of carbon dioxide uses the full length of the hollow fiber only if Q(G)/Q(B) > 2.0, since only in this condition do the local variations of pH and blood P(CO2) span the whole fiber bundle.
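For reference, the classical Henderson-Hasselbalch relation for the bicarbonate buffer system in blood is commonly written as (the additional buffering-capacity equation solved alongside it is not reproduced in the abstract)

    \mathrm{pH} = pK_a + \log_{10}\!\left( \frac{[\mathrm{HCO}_3^-]}{\alpha_{\mathrm{CO}_2}\, P_{\mathrm{CO}_2}} \right), \qquad pK_a \approx 6.1, \quad \alpha_{\mathrm{CO}_2} \approx 0.03\ \mathrm{mmol\,L^{-1}\,mmHg^{-1}},

so that, for a given bicarbonate concentration, a higher local P(CO2) lowers the local pH and thereby shifts hemoglobin's oxygen affinity.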
Abstract:
The depletion of zeta-cypermethrin residues in bovine tissues and milk was studied. Beef cattle were treated three times at 3-week intervals with 1 ml per 10 kg body weight of a 25 g litre(-1) or 50 g litre(-1) pour-on formulation (2.5 and 5.0 mg zeta-cypermethrin kg(-1) body weight) or a 100 mg kg(-1) spray to simulate a likely worst-case treatment regime. Friesian and Jersey dairy cows were treated once with 2.5 mg zeta-cypermethrin kg(-1) in a pour-on formulation. Muscle, liver and kidney residue concentrations were generally less than the limit of detection (LOD = 0.01 mg kg(-1)). Residues in renal-fat and back-fat samples from animals treated with 2.5 mg kg(-1) all exceeded the limit of quantitation (LOQ = 0.05 mg kg(-1)), peaking at 10 days after treatment. Only two of five kidney-fat samples were above the LOQ after 34 days, but none of the back-fat samples exceeded the LOQ at 28 days after treatment. Following spray treatments, fat residues were detectable in some animals but were below the LOQ at all sampling intervals. Zeta-cypermethrin was quantifiable (LOQ = 0.01 mg kg(-1)) in only one whole-milk sample from the Friesian cows (0.015 mg kg(-1), 2 days after treatment). In whole milk from Jersey cows, the mean concentration of zeta-cypermethrin peaked 1 day after treatment, at 0.015 mg kg(-1), and the highest individual sample concentration was 0.025 mg kg(-1) at 3 days after treatment. Residues in milk were not quantifiable from 4 days after treatment onwards. The mean concentrations of zeta-cypermethrin in milk fat from Friesian and Jersey cows peaked 2 days after treatment at 0.197 mg kg(-1) and 0.377 mg kg(-1), respectively, and the highest individual sample concentrations, also at 2 days after treatment, were 0.47 mg kg(-1) and 0.98 mg kg(-1), respectively.
Abstract:
Objective: To measure the residues of spinosad and chlorhexidine in the tissues of sheep after treatment of blowfly strike. Procedure: Fourteen sheep with natural myiasis and 12 with artificial infestations of Lucilia cuprina larvae had the wool removed over their infestations and were treated with an aerosol wound dressing containing spinosad and chlorhexidine. Sheep were killed up to 14 days after treatment and residues of the chemicals were measured in tissues. Results: Chlorhexidine was not detected in any tissue. Residues of spinosad were highest in fat, lowest in muscle, and intermediate in liver and kidney. The highest residue detected was 0.2 mg/kg spinosad in perirenal fat 7 days after generous treatment of a sheep with a large fly strike. Residues of spinosad in fat peaked 3 to 7 days after treatment, and 1 to 3 days after treatment in liver and kidney. Conclusion: These studies represent a realistic worst case in struck sheep; at the highest dose studied, equivalent to 5.8 mg spinosad per kg body weight, the maximum residue detected (0.2 mg/kg in perirenal fat) was 20% of the Australian maximum residue limit. Muscle, liver and kidney residues of spinosad were also below the Australian maximum residue limits at all times.
Abstract:
Some efficient solution techniques for solving models of noncatalytic gas-solid and fluid-solid reactions are presented. These models include those with non-constant diffusivities, for which the formulation reduces to that of a convection-diffusion problem. A singular perturbation problem results for such models in the presence of a large Thiele modulus, for which classical numerical methods can present difficulties. For the convection-diffusion-like case, the time-dependent partial differential equations are transformed by a semi-discrete Petrov-Galerkin finite element method into a system of ordinary differential equations of the initial-value type that can be readily solved. With a constant diffusivity in slab geometry, the convection-like terms are absent, and the combination of a fitted-mesh finite difference method with a predictor-corrector method is used to solve the problem. Both methods are found to converge, and general reaction rate forms can be treated. These methods are simple and highly efficient for arbitrary particle geometry and parameters, including a large Thiele modulus.
Abstract:
The sensitivity of the output of a linear operator to its input can be quantified in various ways. In Control Theory, the input is usually interpreted as a disturbance and the output is to be minimized in some sense. In stochastic worst-case design settings, the disturbance is considered random with an imprecisely known probability distribution. The prior set of probability measures can be chosen so as to quantify how far the disturbance deviates from the white-noise hypothesis of Linear Quadratic Gaussian control. Such deviation can be measured by the minimal Kullback-Leibler informational divergence from the Gaussian distributions with zero mean and scalar covariance matrices. The resulting anisotropy functional is defined for finite-power random vectors. Originally, anisotropy was introduced for directionally generic random vectors as the relative entropy of the normalized vector with respect to the uniform distribution on the unit sphere. The associated a-anisotropic norm of a matrix is then its maximum root mean square or average energy gain with respect to finite-power or directionally generic inputs whose anisotropy is bounded above by a ≥ 0. We give a systematic comparison of the anisotropy functionals and the associated norms. These are considered for unboundedly growing fragments of homogeneous Gaussian random fields on a multidimensional integer lattice to yield the mean anisotropy. Correspondingly, the anisotropic norms of finite matrices are extended to bounded linear translation-invariant operators over such fields.
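For orientation (this is a commonly cited form from the anisotropy-based control literature, not a quotation from the abstract), the anisotropy of an m-dimensional finite-power random vector w with probability density f can be written as

    \mathbf{A}(w) = \min_{\lambda > 0} D\big(f \,\|\, p_{m,\lambda}\big) = \frac{m}{2}\,\ln\!\left( \frac{2\pi e}{m}\, \mathbf{E}\,|w|^{2} \right) - h(w),

where p_{m,\lambda} denotes the Gaussian density with zero mean and scalar covariance matrix \lambda I_m and h(w) is the differential entropy of w. The associated a-anisotropic norm of a matrix F is then the largest root-mean-square gain over inputs whose anisotropy does not exceed a:

    \|F\|_{a} = \sup\left\{ \sqrt{\mathbf{E}\,|Fw|^{2}} \,\Big/\, \sqrt{\mathbf{E}\,|w|^{2}} \;:\; \mathbf{A}(w) \le a \right\}.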
Abstract:
Earthquakes and tsunamis along Morocco's coasts have been reported since historical times. The threat posed by tsunamis must therefore be included in coastal risk studies. This study focuses on the tsunami impact and vulnerability assessment of the Casablanca harbour and surrounding area using a combination of tsunami inundation numerical modelling, field survey data and a geographic information system. The tsunami scenario used here is compatible with the 1755 Lisbon event, which we consider to be the worst-case tsunami scenario. Hydrodynamic modelling was performed with an adapted version of the Cornell Multigrid Coupled Tsunami Model from Cornell University. The simulation covers the eastern domain of the Azores-Gibraltar fracture zone, corresponding to the largest tsunamigenic area in the North Atlantic. The proposed vulnerability model attempts to provide insight into the tsunami vulnerability of the building stock. The results, in the form of a vulnerability map, will be useful for decision makers and local authorities in building community resilience to tsunami hazards.
Abstract:
This paper presents an artificial neural network applied to the forecasting of electricity market prices, with the special feature of being dynamic. The dynamism operates at two different levels. The first level is the re-training of the network in every iteration, so that the artificial neural network is able to consider the most recent data at all times and constantly adapt itself to the most recent events. The second level is the adaptation of the neural network's execution time depending on the circumstances of its use. The execution-time adaptation is performed through the automatic adjustment of the amount of data considered for training the network. This is an advantageous and indispensable feature for this neural network's integration in ALBidS (Adaptive Learning strategic Bidding System), a multi-agent system that has the purpose of providing decision support to the market negotiating players of MASCEM (Multi-Agent Simulator of Competitive Electricity Markets).
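As a minimal sketch of the second adaptation level only (the function name, the linear time model, and all parameters are illustrative assumptions, not ALBidS code), the training window could be resized so that the next re-training fits an execution-time budget:

    #include <stddef.h>

    /* Hypothetical helper: choose the number of training samples for the next
     * re-training so that it roughly fits the available time budget, assuming
     * training time grows approximately linearly with the number of samples. */
    size_t adapt_training_window(size_t current_window, double last_train_seconds,
                                 double budget_seconds, size_t min_window,
                                 size_t max_window)
    {
        if (last_train_seconds <= 0.0)            /* no measurement yet: keep window */
            return current_window;

        double scale = budget_seconds / last_train_seconds;
        double proposed = (double)current_window * scale;

        if (proposed < (double)min_window) proposed = (double)min_window;
        if (proposed > (double)max_window) proposed = (double)max_window;
        return (size_t)proposed;
    }

Under these assumptions, a tighter time budget shrinks the training window (less data, faster re-training), while a looser budget lets the network train on more recent history.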