994 results for reasonable time


Relevance: 60.00%

Abstract:

Over the last century, mathematical optimization has become a prominent tool for decision making. Its systematic application in practical fields such as economics, logistics or defense led to the development of algorithmic methods with ever increasing efficiency. Indeed, for a variety of real-world problems, finding an optimal decision among a set of (implicitly or explicitly) predefined alternatives has become conceivable in reasonable time. In recent decades, however, the research community has paid increasing attention to the role of uncertainty in the optimization process. In particular, one may question the notion of optimality, and even of feasibility, when studying decision problems with unknown or imprecise input parameters. This concern is even more critical in a world that is becoming ever more complex, by which we mean interconnected, where each individual variation inside a system inevitably causes other variations in the system itself. In this dissertation, we study a class of optimization problems that suffer from imprecise input data and feature a two-stage decision process, i.e., where decisions are made in a sequential order (called stages) and where unknown parameters are revealed throughout the stages. Applications of such problems abound in practice, e.g., facility location with uncertain demands, transportation with uncertain costs, or scheduling under uncertain processing times. The uncertainty is handled from a robust optimization (RO) viewpoint (also known as the "worst-case perspective"), and we present original contributions to the RO literature on both the theoretical and the practical side.
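For orientation (a generic textbook formulation, not taken from the dissertation itself), a two-stage robust problem with here-and-now decisions x, uncertain parameters \(\xi\) from an uncertainty set \(\Xi\), and wait-and-see recourse decisions y can be written as

\[
\min_{x \in X} \; c^{\top} x \;+\; \max_{\xi \in \Xi} \; \min_{y \in Y(x,\xi)} \; q(\xi)^{\top} y ,
\]

where the inner maximization models the worst-case realization of the uncertain parameters and the innermost minimization the best recourse once \(\xi\) has been revealed.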

Relevance: 40.00%

Abstract:

The Invariance Thesis of Slot and van Emde Boas states that a time (respectively, space) cost model is reasonable for a computational model C if there are mutual simulations between Turing machines and C such that the overhead is polynomial in time (respectively, linear in space). The rationale is that under the Invariance Thesis, complexity classes such as LOGSPACE, P and PSPACE become robust, i.e., machine-independent. In this dissertation, we want to find out whether it is possible to define a reasonable space cost model for the lambda-calculus, the paradigmatic model of functional programming languages. We start by considering an unusual evaluation mechanism for the lambda-calculus, based on Girard's Geometry of Interaction, that was conjectured to be the key ingredient for obtaining a reasonable space cost model. Through a fine complexity analysis of this scheme, based on new variants of non-idempotent intersection types, we disprove this conjecture. Then, we change the target of our analysis. We consider a variant of Krivine's abstract machine, a standard evaluation mechanism for the call-by-name lambda-calculus, optimized for space complexity and implemented without any pointer. A fine analysis of the execution of (a refined version of) the encoding of Turing machines into the lambda-calculus allows us to conclude that the space consumed by this machine is indeed a reasonable space cost model. In particular, for the first time we are able to measure sub-linear space complexities. Moreover, we transfer this result to the call-by-value case. Finally, we also provide an intersection type system that compositionally characterizes this new reasonable space measure. This is done through a minimal, yet non-trivial, modification of the original de Carvalho type system.
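In symbols (the standard reading of the thesis, stated here only for orientation): a cost model assigning time \(T_C\) and space \(S_C\) to programs of a computational model \(C\) is reasonable if there are simulations of \(C\) by Turing machines, and of Turing machines by \(C\), whose overheads satisfy

\[
T_{\mathrm{TM}} = \mathrm{poly}(T_C) \qquad \text{and} \qquad S_{\mathrm{TM}} = O(S_C),
\]

so that classes such as LOGSPACE, P and PSPACE coincide whether they are defined on Turing machines or on \(C\).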

Relevance: 30.00%

Abstract:

The Random Parameter model was proposed to explain the structure of the covariance matrix in problems where most, but not all, of the eigenvalues of the covariance matrix can be explained by Random Matrix Theory. In this article, we explore the scaling properties of the model, as observed in the multifractal structure of the simulated time series. We use the Wavelet Transform Modulus Maxima technique to obtain the dependence of the multifractal spectrum on the parameters of the model. The model shows a scaling structure compatible with the stylized facts for a reasonable choice of the parameter values.
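For context, the standard WTMM construction (not reproduced from the article) estimates the scaling exponents \(\tau(q)\) from a partition function built on the maxima lines \(\mathcal{L}(a)\) of the continuous wavelet transform \(Wf\) at scale \(a\),

\[
Z(q,a) \;=\; \sum_{\ell \in \mathcal{L}(a)} \Big( \sup_{(x,a') \in \ell,\; a' \le a} |Wf(x,a')| \Big)^{q} \;\sim\; a^{\tau(q)},
\]

and recovers the multifractal spectrum by the Legendre transform \(D(h) = \min_{q}\big(qh - \tau(q)\big)\).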

Relevance: 30.00%

Abstract:

Proceedings of the Information Technology Applications in Biomedicine, Ioannina - Epirus, Greece, October 26-28, 2006

Relevance: 30.00%

Abstract:

A Work Project, presented as part of the requirements for the award of a Master's degree in Management from the NOVA School of Business and Economics

Relevance: 30.00%

Abstract:

The efficiency of four sanitizers (peracetic acid, chlorhexidine, quaternary ammonium, and organic acids) was tested in this work against bacteria recognized as problems for the meat industry: Salmonella sp., S. aureus, E. coli and L. monocytogenes. The effects of sanitizer concentration (0.2, 0.5, 0.6, 1.0, 1.1 and 1.4%), temperature (10 and 45 °C) and contact time (2, 10, 15, 18 and 25 minutes) were evaluated. Tests in an industrial plant were also carried out based on the previously obtained results. In general, peracetic acid presented the highest efficiency at low concentration (0.2%) and short contact time (2 minutes) at 10 °C. The tests performed at industrial scale showed that peracetic acid performed well at a concentration and contact time lower than those suggested by the suppliers. Chlorhexidine and quaternary ammonium led to reasonable results at the indicated conditions, whereas organic acids were ineffective against Staphylococcus aureus even at concentrations and contact times higher than those indicated by the suppliers. Overall, the results show that the choice of the most adequate sanitizer depends on the contaminating microorganism, the time available for sanitizer application, and the process cost.

Relevance: 30.00%

Abstract:

The term reliability of an equipment or device usually indicates the probability that it carries out the functions expected of it adequately, without failure and within specified performance limits, at a given age and for a desired mission time, when used under the designated application and operating environmental stress. The approaches employed in reliability studies can be broadly classified as probabilistic and deterministic: the former seeks to devise tools and methods that identify the random mechanism governing the failure process within a proper statistical framework, while the latter addresses the question of finding the causes of failure and the steps that reduce individual failures, thereby enhancing reliability. In the probabilistic approach, to which the present study subscribes, the concept of a life distribution, a mathematical idealisation that describes the failure times, is fundamental, and a basic question a reliability analyst has to settle is the form of the life distribution. It is for this reason that a major share of the literature on the mathematical theory of reliability focuses on methods for arriving at reasonable models of failure times and on identifying the failure patterns that induce such models. The application of lifetime distributions is not confined to assessing the endurance of equipment and systems; it ranges over a wide variety of scientific investigations in which the word lifetime may not refer to the length of life in the literal sense, but can be conceived in its most general form as a non-negative random variable. Thus, the tools developed in connection with modelling lifetime data have found applications in other areas of research such as actuarial science, engineering, biomedical sciences, economics, and extreme value theory.
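For reference, the standard objects behind these statements (textbook definitions, not quoted from the thesis): if the lifetime \(T\) is a non-negative random variable with distribution function \(F\) and density \(f\), the reliability (survival) function and the hazard rate are

\[
R(t) \;=\; P(T > t) \;=\; 1 - F(t),
\qquad
h(t) \;=\; \frac{f(t)}{R(t)},
\]

and choosing the form of \(F\) (exponential, Weibull, gamma, ...) is precisely the modelling question the abstract refers to.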

Relevance: 30.00%

Abstract:

Time-resolved kinetic studies of the reaction of silylene, SiH2, generated by laser flash photolysis of phenylsilane, have been carried out to obtain rate constants for its bimolecular reaction with NO. The reaction was studied in the gas phase over the pressure range 1-100 Torr in SF6 bath gas at five temperatures in the range 299-592 K. The second-order rate constants at 10 Torr fitted the Arrhenius equation log(k / cm^3 molecule^-1 s^-1) = (-11.66 ± 0.01) + (6.20 ± 0.10) kJ mol^-1 / (RT ln 10). The rate constants showed a variation with pressure of a factor of ca. 2 over the available range, almost independent of temperature. The data could not be fitted by RRKM calculations to a simple third-body-assisted association reaction alone. However, a mechanistic model with an additional (pressure-independent) side channel gave a reasonable fit to the data. Ab initio calculations at the G3 level supported a mechanism in which the initial adduct, bent H2SiNO, can ring-close to form cyclo-H2SiNO, which is partially collisionally stabilized. In addition, bent H2SiNO can undergo a low-barrier isomerization reaction leading, via a sequence of steps, ultimately to dissociation products, of which the lowest-energy pair are NH2 + SiO. The rate-controlling barrier for this latter pathway is only 16 kJ mol^-1 below the energy of SiH2 + NO. This is consistent with the kinetic findings. A particular outcome of this work is that, despite the pressure dependence and the effects of the secondary barrier (in the side reaction), the initial encounter of SiH2 with NO occurs at the collision rate. Thus, silylene can be as reactive with odd-electron molecules as with many even-electron species. Some comparisons are drawn with the reactions of CH2 + NO and SiCl2 + NO.
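As a quick worked check (my own arithmetic based on the 10 Torr fit quoted above, not a figure from the abstract), at \(T = 300\) K with \(R = 8.314 \times 10^{-3}\) kJ mol\(^{-1}\) K\(^{-1}\):

\[
\log k \;=\; -11.66 \;+\; \frac{6.20}{(8.314\times10^{-3})(300)\,\ln 10} \;\approx\; -11.66 + 1.08 \;=\; -10.58,
\]

i.e. \(k \approx 2.6 \times 10^{-11}\) cm\(^3\) molecule\(^{-1}\) s\(^{-1}\) at 10 Torr and 300 K; the positive sign of the second term corresponds to a negative activation energy of about -6.2 kJ mol\(^{-1}\).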

Relevance: 30.00%

Abstract:

Time-resolved kinetic studies of the reaction of germylene, GeH2, generated by laser flash photolysis of 3,4-dimethyl-1-germacyclopent-3-ene, have been carried out to obtain rate constants for its bimolecular reaction with 2-butyne, CH3C≡CCH3. The reaction was studied in the gas phase over the pressure range 1-100 Torr in SF6 bath gas, at five temperatures in the range 300-556 K. The second-order rate constants obtained by extrapolation to the high-pressure limit at each temperature fitted the Arrhenius equation log(k∞ / cm^3 molecule^-1 s^-1) = (-10.46 ± 0.06) + (5.16 ± 0.47) kJ mol^-1 / (RT ln 10). Calculations of the energy surface of the GeC4H8 reaction system were carried out employing the additivity principle, by combining previous quantum chemical calculations of related reaction systems. These support the formation of 1,2-dimethylvinylgermylene (rather than 2,3-dimethylgermirene) as the end product. RRKM calculations of the pressure dependence of the reaction are in reasonable agreement with this finding. The reactions of GeH2 with C2H2 and with CH3C≡CCH3 are compared and contrasted.

Relevance: 30.00%

Abstract:

A near real-time flood detection algorithm giving a synoptic overview of the extent of flooding in both urban and rural areas, and capable of working during night-time and day-time even if cloud was present, could be a useful tool for operational flood relief management. The paper describes an automatic algorithm using high resolution Synthetic Aperture Radar (SAR) satellite data that builds on existing approaches, including the use of image segmentation techniques prior to object classification to cope with the very large number of pixels in these scenes. Flood detection in urban areas is guided by the flood extent derived in adjacent rural areas. The algorithm assumes that high resolution topographic height data are available for at least the urban areas of the scene, in order that a SAR simulator may be used to estimate areas of radar shadow and layover. The algorithm proved capable of detecting flooding in rural areas using TerraSAR-X with good accuracy, and in urban areas with reasonable accuracy. The accuracy was reduced in urban areas partly because of TerraSAR-X’s restricted visibility of the ground surface due to radar shadow and layover.
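To make the ingredients concrete, here is a deliberately simplified sketch (my own illustration with hypothetical thresholds and helper names, not the TerraSAR-X algorithm of the paper, which classifies segmented image objects rather than single pixels):

```python
# Minimal, hypothetical sketch of SAR flood mapping with shadow/layover
# masking. It only illustrates the ingredients named in the abstract: a
# low-backscatter water test, exclusion of radar shadow/layover predicted by
# a SAR simulator + DEM, and urban detection guided by rural flooding.
import numpy as np
from scipy.ndimage import binary_dilation

def flood_mask(sigma0_db, shadow_layover, rural, threshold_db=-14.0,
               guidance_radius_px=20):
    """Return a boolean flood mask.

    sigma0_db         : 2-D calibrated backscatter image (dB); smooth open
                        water is a specular, low-backscatter surface.
    shadow_layover    : True where the SAR simulator predicts the ground is
                        invisible (radar shadow or layover).
    rural             : True for rural pixels.
    threshold_db      : assumed water/land threshold (scene dependent).
    guidance_radius_px: assumed search distance around rural flooding.
    """
    water = (sigma0_db < threshold_db) & ~shadow_layover
    rural_flood = water & rural
    # Simple stand-in for "urban flooding guided by rural flooding": accept
    # urban water pixels only near already-detected rural flooding.
    near_rural = binary_dilation(rural_flood, iterations=guidance_radius_px)
    urban_flood = water & ~rural & near_rural
    return rural_flood | urban_flood

# Synthetic example
rng = np.random.default_rng(0)
sigma0 = rng.normal(-8.0, 2.0, size=(200, 200))                  # land
sigma0[60:140, 20:160] = rng.normal(-18.0, 1.5, size=(80, 140))  # water
shadow = np.zeros(sigma0.shape, dtype=bool)
rural = np.zeros(sigma0.shape, dtype=bool)
rural[:, :100] = True
print("flooded fraction:", flood_mask(sigma0, shadow, rural).mean())
```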

Relevance: 30.00%

Abstract:

A near real-time flood detection algorithm giving a synoptic overview of the extent of flooding in both urban and rural areas, and capable of working during night-time and day-time even if cloud was present, could be a useful tool for operational flood relief management and flood forecasting. The paper describes an automatic algorithm using high resolution Synthetic Aperture Radar (SAR) satellite data that assumes that high resolution topographic height data are available for at least the urban areas of the scene, in order that a SAR simulator may be used to estimate areas of radar shadow and layover. The algorithm proved capable of detecting flooding in rural areas using TerraSAR-X with good accuracy, and in urban areas with reasonable accuracy.

Relevance: 30.00%

Abstract:

Time-of-flight measurements were carried out in orthorhombic sulfur for various fields, ranging from -2 to -20 kV/cm. No dependence of the mobility on the electric field was found, but the current, normalized by the initial current, showed an electric field dependence at short times, decaying faster for larger electric fields. After the usual models, including the assumption of a depth-dependent density of traps, failed to explain the results, a model assuming an extra mobility channel near the surface provided a reasonable set of parameters independent of the electric field. The measurements were carried out at 8.5, 29, 53, 68, and 79 °C.
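For context, the standard relation behind a time-of-flight mobility measurement (not quoted from the article): with sample thickness \(L\), applied field \(E\) and measured transit time \(t_T\), the drift mobility is

\[
\mu \;=\; \frac{v_d}{E} \;=\; \frac{L}{t_T \, E}.
\]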

Relevance: 30.00%

Abstract:

PURPOSE: To verify whether the number of chewing strokes and the chewing time are influenced by dentofacial deformities during habitual free mastication. METHODS: Participants were 15 patients with a diagnosis of class II dentofacial deformity (GII), 15 with class III (GIII), and 15 healthy control individuals with no deformity (CG). Free habitual mastication of a cornstarch cookie was analyzed, considering the number of chewing strokes and the time needed to complete two mastications. Strokes were counted by observing the opening and closing movements of the mandible. The time needed to consume each bite was determined with a digital chronometer, started after the placement of the food in the oral cavity and stopped when each portion was swallowed. RESULTS: There were no differences between groups regarding either the number of strokes or the chewing time. However, with regard to the number of strokes, CG and GII presented significant concordance between the first and the second chewing situation, which was not observed in GIII. The analysis of time showed significant concordance between the first and second chewing situation in CG, reasonable concordance in GII, and discordance in GIII. CONCLUSION: Dentofacial deformities do not influence the number of chewing strokes or the chewing time. However, class III individuals do not show uniformity regarding these aspects.

Relevance: 30.00%

Abstract:

Heart rate variability (HRV) and cardiorespiratory coordination, i.e. the temporal interplay between oscillations of heartbeat and respiration, reflect information related to the cardiovascular and autonomic nervous systems. The purpose of this study was to investigate the relationship between spectral measures of HRV and measures of cardiorespiratory coordination. In 127 subjects from a normal population, a 24-h Holter ECG was recorded. Average heart rate (HR) and the following HRV parameters were calculated: very low (VLF), low (LF) and high frequency (HF) oscillations and LF/HF. Cardiorespiratory coordination was quantified using the average respiratory rate (RespR), the ratio of heart rate to respiratory rate (HRR), the phase coordination ratio (PCR) and the extent of cardiorespiratory coordination (PP). Pearson's correlation coefficient r was used to quantify the relationship between each pair of variables across all subjects. HR and HRR correlated most strongly during daytime (r = 0.89). LF/HF and PP showed a negative correlation to a reasonable degree (r = -0.69). During nighttime sleep these correlations decreased, whereas the correlations between HRR and RespR (r = -0.47) and between HRR and PCR (r = 0.73) increased substantially. In conclusion, HRR and PCR deliver considerably different information compared to HRV measures, whereas PP is partially linked reciprocally to LF/HF.
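As a generic illustration of how such frequency-domain HRV measures are obtained (standard band limits; this is a sketch, not the study's own analysis pipeline, and the parameter choices are assumptions):

```python
# Frequency-domain HRV sketch: resample the RR tachogram to a uniform grid,
# estimate its power spectral density, and integrate the VLF/LF/HF bands.
import numpy as np
from scipy.interpolate import interp1d
from scipy.signal import welch

def hrv_band_powers(rr_s, fs=4.0):
    """rr_s: RR intervals in seconds, in temporal order."""
    rr_s = np.asarray(rr_s, dtype=float)
    t = np.cumsum(rr_s)                          # beat occurrence times
    grid = np.arange(t[0], t[-1], 1.0 / fs)      # uniform 4 Hz time grid
    rr_even = interp1d(t, rr_s, kind="cubic")(grid)
    f, psd = welch(rr_even - rr_even.mean(), fs=fs,
                   nperseg=min(256, len(rr_even)))
    df = f[1] - f[0]

    def band_power(lo, hi):
        return float(psd[(f >= lo) & (f < hi)].sum() * df)

    vlf = band_power(0.003, 0.04)
    lf = band_power(0.04, 0.15)
    hf = band_power(0.15, 0.40)
    return {"HR": 60.0 / rr_s.mean(), "VLF": vlf, "LF": lf,
            "HF": hf, "LF/HF": lf / hf}

# Synthetic RR series with a respiratory (HF band) modulation
rng = np.random.default_rng(1)
beats = np.arange(600)
rr = 0.8 + 0.04 * np.sin(2 * np.pi * 0.25 * beats) + 0.02 * rng.standard_normal(600)
print(hrv_band_powers(rr))
```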

Relevance: 30.00%

Abstract:

For broadcasting purposes, mixed reality, the combination of real and virtual scene content, has become ubiquitous nowadays. Mixed reality recording still requires expensive studio setups and is often limited to simple color keying. We present a system for mixed reality applications which uses depth keying and provides three-dimensional mixing of real and artificial content. It features enhanced realism through automatic shadow computation, which we consider a core issue for obtaining realism and a convincing visual perception, besides the correct alignment of the two modalities and correct occlusion handling. Furthermore, we present a way to support the placement of virtual content in the scene. The core feature of our system is the incorporation of a time-of-flight (ToF) camera. This device delivers real-time depth images of the environment at a reasonable resolution and quality. The camera is used to build a static environment model, and it also allows correct handling of mutual occlusions between real and virtual content, shadow computation and enhanced content planning. The presented system is inexpensive, compact, mobile, flexible and provides convenient calibration procedures. Chroma keying is replaced by depth keying, which is efficiently performed on the graphics processing unit (GPU) using an environment model and the current ToF camera image. Automatic extraction and tracking of dynamic scene content is thereby performed, and this information is used for planning and alignment of virtual content. An additional useful feature is that depth maps of the mixed content are available in real time, which makes the approach suitable for future 3DTV productions. The paper gives an overview of the whole system, including camera calibration, environment model generation, real-time keying and mixing of virtual and real content, shadowing for virtual content and dynamic object tracking for content planning.
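As a minimal sketch of the depth-keying and occlusion-handling idea (my own CPU illustration; the system described above performs the per-pixel tests in a GPU shader against a full environment model):

```python
# Depth keying: a pixel belongs to the dynamic foreground if the live ToF
# depth is measurably closer than the static environment (background) model.
# Occlusion handling: when mixing, whichever layer is closer to the camera
# wins at each pixel.
import numpy as np

def depth_key(live_depth, env_depth, tolerance_m=0.10):
    """Boolean foreground mask; tolerance_m is an assumed ToF noise margin."""
    valid = np.isfinite(live_depth) & np.isfinite(env_depth)
    return valid & (live_depth < env_depth - tolerance_m)

def mix(real_rgb, real_depth, virt_rgb, virt_depth):
    """Composite real and virtual content with per-pixel occlusion."""
    virt_in_front = virt_depth < real_depth
    out = real_rgb.copy()
    out[virt_in_front] = virt_rgb[virt_in_front]
    return out

# Tiny synthetic example: a wall 3 m away and an actor at 1.5 m
env = np.full((4, 4), 3.0)
live = env.copy()
live[1:3, 1:3] = 1.5
print(depth_key(live, env).astype(int))
```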