941 results for Characteristic Initial Value Problem
Abstract:
A detailed investigation has been conducted on core samples taken from 17 portland cement concrete pavements located in Iowa. The goal of the investigation was to help clarify the root cause of the premature deterioration problem that has become evident since the early 1990s. Laboratory experiments were also conducted to evaluate how cement composition, mixing time, and admixtures could have influenced the occurrence of premature deterioration. The cements used in this study were selected in an attempt to cover the main compositional parameters pertinent to the construction industry in Iowa. The hardened air content determinations conducted during this study indicated that the pavements that exhibited premature deterioration often contained poor to marginal entrained-air void systems. In addition, petrographic studies indicated that sometimes the entrained-air void system had been marginal after mixing and placement of the pavement slab, while in other instances a marginal to adequate entrained-air void system had been filled with ettringite. The filling was most probably accelerated by shrinkage cracking at the surface of the concrete pavements. The results of this study suggest that the durability, more specifically the frost resistance, of the concrete pavements would be less than anticipated during the design stage of the pavements. Construction practices played a significant role in the premature deterioration problem. The pavements that exhibited premature distress also exhibited features that suggested poor mixing and poor control of aggregate grading. Segregation was very common in the cores extracted from the pavements that exhibited premature distress. This suggests that the vibrators on the paver were used to overcome a workability problem. Entrained-air voids formed in concrete mixtures experiencing these types of problems normally tend to be extremely coarse, and hence they can easily be lost during the paving process. This tends to leave the pavement with a low air content and a poor distribution of air voids. All of these features were consistent with a premature stiffening problem that drastically influenced the ability of the contractor to place the concrete mixture. Laboratory studies conducted during this project indicated that most premature stiffening problems can be directly attributed to the portland cement used on the project. The admixtures (class C fly ash and water reducer) tended to have only a minor influence on the premature stiffening problem when they were used at the dosage rates described in this study.
Abstract:
From a managerial point of view, the more efficient, simple, and parameter-free (ESP) an algorithm is, the more likely it will be used in practice for solving real-life problems. Following this principle, an ESP algorithm for solving the Permutation Flowshop Sequencing Problem (PFSP) is proposed in this article. Using an Iterated Local Search (ILS) framework, the so-called ILS-ESP algorithm is able to compete in performance with other well-known ILS-based approaches, which are considered among the most efficient algorithms for the PFSP. However, while other similar approaches still employ several parameters that can affect their performance if not properly chosen, our algorithm does not require any particular fine-tuning process since it uses basic "common sense" rules for the local search, perturbation, and acceptance criterion stages of the ILS metaheuristic. Our approach defines a new operator for the ILS perturbation process, a new acceptance criterion based on extremely simple and transparent rules, and a biased randomization process for the initial solution that randomly generates different alternative initial solutions of similar quality, which is attained by applying a biased randomization to a classical PFSP heuristic. This diversification of the initial solution aims at avoiding poorly designed starting points and, thus, allows the methodology to take advantage of current trends in parallel and distributed computing. A set of extensive tests, based on literature benchmarks, has been carried out in order to validate our algorithm and compare it against other approaches. These tests show that our parameter-free algorithm is able to compete with state-of-the-art metaheuristics for the PFSP. Also, the experiments show that, when using parallel computing, it is possible to improve the top ILS-based metaheuristic simply by incorporating into it our biased randomization process with a high-quality pseudo-random number generator.
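To make the ILS structure described above concrete, here is a minimal, self-contained sketch of a generic iterated local search for the PFSP in Python. It is not the authors' ILS-ESP implementation: the constructive bias parameter, the insertion local search, the two-job reinsertion perturbation, the acceptance rule, and all function names are illustrative assumptions.

```python
# Minimal iterated local search (ILS) sketch for the permutation flowshop
# sequencing problem (PFSP).  Illustrative only: names, the bias parameter,
# the perturbation and the acceptance rule are assumptions, not ILS-ESP.
import random


def makespan(sequence, proc_times):
    """Cmax of a permutation; proc_times[j][m] = time of job j on machine m."""
    n_machines = len(proc_times[0])
    completion = [0] * n_machines
    for job in sequence:
        completion[0] += proc_times[job][0]
        for m in range(1, n_machines):
            completion[m] = max(completion[m], completion[m - 1]) + proc_times[job][m]
    return completion[-1]


def biased_start(proc_times, beta=0.3):
    """Biased-randomized constructive start: jobs sorted by decreasing total
    processing time, then drawn with a geometric bias toward the head."""
    jobs = sorted(range(len(proc_times)), key=lambda j: -sum(proc_times[j]))
    sequence = []
    while jobs:
        k = 0
        while k < len(jobs) - 1 and random.random() > beta:
            k += 1
        sequence.append(jobs.pop(k))
    return sequence


def local_search(sequence, proc_times):
    """First-improvement job-insertion local search."""
    best = list(sequence)
    best_cost = makespan(best, proc_times)
    improved = True
    while improved:
        improved = False
        for i in range(len(best)):
            job = best[i]
            rest = best[:i] + best[i + 1:]
            for pos in range(len(rest) + 1):
                cand = rest[:pos] + [job] + rest[pos:]
                cost = makespan(cand, proc_times)
                if cost < best_cost:
                    best, best_cost, improved = cand, cost, True
                    break
            if improved:
                break
    return best, best_cost


def perturb(sequence):
    """Remove two random jobs and reinsert them at random positions."""
    seq = list(sequence)
    for _ in range(2):
        job = seq.pop(random.randrange(len(seq)))
        seq.insert(random.randrange(len(seq) + 1), job)
    return seq


def ils_pfsp(proc_times, iterations=200, seed=1):
    """Plain ILS loop: construct, improve, then perturb / improve / accept."""
    random.seed(seed)
    current, current_cost = local_search(biased_start(proc_times), proc_times)
    best, best_cost = current, current_cost
    for _ in range(iterations):
        candidate, cand_cost = local_search(perturb(current), proc_times)
        if cand_cost <= current_cost:  # accept ties to keep moving
            current, current_cost = candidate, cand_cost
        if cand_cost < best_cost:
            best, best_cost = candidate, cand_cost
    return best, best_cost


if __name__ == "__main__":
    random.seed(0)
    # Tiny random instance: 6 jobs x 3 machines.
    times = [[random.randint(1, 20) for _ in range(3)] for _ in range(6)]
    seq, cmax = ils_pfsp(times)
    print("best sequence:", seq, "makespan:", cmax)
```

The geometric pick in biased_start stands in, very loosely, for the biased randomization of a classical PFSP heuristic mentioned in the abstract.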
Abstract:
Purpose: To describe the evolution of retinal thickness in eyes affected with acute anterior uveitis (AAU) in the course of follow-up and to assess its correlation with the severity of inflammatory activity in the anterior chamber. Methods: Design: prospective cohort study. Setting: institutional study. Patient population: 72 eyes (affected and fellow eyes) of 36 patients. Observation procedure: patients were followed daily until the beginning of resolution of inflammatory activity and weekly thereafter. Optical coherence tomography and laser flare photometry were performed at each visit. Treatment consisted of topical corticosteroids. Main outcome measures: retinal thickness of affected eyes, difference in retinal thickness between affected and fellow eyes and their evolution in time, and the association between maximal retinal thickness and initial laser flare photometry. Results: The difference in retinal thickness between affected and fellow eyes became significant on average seven days from baseline and remained so throughout follow-up (p<0.001). There was a steep increase in retinal thickness of affected eyes followed by a progressive decrease after reaching a peak value. The maximal difference in retinal thickness between affected and fellow eyes was observed between 17 and 25 days from baseline and exhibited a strong, positive correlation with initial laser flare photometry values (p=0.015). Conclusions: Retinal thickness in eyes affected with AAU presents a steep increase over 3 to 4 weeks and then gradually decreases. Severity of inflammation at baseline predicts the amount of retinal thickening in affected eyes. A characteristic pattern of temporal response of retinal anatomy to inflammatory stimuli seems to emerge.
Abstract:
The forensic two-trace problem is a perplexing inference problem introduced by Evett (J Forensic Sci Soc 27:375-381, 1987). Different possible ways of wording the competing pair of propositions (i.e., one proposition advanced by the prosecution and one proposition advanced by the defence) led to different quantifications of the value of the evidence (Meester and Sjerps in Biometrics 59:727-732, 2003). Here, we re-examine this scenario with the aim of clarifying the interrelationships that exist between the different solutions and, in this way, produce a global view of the problem. We propose to investigate the different expressions for evaluating the value of the evidence by using a graphical approach, i.e. Bayesian networks, to model the rationale behind each of the proposed solutions and the assumptions made about the unknown parameters in this problem.
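For readers less familiar with this framework, the "value of the evidence" referred to above is conventionally quantified as a likelihood ratio comparing the probability of the observations E under the prosecution and defence propositions (generic notation, with I denoting the background information; this is the standard definition, not a formula specific to the paper):

```latex
V = \frac{\Pr(E \mid H_p, I)}{\Pr(E \mid H_d, I)}
```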
Abstract:
The subject "Value and prices in Russian economic thought (1890-1920)" should evoke several names and debates in the reader's mind. For a long time, Western scholars have been aware that the Russian economists Tugan-Baranovsky and Bortkiewicz were active participants in the debates on the Marxian transformation problem, that the mathematical models of Dmitriev prefigured later neo-Ricardian models, and that many Russian economists were either supporters of the Marxian labour theory of value or revisionists. Moreover, these ideas prepared the ground for Soviet planning. Russian scholars additionally knew that this period was the time when marginalism was introduced in Russia, and that, during this period, economists were actively thinking about the relation between ethics and economic theory. All these issues are well covered in the existing literature. But there is a big gap that this dissertation intends to fill. The existing literature handles these pieces separately, although they are part of a single, more general, history. All these issues (the labour theory of value, marginalism, the Marxian transformation problem, planning, ethics, mathematical economics) were part of what this dissertation calls "the Russian synthesis". The Russian synthesis (in the singular) designates here all the attempts at synthesis between classical political economy and marginalism, between the labour theory of value and marginal utility, and between value and prices that occurred in Russian economic thought between 1890 and 1920, and that embrace the whole set of issues evoked above. This dissertation has the ambition of being the first comprehensive history of that Russian synthesis. In this respect, the contribution is unique. It has always surprised the author of the present dissertation that such a book has not yet been written. Several good reasons, both in terms of the scarce availability of sources and of ideological restrictions, may account for a delay of several decades. But it is now urgent to remedy the situation before the protagonists of the Russian synthesis are definitively classified under the wrong labels in the pantheon of economic thought. To accomplish this task, it has seldom been sufficient to gather together the various existing studies on aspects of this story. It has been necessary to return to the primary sources in the Russian language. The most important part of the primary literature has never been translated, and in recent years only some of it has been republished in Russian. Therefore, most translations from the Russian have been made by the author of the present dissertation. The secondary literature has been surveyed in the languages that are familiar (Russian, English and French) or almost familiar (German) to the present author, and which are hopefully the most pertinent to the present investigation. Besides, and in order to deepen acquaintance with the texts, which was the objective of all this, some archival sources were used. The analysis consists of careful chronological studies of the authors' writings and their evolution in their historical and intellectual context. As a consequence, the dissertation brings new authors to the foreground - Shaposhnikov and Yurovsky - who were traditionally confined to the substitutes' bench because they only superficially touched the domains quoted above. In the Russian synthesis, however, they played an important part in the story.
As a side effect, some authors who used to play in the foreground - Dmitriev and Bortkiewicz - are relegated to the background, but are not forgotten. Besides, the dissertation refreshes the views on authors already known, such as Ziber and, especially, Tugan-Baranovsky. The ultimate objective of this dissertation is to change the opinion that one could have of "value and prices in Russian economic thought", by setting the Russian synthesis at the centre of the debates.
Abstract:
This paper analyses and discusses arguments that emerge from a recent discussion about the proper assessment of the evidential value of correspondences observed between the characteristics of a crime stain and those of a sample from a suspect when (i) this latter individual is found as a result of a database search and (ii) remaining database members are excluded as potential sources (because of different analytical characteristics). Using a graphical probability approach (i.e., Bayesian networks), the paper here intends to clarify that there is no need to (i) introduce a correction factor equal to the size of the searched database (i.e., to reduce a likelihood ratio), nor to (ii) adopt a propositional level not directly related to the suspect matching the crime stain (i.e., a proposition of the kind 'some person in (outside) the database is the source of the crime stain' rather than 'the suspect (some other person) is the source of the crime stain'). The present research thus confirms existing literature on the topic that has repeatedly demonstrated that the latter two requirements (i) and (ii) should not be a cause of concern.
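To make the contested adjustment explicit in symbols: if the unadjusted value of a reported correspondence is written as the reciprocal of the random match probability γ, the correction debated above amounts to dividing that likelihood ratio by the database size n. This is a sketch in generic notation, intended only to illustrate the two positions described in the abstract, which argues that the reduction is unnecessary:

```latex
V_{\text{unadjusted}} = \frac{1}{\gamma},
\qquad
V_{\text{reduced}} = \frac{1}{n\,\gamma}
```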
Abstract:
The speed and width of front solutions to reaction-dispersal models are analyzed both analytically and numerically. We perform our analysis for Laplace and Gaussian distribution kernels, for both delayed and nondelayed models. The results are discussed in terms of the characteristic parameters of the models.
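For reference, the two dispersal kernels named above are commonly written in one dimension as follows (standard parameterizations, not necessarily the normalization used in the paper; α is an inverse characteristic length and σ a standard deviation):

```latex
\phi_{\mathrm{Laplace}}(x) = \frac{\alpha}{2}\, e^{-\alpha |x|},
\qquad
\phi_{\mathrm{Gauss}}(x) = \frac{1}{\sqrt{2\pi\sigma^{2}}}\, e^{-x^{2}/(2\sigma^{2})}
```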
Abstract:
A δ34S value of +6.3 +/- 1.5‰ was estimated for the rhyodacitic degassing magma present underneath the hydrothermal system of Nisyros, based on the S isotope ratios of H2S in fumarolic vapors. This value was estimated by modeling the irreversible water-rock mass transfers occurring during the generation of the hydrothermal liquid which separates these fumarolic vapors. The S isotope ratio of the rhyodacitic degassing magma of Nisyros is consistent with fractional crystallization of a parent basaltic magma with an initial δ34S value of +4‰ (+/- at least 1.5‰). This positive value could be explained by mantle contamination due either to transference of fluids derived from subducted materials or to involvement of altered oceanic crust, whereas the contribution of biogenic sulfides from sediments seems to be negligible or nil. This conclusion agrees with the lack of N2 and CO2 from thermal decomposition of organic matter contained in subducted sediments, which is a characteristic of the whole Aegean arc system. Since hydrothermal S at Milos and Santorini has isotope ratios similar to those determined at Nisyros, it seems likely that common controlling processes are active throughout the Aegean island arc. (C) 2002 Elsevier Science B.V. All rights reserved.
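For readers unfamiliar with the notation, δ34S expresses the 34S/32S ratio of a sample relative to a reference standard (conventionally V-CDT) and is reported in per mil; the standard definition is:

```latex
\delta^{34}\mathrm{S} = \left( \frac{(^{34}\mathrm{S}/^{32}\mathrm{S})_{\mathrm{sample}}}{(^{34}\mathrm{S}/^{32}\mathrm{S})_{\mathrm{standard}}} - 1 \right) \times 1000
```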
Abstract:
The HOT study (Hypertension Optimal Treatment) is an international clinical study on primary prevention of cardiovascular events in 19,193 hypertensive patients worldwide. It aims to identify the optimal diastolic blood pressure target (< 90, < 85 or < 80 mmHg?) in order to maximize the possible benefit of antihypertensive therapy. In addition, the HOT study investigates whether low doses of aspirin (75 mg/day) are able to reduce the occurrence of severe cardiovascular events. In Switzerland a total of 797 patients have been enrolled in the study. Antihypertensive therapy was initiated with felodipine (Plendil, 5 mg/day). This vasoselective calcium antagonist was able to reduce diastolic blood pressure to < 90 or < 80 mmHg in one out of two or one out of three patients, respectively, within the first three months. In nine or six out of ten patients, respectively, a reduction of diastolic blood pressure to < 90 or < 80 mmHg was reached within one year by combining felodipine with other antihypertensive drugs (ACE inhibitors, beta blockers and diuretics).
Abstract:
AIM: MRI and PET with 18F-fluoro-ethyl-tyrosine (FET) have been increasingly used to evaluate patients with gliomas. Our purpose was to assess the additive value of MR spectroscopy (MRS), diffusion imaging and dynamic FET-PET for glioma grading. PATIENTS, METHODS: 38 patients (aged 42 ± 15 years; F/M ratio 0.46) with untreated, histologically proven brain gliomas were included. All underwent conventional MRI, MRS, diffusion sequences, and FET-PET within 3-4 weeks. The performance of the tumour FET time-activity curve, early-to-middle SUVmax ratio, choline/creatine ratio and ADC histogram distribution pattern for glioma grading was assessed against histology. Combinations of these parameters and their respective odds were also evaluated. RESULTS: The tumour time-activity curve reached the best accuracy (67%) when taken alone to distinguish between low- and high-grade gliomas, followed by ADC histogram analysis (65%). Combining the time-activity curve with ADC histogram analysis improved the sensitivity from 67% to 86% and the specificity from 63-67% to 100% (p < 0.008). On multivariate logistic regression analysis, a negative slope of the tumour FET time-activity curve nevertheless remained the best predictor of high-grade glioma (odds 7.6, SE 6.8, p = 0.022). CONCLUSION: The combination of dynamic FET-PET and diffusion MRI achieved good performance for glioma grading. The use of FET-PET/MR may be highly relevant in the initial assessment of primary brain tumours.
Abstract:
We develop a singular perturbation approach to the problem of the calculation of a characteristic time (the nonlinear relaxation time) for non-Markovian processes driven by Gaussian colored noise with small correlation time. Transient and initial preparation effects are discussed and explicit results for prototype situations are obtained. New effects on the relaxation of unstable states are predicted. The approach is compared with previous techniques.
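For context, the nonlinear relaxation time of an observable x(t) is commonly defined as the time integral of its normalized mean decay toward the stationary value; a standard form is given below (the notation is generic and not necessarily that of the paper):

```latex
T = \int_{0}^{\infty} \frac{\langle x(t)\rangle - \langle x\rangle_{\mathrm{st}}}{\langle x(0)\rangle - \langle x\rangle_{\mathrm{st}}}\, \mathrm{d}t
```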
Abstract:
The general theory of nonlinear relaxation times is developed for the case of Gaussian colored noise. General expressions are obtained and applied to the study of the characteristic decay time of unstable states in different situations, including white and colored noise, with emphasis on the distributed initial conditions. Universal effects of the coupling between colored noise and random initial conditions are predicted.
Abstract:
PURPOSE: The current study tested the applicability of Jessor's problem behavior theory (PBT) in national probability samples from Georgia and Switzerland. Comparisons focused on (1) the applicability of the problem behavior syndrome (PBS) in both developmental contexts, and (2) the applicability of employing a set of theory-driven risk and protective factors in the prediction of problem behaviors. METHODS: School-based questionnaire data were collected from n = 18,239 adolescents in Georgia (n = 9499) and Switzerland (n = 8740) following the same protocol. Participants rated five measures of problem behaviors (alcohol and drug use, problems because of alcohol and drug use, and deviance), three risk factors (future uncertainty, depression, and stress), and three protective factors (family, peer, and school attachment). Final study samples included n = 9043 Georgian youth (mean age = 15.57; 58.8% females) and n = 8348 Swiss youth (mean age = 17.95; 48.5% females). Data analyses were completed using structural equation modeling, path analyses, and post hoc z-tests for comparisons of regression coefficients. RESULTS: Findings indicated that the PBS replicated in both samples, and that theory-driven risk and protective factors accounted for 13% and 10% of the variance in the PBS in the Georgian and Swiss samples, respectively, net of the effects of demographic variables. Follow-up z-tests provided evidence of some differences in the magnitude, but not the direction, of five of six individual paths by country. CONCLUSION: PBT and the PBS find empirical support in these Eurasian and Western European samples; thus, Jessor's theory holds value and promise for understanding the etiology of adolescent problem behaviors outside of the United States.
Abstract:
A new experimental system to measure the equivalent thermal conductivity of a liquid with regard to the Bénard-Rayleigh problem was constructed. The liquid is enclosed within walls of polymethylmethacrylate between two copper plates in which there are thermocouples to measure the difference in temperature between the lower and upper surfaces of the layer of liquid. Heat flux is measured by means of a linear heat fluxmeter consisting of 204 thermocouples in series. The fluxmeter was calibrated and the linear relationship that exists between the heat flux and the emf generated was verified. The thermal conductivity of the polymethylmethacrylate employed was measured and measurements of the equivalent conductivity in cylindrical boundaries of two silicone oils were made. The critical value of the temperature difference and the contribution of the convective process to the transmission of heat were determined.
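For context, in a Rayleigh-Bénard configuration the "equivalent thermal conductivity" is usually reported relative to the molecular conductivity, i.e. as a Nusselt number, and convection sets in above a critical Rayleigh number. The textbook forms below use symbols defined here (layer depth d, temperature difference ΔT, expansion coefficient β, kinematic viscosity ν, thermal diffusivity κ) rather than the paper's notation, and the critical value quoted is for an infinite layer between rigid plates rather than the cylindrical geometry used in the experiment:

```latex
\mathrm{Nu} = \frac{k_{\mathrm{eq}}}{k},
\qquad
\mathrm{Ra} = \frac{g\,\beta\,\Delta T\, d^{3}}{\nu\,\kappa},
\qquad
\mathrm{Ra}_{c} \approx 1708
```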
Abstract:
A controlled perturbation is introduced into the Saffman-Taylor flow problem by adding a gradient to the gap of a Hele-Shaw cell. The stability of the single-finger steady state was found to be strongly affected by such a perturbation. Compared with patterns in a standard Hele-Shaw cell, the single Saffman-Taylor finger was stabilized or destabilized according to the sign of the gap gradient. While a linear stability analysis shows that this perturbation should have a negligible effect on the early-stage pattern formation, the experimental data indicate that the characteristic length for the initial breakup of a flat interface has been changed by the perturbation.
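For reference, the linear stability analysis mentioned above is classically based on the dispersion relation for a flat interface in a uniform-gap Hele-Shaw cell when an effectively inviscid fluid displaces a viscous one at velocity U (gap b, viscosity μ, surface tension γ); the gap-gradient generalization studied in the paper is not reproduced here:

```latex
\sigma(k) = U\,k - \frac{\gamma\, b^{2}}{12\,\mu}\,k^{3},
\qquad
\lambda_{\max} = \frac{2\pi}{k_{\max}} = \pi\, b \sqrt{\frac{\gamma}{\mu\, U}}
```

The fastest-growing wavelength λ_max sets the characteristic length for the initial breakup of a flat interface, which is the quantity the experiment probes.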