Abstract:
Countries in Latin America were among the first to implement routine vaccination against species A rotavirus (RVA). We evaluate data from Latin America on reductions in gastroenteritis and RVA disease burden following the introduction of RVA vaccine. Published literature was reviewed to identify case-control studies of vaccine effectiveness and population-based studies examining longitudinal trends of diarrhoeal disease reduction after RVA vaccine introduction in Latin American countries. RVA vaccine effectiveness and the impact on gastroenteritis mortality and hospitalization rates and on RVA hospitalization rates are described. Among middle-income Latin American countries with published data (Mexico, Brazil, El Salvador and Panama), RVA vaccine contributed to a gastroenteritis-associated mortality reduction of 22-41%, a gastroenteritis-associated hospitalization reduction of 17-51% and an RVA hospitalization reduction of 59-81% among children younger than five years of age. In Brazil and El Salvador, case-control studies demonstrated that a full RVA vaccination schedule was 76-85% effective against RVA hospitalization; a lower effectiveness of 46% was seen in Nicaragua, the only low-income country with available data. A growing body of literature offers convincing evidence of "real world" vaccine program successes in Latin American settings, which may be expanded as more countries in the region include RVA vaccine in their immunization programs.
Abstract:
Fine particulate matter from traffic increases mortality and morbidity. An important source of traffic particles is brake wear. American studies reported cars to emit brake wear particles at a rate of about 11 mg/km to 20 mg/km of driven distance. A German study estimated that brake wear contributes about 12.5% to 21% of total traffic particle emissions. The goal of this study was to build a system that allows the study of brake wear particle emissions during different braking behaviours of different car and brake types. The particles should be characterized in terms of size, number, metal content, and elemental and organic carbon composition. In addition, the influence of different deceleration schemes on the particle composition and size distribution should be studied. Finally, this system should allow exposing human cell cultures to these particles. An exposure box (0.25 m3 volume) was built that can be mounted around a car's braking system. This allows exposing cells to fresh brake wear particles. Concentrations of particle numbers, mass and surface, metals, and carbon compounds were quantified. Tests were conducted with A549 lung epithelial cells. Five different cars and two typical braking behaviours (full stop and normal deceleration) were tested. Particle number and size distribution were analysed for the first six minutes. In this time, two braking events occurred. Full stop produced significantly higher particle concentrations than normal deceleration (average of 23,000 vs. 10,400 particles/cm3, p = 0.016). The particle number distribution was bimodal, with one peak at 60 to 100 nm (depending on the tested car and braking behaviour) and a second peak at 200 to 400 nm. Metal concentrations varied depending on the tested car type. Iron (range of 163 to 15,600 μg/m3) and manganese (range of 0.9 to 135 μg/m3) were present in all samples, while copper was absent in some samples (<6 to 1220 μg/m3). The overall "fleet" metal ratio was Fe:Cu:Mn = 128:14:1. Temperature and humidity varied little. A549 cells were successfully exposed in the various experimental settings and retained their viability. Culture supernatant was stored and cell culture samples were fixed to test for inflammatory response. Analysis of these samples is ongoing. The established system allowed testing brake wear particle emissions from real-world cars. The large variability in chemical composition and emitted amounts of brake wear particles between car models appears to be related to differences in brake pad composition between producers. Initial results suggest that the conditions inside the exposure box allow exposing human lung epithelial cells to freshly produced brake wear particles.
Abstract:
An object's motion relative to an observer can confer ethologically meaningful information. Approaching or looming stimuli can signal threats/collisions to be avoided or prey to be confronted, whereas receding stimuli can signal successful escape or failed pursuit. Using movement detection and subjective ratings, we investigated the multisensory integration of looming and receding auditory and visual information by humans. While prior research has demonstrated a perceptual bias for unisensory and more recently multisensory looming stimuli, none has investigated whether there is integration of looming signals between modalities. Our findings reveal selective integration of multisensory looming stimuli. Performance was significantly enhanced for looming stimuli over all other multisensory conditions. Contrasts with static multisensory conditions indicate that only multisensory looming stimuli resulted in facilitation beyond that induced by the sheer presence of auditory-visual stimuli. Controlling for variation in physical energy replicated the advantage for multisensory looming stimuli. Finally, only looming stimuli exhibited a negative linear relationship between enhancement indices for detection speed and for subjective ratings. Maximal detection speed was attained when motion perception was already robust under unisensory conditions. The preferential integration of multisensory looming stimuli highlights that complex ethologically salient stimuli likely require synergistic cooperation between existing principles of multisensory integration. A new conceptualization of the neurophysiologic mechanisms mediating real-world multisensory perceptions and action is therefore supported.
Abstract:
The debate on the merits of observational studies as compared with randomized trials is ongoing. We briefly touch on this subject and demonstrate the role of cohort studies in describing infectious disease patterns after transplantation. The potential benefits of cohort studies for the clinical management of patients, beyond the expected gain in epidemiological knowledge, are reviewed. The newly established Swiss Transplantation Cohort Study, and in particular the part focusing on infectious diseases, serves as an illustration. A neglected area of research is the indirect value of large, multicenter cohort studies. These benefits can range from deepened collaboration to the development of common definitions and guidelines. Unfortunately, very few data exist on the role of such indirect effects in improving the quality of patient management. This review postulates an important role for cohort studies, which should be viewed not as inferior but as complementary to established research tools, in particular randomized trials. Randomized trials remain the least bias-prone method for establishing knowledge regarding the significance of diagnostic or therapeutic measures. Cohort studies have the power to reflect a real-world situation and to pinpoint areas of knowledge as well as of uncertainty. A prerequisite is a prospective design with an inclusive set of data, coupled with meticulous insistence on data retrieval and quality.
Abstract:
Politics must tackle multiple issues at once. In a first-best world, political competition constrains parties to prioritize issues according to the voters' true concerns. In the real world, the opposite also happens: parties manipulate voter priorities by emphasizing issues selectively during the political campaign. This phenomenon, known as priming, should allow parties to pay less attention to the issues that they intend to mute. We develop a model of endogenous issue ownership in which two vote-seeking parties (i) invest to attract voters with "better" policy proposals and (ii) choose a communication campaign to focus voter attention on specific issues. We identify novel feedbacks between communication and investment. In particular, we find that stronger priming effects can backfire by constraining parties to invest more resources in all issues, including the ones they would otherwise intend to mute. We also identify under which conditions parties prefer to focus on their "historical issues" or to engage in issue stealing. Typically, the latter happens when priming effects are strong and historical reputations differentiate parties less.
Abstract:
Tugan-Baranovsky's ideas on socialism are reconstructed with an emphasis on the relation between political economy and utopia. Utopia enters the stage after the critique of capitalism, in the definition of the realm of possibilities in the world of ideas. With the help of ethics, the notion of ideal socialism, unreachable by definition, is defined in the sphere of utopia. Thus, the task of political economy is first to show which of these possible worlds are reachable in the real world, and second to choose the one that conforms better to ideal socialism: this is socialism in practice through the economic plan. Thus, far from considering utopia and science as contradictory, Tugan-Baranovsky saw them as complementary, and his socialism is the result of the dialogue he instituted between them.
Abstract:
Background: Microarray data is frequently used to characterize the expression profile of a whole genome and to compare the characteristics of that genome under several conditions. Geneset analysis methods have been described previously to analyze the expression values of several genes related by known biological criteria (metabolic pathway, pathology signature, co-regulation by a common factor, etc.) at the same time; analyzing such related genes together helps discover the underlying biological mechanisms. Results: As several methods assume different null hypotheses, we propose to reformulate the main question that biologists seek to answer. To determine which genesets are associated with expression values that differ between two experiments, we focused on three ad hoc criteria: expression levels, the direction of individual gene expression changes (up or down regulation), and correlations between genes. We introduce the FAERI methodology, tailored from a two-way ANOVA, to examine these criteria. The significance of the results was evaluated according to the self-contained null hypothesis, using label sampling or by inferring the null distribution from normally distributed random data. Evaluations performed on simulated data revealed that FAERI outperforms currently available methods for each type of set tested. We then applied the FAERI method to analyze three real-world datasets on hypoxia response. FAERI was able to detect more genesets than other methodologies, and the genesets selected were coherent with current knowledge of the cellular response to hypoxia. Moreover, the genesets selected by FAERI were confirmed when the analysis was repeated on two additional related datasets. Conclusions: The expression values of genesets are associated with several biological effects. The underlying mathematical structure of the genesets allows for analysis of data from several genes at the same time. Focusing on expression levels, the direction of the expression changes, and correlations, we showed that two-step data reduction allowed us to significantly improve the performance of geneset analysis using a modified two-way ANOVA procedure, and to detect genesets that current methods fail to detect.
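As a rough illustration of the self-contained null hypothesis tested by label sampling that this abstract refers to, the sketch below permutes sample labels to assess a single geneset. The score function, data shapes and all names are illustrative assumptions; it is not the actual FAERI two-way ANOVA procedure.

```python
# Hedged sketch: a self-contained, label-sampling significance test for one
# geneset. NOT the FAERI implementation; the score and all names are assumptions.
import numpy as np

def geneset_score(expr, labels):
    """How strongly the genes in the set differ between the two conditions."""
    a, b = expr[:, labels == 0], expr[:, labels == 1]
    diff = a.mean(axis=1) - b.mean(axis=1)          # per-gene change
    return np.sum(diff ** 2)                        # aggregate over the set

def label_sampling_pvalue(expr, labels, n_perm=1000, seed=0):
    """Self-contained null: permute sample labels and recompute the score."""
    rng = np.random.default_rng(seed)
    observed = geneset_score(expr, labels)
    null = np.array([geneset_score(expr, rng.permutation(labels))
                     for _ in range(n_perm)])
    return (np.sum(null >= observed) + 1) / (n_perm + 1)

# Toy example: 10 genes in the set, 6 + 6 samples, condition 1 shifted upward.
rng = np.random.default_rng(42)
expr = rng.normal(size=(10, 12))
expr[:, 6:] += 1.0                                  # simulated up-regulation
labels = np.array([0] * 6 + [1] * 6)
print(label_sampling_pvalue(expr, labels))
```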
Abstract:
BACKGROUND/AIMS: Switzerland's drug policy model has always been unique and progressive, but there is a need to reassess this system in a rapidly changing world. The IMPROVE study was conducted to gain understanding of the attitudes and beliefs towards opioid maintenance therapy (OMT) in Switzerland with regards to quality and access to treatment. To obtain a "real-world" view on OMT, the study approached its goals from two different angles: from the perspectives of the OMT patients and of the physicians who treat patients with maintenance therapy. The IMPROVE study collected a large body of data on OMT in Switzerland. This paper presents a small subset of the dataset, focusing on the research design and methodology, the profile of the participants and the responses to several key questions addressed by the questionnaires. METHODS: IMPROVE was an observational, questionnaire-based cross-sectional study on OMT conducted in Switzerland. Respondents consisted of OMT patients and treating physicians from various regions of the country. Data were collected using questionnaires in German and French. Physicians were interviewed by phone with a computer-based questionnaire. Patients self-completed a paper-based questionnaire at the physicians' offices or OMT treatment centres. RESULTS: A total of 200 physicians and 207 patients participated in the study. Liquid methadone and methadone tablets or capsules were the medications most commonly prescribed by physicians (60% and 20% of patient load, respectively) whereas buprenorphine use was less frequent. Patients (88%) and physicians (83%) were generally satisfied with the OMT currently offered. The current political framework and lack of training or information were cited as determining factors that deter physicians from engaging in OMT. About 31% of OMT physicians interviewed were ≥60 years old, indicating an ageing population. Diversion and misuse were considered a significant problem in Switzerland by 45% of the physicians. CONCLUSION: The subset of IMPROVE data presented gives a present-day, real-life overview of the OMT landscape in Switzerland. It represents a valuable resource for policy makers, key opinion leaders and drug addiction researchers and will be a useful basis for improving the current Swiss OMT model.
Abstract:
The main objective of this project has been to gain deeper knowledge of software construction, covering all the stages of a software construction project from the perspective of software engineering (analysis, design, implementation and testing) and using the object-oriented programming paradigm through J2EE technology, together with software frameworks of great importance in the real world and, therefore, in today's software development and technology landscape.
Abstract:
Tone mapping is the problem of compressing the range of a high-dynamic-range image so that it can be displayed on a low-dynamic-range screen without losing details or introducing new ones: the final image should produce in the observer a sensation as close as possible to the perception produced by the real-world scene. We propose a tone mapping operator with two stages. The first stage is a global method that implements visual adaptation, based on experiments on human perception; in particular, we point out the importance of cone saturation. The second stage performs local contrast enhancement, based on a variational model inspired by color vision phenomenology. We evaluate this method with a metric validated by psychophysical experiments and, in terms of this metric, our method compares very well with the state of the art.
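The two-stage structure described in this abstract (global visual adaptation followed by local contrast enhancement) can be illustrated with a minimal sketch. The Naka-Rushton-style global curve and the Gaussian unsharp-masking local stage below are stand-ins chosen for brevity, not the operator proposed by the authors.

```python
# Hedged sketch of a two-stage tone mapping pipeline: global adaptation, then
# local contrast enhancement. Illustrative only; not the authors' operator.
import numpy as np
from scipy.ndimage import gaussian_filter

def tone_map(hdr, key=0.5, local_gain=0.3, sigma=5.0):
    """Map an HDR luminance image (positive floats) into [0, 1]."""
    # Stage 1: global visual adaptation (semi-saturation at the geometric mean,
    # a crude analogue of cone response saturation).
    l_mean = np.exp(np.mean(np.log(hdr + 1e-6)))
    adapted = hdr / (hdr + key * l_mean)

    # Stage 2: local contrast enhancement (boost detail relative to a blurred
    # local average, standing in for the variational contrast stage).
    base = gaussian_filter(adapted, sigma)
    out = base + (1.0 + local_gain) * (adapted - base)
    return np.clip(out, 0.0, 1.0)

# Toy example: synthetic HDR luminance spanning roughly four orders of magnitude.
hdr = np.exp(np.random.default_rng(0).uniform(np.log(1e-2), np.log(1e2), (64, 64)))
ldr = tone_map(hdr)
print(ldr.min(), ldr.max())
```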
Abstract:
Intuitively, music has both predictable and unpredictable components. In this work we assess this qualitative statement in a quantitative way using common time series models fitted to state-of-the-art music descriptors. These descriptors cover different musical facets and are extracted from a large collection of real audio recordings comprising a variety of musical genres. Our findings show that music descriptor time series exhibit a certain predictability not only for short time intervals, but also for mid-term and relatively long intervals. This fact is observed independently of the descriptor, musical facet and time series model we consider. Moreover, we show that our findings are not only of theoretical relevance but can also have practical impact. To this end we demonstrate that music predictability at relatively long time intervals can be exploited in a real-world application, namely the automatic identification of cover songs (i.e. different renditions or versions of the same musical piece). Importantly, this prediction strategy yields a parameter-free approach for cover song identification that is substantially faster, allows for reduced computational storage and still maintains highly competitive accuracies when compared to state-of-the-art systems.
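A minimal sketch of the kind of predictability measurement this abstract describes: fit a simple autoregressive model to a (here synthetic) descriptor time series and compare its prediction error with a naive baseline at several horizons. The descriptor, AR order and horizons are assumptions, not the paper's actual setup.

```python
# Hedged sketch: AR-model predictability of a music-descriptor time series.
import numpy as np

def fit_ar(x, order=4):
    """Least-squares AR(order) coefficients for a 1-D series."""
    rows = [x[i:i + order] for i in range(len(x) - order)]
    X, y = np.array(rows), x[order:]
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coef

def forecast(x, coef, steps):
    """Iterate the AR model `steps` samples ahead from the end of x."""
    hist = list(x[-len(coef):])
    for _ in range(steps):
        hist.append(float(np.dot(coef, hist[-len(coef):])))
    return hist[-1]

rng = np.random.default_rng(1)
# Synthetic descriptor: slow oscillation (structure) plus noise.
t = np.arange(2000)
series = np.sin(2 * np.pi * t / 50) + 0.3 * rng.normal(size=t.size)

coef = fit_ar(series[:1500])
for horizon in (1, 10, 50):
    preds = [forecast(series[:i], coef, horizon) for i in range(1500, 1950)]
    truth = series[1500 + horizon - 1:1950 + horizon - 1]
    err = np.mean((np.array(preds) - truth) ** 2)
    base = np.mean((series[1499:1949] - truth) ** 2)  # "repeat last value" baseline
    print(f"horizon {horizon}: AR mse {err:.3f} vs naive mse {base:.3f}")
```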
Abstract:
For a wide range of environmental, hydrological, and engineering applications there is a fast growing need for high-resolution imaging. In this context, waveform tomographic imaging of crosshole georadar data is a powerful method able to provide images of pertinent electrical properties in near-surface environments with unprecedented spatial resolution. In contrast, conventional ray-based tomographic methods, which consider only a very limited part of the recorded signal (first-arrival traveltimes and maximum first-cycle amplitudes), suffer from inherent limitations in resolution and may prove to be inadequate in complex environments. For a typical crosshole georadar survey the potential improvement in resolution when using waveform-based approaches instead of ray-based approaches is in the range of one order of magnitude. Moreover, the spatial resolution of waveform-based inversions is comparable to that of common logging methods. While in exploration seismology waveform tomographic imaging has become well established over the past two decades, it is still comparably underdeveloped in the georadar domain despite corresponding needs. Recently, different groups have presented finite-difference time-domain waveform inversion schemes for crosshole georadar data, which are adaptations and extensions of Tarantola's seminal nonlinear generalized least-squares approach developed for the seismic case. First applications of these new crosshole georadar waveform inversion schemes on synthetic and field data have shown promising results. However, little is known about the limits and performance of such schemes in complex environments. To this end, the general motivation of my thesis is the evaluation of the robustness and limitations of waveform inversion algorithms for crosshole georadar data in order to apply such schemes to a wide range of real-world problems. One crucial issue in making any waveform scheme applicable and effective for real-world crosshole georadar problems is the accurate estimation of the source wavelet, which is unknown in reality. Waveform inversion schemes for crosshole georadar data require forward simulations of the wavefield in order to iteratively solve the inverse problem. Therefore, accurate knowledge of the source wavelet is critically important for successful application of such schemes. Relatively small differences in the estimated source wavelet shape can lead to large differences in the resulting tomograms. In the first part of my thesis, I explore the viability and robustness of a relatively simple iterative deconvolution technique that incorporates the estimation of the source wavelet into the waveform inversion procedure rather than adding additional model parameters to the inversion problem. Extensive tests indicate that this source wavelet estimation technique is simple yet effective, and is able to provide remarkably accurate and robust estimates of the source wavelet in the presence of strong heterogeneity in both the dielectric permittivity and electrical conductivity as well as significant ambient noise in the recorded data. Furthermore, our tests also indicate that the approach is insensitive to the phase characteristics of the starting wavelet, which is not the case when directly incorporating the wavelet estimation into the inverse problem. Another critical issue with crosshole georadar waveform inversion schemes which clearly needs to be investigated is the consequence of the common assumption of frequency-independent electromagnetic constitutive parameters. This is crucial since, in reality, these parameters are known to be frequency-dependent and complex, and thus recorded georadar data may show significant dispersive behaviour. In particular, in the presence of water, there is a wide body of evidence showing that the dielectric permittivity can be significantly frequency dependent over the GPR frequency range, due to a variety of relaxation processes. The second part of my thesis is therefore dedicated to the evaluation of the reconstruction limits of a non-dispersive crosshole georadar waveform inversion scheme in the presence of varying degrees of dielectric dispersion. I show that the inversion algorithm, combined with the iterative deconvolution-based source wavelet estimation procedure that is partially able to account for frequency-dependent effects through an "effective" wavelet, performs remarkably well in weakly to moderately dispersive environments and has the ability to provide adequate tomographic reconstructions.
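The deconvolution idea behind source-wavelet estimation can be illustrated with a toy single-step sketch using regularized spectral division: given a recorded trace and a synthetic trace computed with a trial wavelet, estimate a corrected wavelet. This is not the thesis's actual iterative scheme; the water-level regularization, the Ricker-like wavelet and all parameters are assumptions for demonstration only.

```python
# Hedged sketch of one deconvolution-based wavelet-correction step.
# Toy illustration only; not the thesis's inversion scheme.
import numpy as np

def estimate_wavelet(recorded, synthetic, trial_wavelet, water_level=1e-2):
    """One update: W_new = W_trial * D(f) / S(f), with a water-level floor on |S|."""
    n = len(recorded)
    D, S = np.fft.rfft(recorded, n), np.fft.rfft(synthetic, n)
    W = np.fft.rfft(trial_wavelet, n)
    floor = water_level * np.max(np.abs(S))
    S_reg = np.where(np.abs(S) < floor, floor, np.abs(S)) * np.exp(1j * np.angle(S))
    return np.fft.irfft(W * D / S_reg, n)[: len(trial_wavelet)]

# Toy example: "true" Ricker-like wavelet, a sparse impulse response, and a
# synthetic trace built with a wrong (time-shifted) trial wavelet.
rng = np.random.default_rng(0)
t = np.linspace(-1, 1, 64)
true_w = (1 - 2 * (2 * np.pi * t) ** 2) * np.exp(-((2 * np.pi * t) ** 2))
impulse = np.zeros(256)
spikes = rng.choice(190, size=12, replace=False)
impulse[spikes] = rng.normal(size=12)
recorded = np.convolve(impulse, true_w)[:256]
trial_w = np.roll(true_w, 5)
synthetic = np.convolve(impulse, trial_w)[:256]
est_w = estimate_wavelet(recorded, synthetic, trial_w)
print(np.corrcoef(est_w, true_w)[0, 1])  # should be close to 1
```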
Abstract:
This paper examines the incentive of atomistic agricultural producers within a specific geographical region to differentiate and collectively market products. We develop a model that allows us to analyze the market and welfare effects of the main types of real-world producer organizations, using it to derive economic insights regarding the circumstances under which these organizations will evolve, and describing implications of the results obtained in the context of an ongoing debate between the European Union and United States. As the anticipated fixed costs of development and marketing increase and the anticipated size of the market falls, it becomes essential to increase the ability of the producer organization to control supply in order to ensure the coverage of fixed costs. Whenever a collective organization allows a market (with a new product) to exist that otherwise would not have existed there is an increase in societal welfare. Counterintuitively, stronger property right protection for producer organizations may be welfare enhancing even after a differentiated product has been developed. The reason for this somewhat paradoxical result is that legislation aimed at curtailing the market power of producer organizations may induce large technological distortions.
Abstract:
The past four decades have witnessed an explosive growth in the field of network-based facility location modeling. This is not at all surprising, since location policy is one of the most profitable areas of applied systems analysis in regional science and offers ample theoretical and applied challenges. Location-allocation models seek the location of facilities and/or services (e.g., schools, hospitals, and warehouses) so as to optimize one or several objectives, generally related to the efficiency of the system or to the allocation of resources. This paper concerns the location of public-sector facilities or services in discrete space or on networks, such as emergency services (ambulances, fire stations, and police units), school systems and postal facilities. The paper is structured as follows: first, we will focus on public facility location models that use some type of coverage criterion, with special emphasis on emergency services. The second section will examine models based on the P-Median problem and some of the issues faced by planners when implementing this formulation in real-world locational decisions. Finally, the last section will examine new trends in public sector facility location modeling.
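To make the P-Median formulation mentioned in this abstract concrete, the following sketch applies a classic greedy add heuristic to a toy instance: choose p facility sites minimizing total demand-weighted distance to the nearest open facility. This is a textbook illustration rather than any of the models discussed in the paper, and all data are synthetic.

```python
# Hedged sketch: greedy add heuristic for a toy P-Median instance.
import numpy as np

def greedy_p_median(dist, demand, p):
    """dist[i, j]: distance from demand node i to candidate site j."""
    chosen = []
    for _ in range(p):
        best_site, best_cost = None, np.inf
        for j in range(dist.shape[1]):
            if j in chosen:
                continue
            cols = chosen + [j]
            cost = np.sum(demand * dist[:, cols].min(axis=1))
            if cost < best_cost:
                best_site, best_cost = j, cost
        chosen.append(best_site)
    return chosen, best_cost

# Toy example: 30 demand points doubling as candidate sites on a plane, p = 3.
rng = np.random.default_rng(0)
pts = rng.random((30, 2))
dist = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=2)
demand = rng.integers(1, 10, size=30)
sites, cost = greedy_p_median(dist, demand, p=3)
print(sites, round(cost, 3))
```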