812 results for Design approach
Abstract:
I discuss several lessons regarding the design and conduct of monetary policy that have emerged from the New Keynesian research program. Those lessons include the benefits of price stability, the gains from commitment about future policies, the importance of natural variables as benchmarks for policy, and the benefits of a credible anti-inflationary stance. I also point to one challenge facing NK modelling efforts: the need to come up with relevant sources of policy tradeoffs. A potentially useful approach to meeting that challenge, based on the introduction of real imperfections, is presented.
Abstract:
BACKGROUND: HIV-1 RNA viral load is a key parameter for reliable treatment monitoring of HIV-1 infection. Accurate HIV-1 RNA quantitation can be impaired by primer and probe sequence polymorphisms that result from the tremendous genetic diversity and ongoing evolution of HIV-1. A novel dual HIV-1 target amplification approach was realized in the quantitative COBAS AmpliPrep/COBAS TaqMan HIV-1 Test, v2.0 (HIV-1 TaqMan test v2.0) to cope with the high genetic diversity of the virus. OBJECTIVES AND STUDY DESIGN: The performance of the new assay was evaluated for sensitivity, dynamic range, precision, subtype inclusivity, diagnostic and analytical specificity, interfering substances, and correlation with its predecessor, the COBAS AmpliPrep/COBAS TaqMan HIV-1 Test (HIV-1 TaqMan test v1.0), in patient specimens. RESULTS: The new assay demonstrated a sensitivity of 20 copies/mL and a linear measuring range of 20-10,000,000 copies/mL, with a lower limit of quantitation of 20 copies/mL. HIV-1 Group M subtypes and HIV-1 Group O were quantified within +/-0.3 log10 of the assigned titers. Specificity was 100% in 660 tested specimens; no cross-reactivity was found for 15 pathogens, nor any interference from endogenous substances or 29 drugs. Good comparability with the predecessor assay was demonstrated in 82 positive patient samples. Among selected clinical samples, 35/66 specimens were underquantitated by the predecessor assay; all were quantitated correctly by the new assay. CONCLUSIONS: The dual-target approach of the HIV-1 TaqMan test v2.0 enables superior HIV-1 Group M subtype coverage, including HIV-1 Group O detection. Correct quantitation of specimens underquantitated by the HIV-1 TaqMan test v1.0 was demonstrated.
Abstract:
Measurement of total energy expenditure may be crucial to understanding the relation between physical activity and disease and to framing public health interventions. To devise a self-administered physical activity frequency questionnaire (PAFQ), the following data-based approach was used. A 24-hour recall was administered to a random sample of 919 adult residents of Geneva, Switzerland. The data obtained were used to establish the list of activities (and their median duration) that contributed 95% of the energy expended, separately for men and women. Activities that were trivial for the whole sample but contributed ≥10% of an individual's energy expenditure were also selected. The final PAFQ lists 70 activities or groups of activities with their typical duration. About 20 minutes are required for respondents to indicate the number of days and the number of hours per day that they performed each activity. The PAFQ was validated against a heart rate monitor, a more objective method: the total energy estimated by the PAFQ in 41 volunteers correlated well (r = 0.76) with heart rate monitor estimates. The authors conclude that their self-administered physical activity frequency questionnaire, designed from 24-hour recall data, provides accurate estimates of energy expenditure.
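The abstract does not spell out the scoring scheme, but a common way to turn frequency-and-duration responses into total energy expenditure is to weight each reported activity by a MET value. The sketch below illustrates that idea only; the activities, MET values, and the 1 MET ≈ 1 kcal/kg/h conversion are illustrative assumptions, not details from the study.

```python
# Hypothetical PAFQ scoring sketch: weekly energy expenditure estimated as
# days/week x hours/day x MET x body weight, summed over reported activities.
# MET values below are assumed for illustration, not taken from the paper.
MET = {"walking": 3.5, "cycling": 6.0, "sleeping": 0.9}

def weekly_energy_kcal(responses, weight_kg):
    """responses: list of (activity, days_per_week, hours_per_day) tuples."""
    total = 0.0
    for activity, days, hours in responses:
        # 1 MET is roughly 1 kcal per kg of body weight per hour
        total += MET[activity] * weight_kg * days * hours
    return total

answers = [("walking", 5, 1.0), ("cycling", 2, 0.5), ("sleeping", 7, 8.0)]
print(round(weekly_energy_kcal(answers, 70.0)))  # → 5173
```

A real instrument would cover all 70 activity groups and anchor each to its measured median duration.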
Abstract:
The number of private gardens has increased in recent years, creating a more pleasant urban model, but not without environmental impacts, including the increased energy consumption that is the focus of this study. The estimation of costs and energy consumption for the generic typology of private urban gardens is based on two simplifying assumptions: square geometry with surface areas from 25 to 500 m2 and a hydraulic design with a single pipe. In total, eight sprinkler models were considered, along with their possible working pressures, and 31 pumping units grouped into 5 series that adequately cover the range of required flow rates and pressures, resulting in 495 hydraulic designs repeated for two climatically different locations in the Spanish Mediterranean area (Girona and Elche). Mean total irrigation costs for the locality with lower water needs (Girona) and the one with greater needs (Elche) were €2,974 ha⁻¹ yr⁻¹ and €3,383 ha⁻¹ yr⁻¹, respectively. Energy costs accounted for 11.4% of the total cost in the first location and 23.0% in the second. Although a suitable choice of the hydraulic elements of the setup is essential and may provide average energy savings of 77%, the potential energy savings do not constitute a significant incentive in irrigation system design, because the cost of energy is low relative to the cost of installation. The low efficiency of the pumping units used in this type of garden is the biggest obstacle and constraint to achieving a high-quality energy solution.
Abstract:
Indoleamine 2,3-dioxygenase (IDO) is an important therapeutic target for the treatment of diseases such as cancer that involve pathological immune escape. We have used the evolutionary docking algorithm EADock to design new inhibitors of this enzyme. First, we investigated the binding modes of all known IDO inhibitors. On the basis of the observed docked conformations, we developed a pharmacophore model, which was then used to devise new compounds to be tested for IDO inhibition. We also used a fragment-based approach to design and optimize small organic molecule inhibitors. Both approaches yielded several new low-molecular-weight inhibitor scaffolds, the most active being of nanomolar potency in an enzymatic assay. Cellular assays confirmed the potential biological relevance of four different scaffolds.
Abstract:
Many regional governments in developed countries design programs to improve the competitiveness of local firms. In this paper, we evaluate the effectiveness of public programs whose aim is to enhance the performance of firms located in Catalonia (Spain). We compare the performance of publicly subsidised companies (treated) with that of similar but unsubsidised companies (non-treated). We use the Propensity Score Matching (PSM) methodology to construct a control group that, with respect to observable characteristics, is as similar as possible to the treated group, which allows us to identify firms with the same propensity to receive public subsidies. Once a valid comparison group has been established, we compare the respective performance of each firm. We find that recipient firms, on average, change their business practices, improve their performance, and increase their value added as a direct result of public subsidy programs.
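To make the matching step concrete, here is a minimal sketch of the PSM idea on synthetic data: propensity scores are estimated with a plain logistic regression, each treated unit is matched to the control unit with the nearest score, and the average treatment effect on the treated (ATT) is the mean outcome difference across matched pairs. The data, the gradient-descent fit, and one-to-one matching with replacement are illustrative choices, not the paper's actual implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic firms: covariates X, subsidy indicator t, outcome y (true effect = 2).
n = 200
X = rng.normal(size=(n, 2))
p_true = 1 / (1 + np.exp(-(X[:, 0] - 0.5 * X[:, 1])))
t = (rng.random(n) < p_true).astype(float)
y = X @ np.array([1.0, 0.5]) + 2.0 * t + rng.normal(scale=0.5, size=n)

# 1) Estimate propensity scores with logistic regression (gradient ascent).
Xb = np.column_stack([np.ones(n), X])
w = np.zeros(3)
for _ in range(2000):
    p = 1 / (1 + np.exp(-Xb @ w))
    w += 0.1 * Xb.T @ (t - p) / n
scores = 1 / (1 + np.exp(-Xb @ w))

# 2) Match each treated firm to the control firm with the nearest score.
treated, control = np.where(t == 1)[0], np.where(t == 0)[0]
matches = control[np.abs(scores[control][None, :] - scores[treated][:, None]).argmin(axis=1)]

# 3) Average treatment effect on the treated: mean matched outcome difference.
att = (y[treated] - y[matches]).mean()
print(round(att, 2))  # close to the true effect of 2
```

In practice one would also check covariate balance after matching and restrict comparisons to the region of common support.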
Abstract:
The flourishing number of publications on the use of isotope ratio mass spectrometry (IRMS) in forensic science reflects the enthusiasm and attraction generated by this technology. IRMS has demonstrated its potential to distinguish chemically identical compounds coming from different sources. Despite the numerous applications of IRMS to a wide range of forensic materials, its implementation in a forensic framework is less straightforward than it appears. In addition, each laboratory has developed its own strategy of analysis regarding calibration, sequence design, standards utilisation, and data treatment, without a clear consensus. Drawing on the experience acquired from research undertaken in different forensic fields, we propose a methodological framework for the whole process of using IRMS methods. We emphasize the importance of considering isotopic results as part of a whole approach when applying this technology to a particular forensic issue. The process is divided into six steps, which should be considered for a thoughtful and relevant application. Dissecting the process into fundamental steps, each further detailed, enables a better understanding of the essential, though not exhaustive, factors that must be considered in order to obtain results of sufficient quality and robustness for retrospective analyses or interlaboratory comparisons.
Abstract:
Modeling of concentration-response functions became extremely popular in ecotoxicology during the last decade, since a model captures the complete response pattern of a given substance. However, reliable modeling is demanding in terms of data, which conflicts with the current trend in ecotoxicology of reducing, for cost and ethical reasons, the number of data points produced during an experiment. It is therefore crucial to choose the experimental design in a cost-effective manner. In this paper, we propose to use the theory of locally D-optimal designs to determine the set of concentrations to be tested so that the parameters of the concentration-response function can be estimated with high precision. We illustrate this approach by determining the locally D-optimal designs for estimating the toxicity of the herbicide dinoseb to daphnids and algae. The results show that the number of concentrations to be tested is often equal to the number of parameters and often related to their meaning, i.e. the optimal concentrations lie close to the parameter values. Furthermore, the locally D-optimal design often has the minimal number of support points and is not very sensitive to small changes in the nominal values of the parameters. To reduce the experimental cost and the use of test organisms, especially in long-term studies, reliable nominal values may therefore be fixed based on prior knowledge and literature research instead of preliminary experiments.
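The locally D-optimal idea can be sketched in a few lines: given nominal parameter values, choose the concentrations that maximize the determinant of the Fisher information matrix, built from the gradient of the response curve with respect to the parameters. The two-parameter log-logistic curve, the nominal values, and the candidate grid below are hypothetical, not those of the dinoseb experiments; consistent with the abstract, the search settles on as many support points as parameters.

```python
import numpy as np
from itertools import combinations

# Two-parameter log-logistic concentration-response curve:
#   f(c; EC50, b) = 1 / (1 + (c / EC50) ** b)
# Locally D-optimal design: pick concentrations maximizing det of the
# Fisher information evaluated at nominal (prior) parameter values.
EC50, b = 1.0, 2.0  # nominal values, illustrative only

def grad(c):
    """Gradient of f with respect to (EC50, b) at the nominal values."""
    r = (c / EC50) ** b
    denom = (1 + r) ** 2
    return np.array([b * r / (EC50 * denom),          # df/dEC50
                     -r * np.log(c / EC50) / denom])  # df/db

candidates = np.geomspace(0.05, 20, 60)  # candidate test concentrations
best_det, best_pair = -1.0, None
for pair in combinations(candidates, 2):  # 2 parameters -> 2 support points
    M = sum(np.outer(grad(c), grad(c)) for c in pair)
    d = np.linalg.det(M)
    if d > best_det:
        best_det, best_pair = d, pair
print(best_pair)  # the two optimal concentrations bracket the EC50
```

With more parameters the same search runs over larger subsets, and design weights can be optimized alongside the support points.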
Abstract:
We describe the design, calibration, and performance of a surface forces apparatus with the capability of illuminating the contact interface for spectroscopic investigation using optical techniques. The apparatus can be placed in the path of an Nd:YAG laser for studies of the linear response or of second harmonic and sum-frequency generation from a material confined between the two surfaces. In addition to the standard fringes-of-equal-chromatic-order technique, which we have digitized for accurate and fast analysis, the distance of separation can be measured with a fiber-optic interferometer during spectroscopic measurements (2 Å resolution and 10 ms response time). The sample approach is accomplished through a motor drive, a piezoelectric actuator, or electromagnetic lever deflection, for variable degrees of range, sensitivity, and response time. To demonstrate the operation of the instrument, the stepwise expulsion of discrete layers of octamethylcyclotetrasiloxane from the contact is shown. Lateral forces may also be studied by using piezoelectric bimorphs to induce and direct the motion of one surface.
Abstract:
The Organization of the Thesis. The remainder of the thesis comprises five chapters and a conclusion. The next chapter formalizes the envisioned theory into a tractable model. Section 2.2 presents a formal description of the model economy: the individual heterogeneity, the individual objective, the UI setting, the population dynamics, and the equilibrium. The welfare and efficiency criteria for qualifying various equilibrium outcomes are proposed in section 2.3. The fourth section shows how the model-generated information can be computed. Chapter 3 transposes the model from chapter 2 into conditions that enable its use in the analysis of individual labor market strategies and their implications for the labor market equilibrium. In section 3.2 the Swiss labor market data sets, stylized facts, and the UI system are presented. The third section outlines and motivates the parameterization method. In section 3.4 the model's replication ability is evaluated and some aspects of the parameter choice are discussed. Numerical solution issues can be found in the appendix. Chapter 4 examines the determinants of search-strategic behavior in the model economy and its implications for the labor market aggregates. In section 4.2, the unemployment duration distribution is examined and related to search strategies. Section 4.3 shows how search-strategic behavior is influenced by UI eligibility, and section 4.4 how it is determined by individual heterogeneity. The composition effects generated by search strategies in labor market aggregates are examined in section 4.5. The last section evaluates the model's replication of the empirical unemployment escape frequencies reported in Sheldon [67]. Chapter 5 applies the model economy to examine the effects on the labor market equilibrium of shocks to the labor market risk structure, to the deep underlying labor market structure, and to the UI setting.
Section 5.2 examines the effects of the labor market risk structure on the labor market equilibrium and labor market strategic behavior. The effects of alterations in the deep economic structural parameters of the labor market, i.e. individual preferences and production technology, are shown in Section 5.3. Finally, the impacts of the UI setting on the labor market are studied in Section 5.4. This section also evaluates the role of UI authority monitoring and the differences in the way changes in the replacement rate and the UI benefit duration affect the labor market. In chapter 6 the model economy is applied in counterfactual experiments to assess several aspects of the Swiss labor market movements in the nineties. Section 6.2 examines the two equilibria characterizing the Swiss labor market in the nineties: the "growth" equilibrium with a "moderate" UI regime and the "recession" equilibrium with a more "generous" UI. Section 6.3 evaluates the isolated effects of the structural shocks, while the isolated effects of the UI reforms are analyzed in section 6.4. Particular dimensions of the UI reforms (the duration, replacement rate, and tax rate effects) are studied in section 6.5, while labor market equilibria without benefits are evaluated in section 6.6. In section 6.7 the structural and institutional interactions that may act as unemployment amplifiers are discussed in view of the obtained results. A welfare analysis based on individual welfare in different structural and UI settings is presented in the eighth section. Finally, the results are related to the more favorable unemployment trends after 1997. The conclusion evaluates the features embodied in the model economy with respect to the resulting model dynamics to derive lessons from the model design. The thesis ends by proposing guidelines for future improvements of the model and directions for further research.
Abstract:
The Iowa Department of Transportation has long recognized that approach slab pavements of integral abutment bridges are prone to settlement and cracking, which manifests as the “bump at the end of the bridge”. A commonly recommended solution is to integrally attach the approach slab to the bridge abutment. Two different approach slabs, one precast concrete and the other cast-in-place concrete, were integrally connected to side-by-side bridges and investigated. The primary objective of this investigation was to evaluate the approach slab performance and the impacts the approach slabs have on the bridge. To satisfy the research needs, the project scope involved a literature review, a survey of current practices at Midwest Departments of Transportation, implementation of a health monitoring system on the bridge and approach slab, interpretation of the data obtained during the evaluation, and periodic visual inspections. Based on the information obtained from the testing, the following general conclusions were made: the integral connection between the approach slabs and the bridges appears to function well, with no observed distress at this location and no relative longitudinal movement measured between the two components; tying the approach slab to the bridge appears to impact the bridge; the two different approach slabs, the longer precast slab and the shorter cast-in-place slab, appear to impact the bridge differently; the measured strains in the approach slabs indicate that a force exists at the expansion joint and should be taken into consideration when designing both the approach slab and the bridge; and the observed responses generally followed an annual and/or short-term cyclic pattern over time.
Abstract:
Bacterial reporters are live, genetically engineered cells with promising applications in bioanalytics. They contain genetic circuitry to produce a cellular sensing element, which detects the target compound and relays the detection to the specific synthesis of so-called reporter proteins (whose presence or activity is easy to quantify). Bioassays with bacterial reporters are a useful complement to chemical analytics because they measure biological responses rather than total chemical concentrations. Simple bacterial reporter assays may also replace more costly chemical methods as a first-line sample analysis technique. Recent promising developments integrate bacterial reporter cells with microsystems to produce bacterial biosensors. This lecture presents an in-depth treatment of the synthetic biological design principles of bacterial reporters, the engineering of which started as simple recombinant DNA puzzles but has now become a more rational approach of choosing and combining sensing, controlling, and reporting DNA 'parts'. Several examples of existing bacterial reporter designs and their genetic circuitry will be illustrated. Besides the design principles, the lecture also focuses on the application principles of bacterial reporter assays: a variety of assay formats will be illustrated, and principles of quantification will be dealt with. In addition, substantial reference material is supplied in various Annexes.
Abstract:
Indoleamine 2,3-dioxygenase 1 (IDO1) is an important therapeutic target for the treatment of diseases such as cancer that involve pathological immune escape. Starting from the scaffold of our previously discovered IDO1 inhibitor 4-phenyl-1,2,3-triazole, we used computational structure-based methods to design more potent ligands. This approach yielded highly efficient low molecular weight inhibitors, the most active being of nanomolar potency both in an enzymatic and in a cellular assay, while showing no cellular toxicity and a high selectivity for IDO1 over tryptophan 2,3-dioxygenase (TDO). A quantitative structure-activity relationship based on the electrostatic ligand-protein interactions in the docked binding modes and on the quantum chemically derived charges of the triazole ring demonstrated a good explanatory power for the observed activities.
Abstract:
In this work, a previously developed statistics-based damage-detection approach was validated for its ability to autonomously detect damage in bridges. The approach uses statistical differences between the actual and the predicted behavior of the bridge under a subset of ambient truck loads. The predicted behavior is derived from a statistics-based model trained with field data from the undamaged bridge (not a finite element model). The differences between actual and predicted responses, called residuals, are then used to construct control charts, which compare undamaged- and damaged-structure data. Validation of the damage-detection approach was achieved by using sacrificial specimens that were mounted to the bridge, exposed to ambient traffic loads, and designed to simulate actual damage-sensitive locations. Different damage types and levels were introduced to the sacrificial specimens to study the sensitivity and applicability of the approach. The damage-detection algorithm was able to identify damage, but it also had a high false-positive rate. An evaluation of the sub-components of the damage-detection methodology was completed for the purpose of improving the approach. Several of the underlying assumptions within the algorithm were being violated, which was the source of the false positives. Furthermore, the lack of an automatic evaluation process was thought to be a potential impediment to widespread use. Recommendations for improving the methodology were developed and preliminarily evaluated; these recommendations are believed to improve the efficacy of the damage-detection approach.
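The residual-plus-control-chart logic described above can be sketched on synthetic data. The simple linear load-strain model and the 3-sigma Shewhart limits below are illustrative stand-ins for the paper's statistics-based model; the data are not bridge measurements.

```python
import numpy as np

rng = np.random.default_rng(1)

# Training phase: strain response vs. truck load on the undamaged structure.
load = rng.uniform(10, 40, 300)
strain = 2.0 * load + rng.normal(scale=1.0, size=300)
coef = np.polyfit(load, strain, 1)            # statistics-based model
resid = strain - np.polyval(coef, load)       # residuals = actual - predicted
center, sigma = resid.mean(), resid.std()
ucl, lcl = center + 3 * sigma, center - 3 * sigma  # 3-sigma control limits

# Monitoring phase: simulated damage shifts the load-response relationship.
new_load = rng.uniform(10, 40, 50)
damaged_strain = 2.3 * new_load + rng.normal(scale=1.0, size=50)
new_resid = damaged_strain - np.polyval(coef, new_load)
alarms = np.sum((new_resid > ucl) | (new_resid < lcl))
print(alarms, "of 50 monitoring points outside the control limits")
```

The false-positive issue noted in the abstract corresponds to undamaged residuals violating the chart's assumptions (e.g. independence or constant variance), which pushes points past the limits without any damage.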