994 results for Theoretical values


Relevance: 20.00%

Publisher:

Abstract:

Neglecting health effects from indoor pollutant emissions and exposure, as currently done in Life Cycle Assessment (LCA), may result in product or process optimizations at the expense of workers' or consumers' health. To close this gap, methods for considering indoor exposure to chemicals are needed to complement the methods for outdoor human exposure assessment already in use. This paper summarizes the work of an international expert group on the integration of human indoor and outdoor exposure in LCA, within the UNEP/SETAC Life Cycle Initiative. A new methodological framework is proposed for a general procedure to include human-health effects from indoor exposure in LCA. Exposure models from occupational hygiene and household indoor air quality studies and practices are critically reviewed, and recommendations are provided on the appropriateness of various model alternatives in the context of LCA. A single-compartment box model is recommended as the default in LCA, enabling screening of occupational and household exposures in a manner consistent with the existing models used to assess outdoor emissions in a multimedia environment. An initial set of model parameter values was collected. A comparison of indoor and outdoor human exposure per unit of emission shows that, for many pollutants, intake per unit of indoor emission may be several orders of magnitude higher than per unit of outdoor emission. It is concluded that indoor exposure should be routinely addressed within LCA.
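
By way of illustration, here is a minimal Python sketch of a single-compartment (one-box) screening model of the kind recommended above. The parameter values and the outdoor intake fraction used for comparison are illustrative assumptions, not values from the paper.

```python
# Minimal one-box indoor exposure sketch (illustrative assumptions throughout).

def intake_fraction(n_people: float, breathing_rate: float,
                    air_change_rate: float, room_volume: float) -> float:
    """Fraction of an indoor emission that is inhaled by occupants.

    For a well-mixed single compartment at steady state, the concentration is
    C = G / Q, where G is the emission rate and Q = air_change_rate * room_volume
    is the ventilation rate, so the intake fraction is
    iF = n_people * breathing_rate / Q.
    """
    ventilation_rate = air_change_rate * room_volume  # m^3/h
    return n_people * breathing_rate / ventilation_rate

# Assumed example: 2 occupants breathing 0.6 m^3/h each, in a 250 m^3 dwelling
# with 0.5 air changes per hour.
iF_indoor = intake_fraction(2, 0.6, 0.5, 250.0)  # ~1e-2
iF_outdoor = 1e-5  # assumed typical order of magnitude for outdoor emissions
print(f"indoor iF ~ {iF_indoor:.1e}; indoor/outdoor ratio ~ {iF_indoor / iF_outdoor:.0f}")
```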

Relevance: 20.00%

Publisher:

Abstract:

In this paper, we study the average inter-crossing number between two random walks and two random polygons in three-dimensional space. The random walks and polygons considered are the so-called equilateral random walks and polygons, in which each segment is of unit length. We show that the mean average inter-crossing number (ICN) between two equilateral random walks of the same length n is approximately linear in n, and we determine the prefactor of the linear term, a = (3 ln 2)/8 ≈ 0.2599. In the case of two random polygons of length n, the mean average ICN is also linear in n, but the prefactor differs from that of the random walks. These approximations apply when the starting points of the random walks and polygons are a distance p apart and p is small compared to n. We propose a fitting model intended to capture the theoretical asymptotic behaviour of the mean average ICN for large values of p. Our simulation results show that the model in fact works very well over the entire range of p. We also study the mean average ICN between two equilateral random walks or polygons of different lengths. An interesting result is that even if one random walk (polygon) has a fixed length, the mean average ICN between the two random walks (polygons) still approaches infinity as the length of the other random walk (polygon) approaches infinity. The data provided by our simulations match our theoretical predictions very well.
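
For readers who want to reproduce the setup, the sketch below (our own illustration, not the paper's code) generates an equilateral random walk with unit-length steps drawn uniformly on the sphere and evaluates the leading-order prediction a·n for the mean average ICN; the computation of the ICN itself is omitted.

```python
import numpy as np

def equilateral_random_walk(n: int) -> np.ndarray:
    """Vertices of an equilateral random walk: n unit steps with directions
    drawn uniformly on the unit sphere (cosine of the polar angle is uniform)."""
    z = np.random.uniform(-1.0, 1.0, n)
    phi = np.random.uniform(0.0, 2.0 * np.pi, n)
    r = np.sqrt(1.0 - z**2)
    steps = np.column_stack((r * np.cos(phi), r * np.sin(phi), z))
    return np.vstack((np.zeros(3), np.cumsum(steps, axis=0)))

def predicted_mean_icn(n: int) -> float:
    """Leading-order prediction for the mean average inter-crossing number
    between two equilateral walks of length n, using the paper's prefactor."""
    a = 3.0 * np.log(2.0) / 8.0  # ≈ 0.2599
    return a * n

walk = equilateral_random_walk(100)
print(walk.shape, predicted_mean_icn(100))  # (101, 3), ≈ 25.99
```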

Relevance: 20.00%

Publisher:

Abstract:

All of the imputation techniques usually applied for replacing values below the detection limit in compositional data sets have adverse effects on the variability. In this work we propose a modification of the EM algorithm that is applied using the additive log-ratio transformation. This new strategy is applied to a compositional data set and the results are compared with the usual imputation techniques.
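
As a minimal sketch (our own, with hypothetical function names), here is the additive log-ratio transformation on which the proposed EM modification operates, together with its inverse; the modified EM iteration itself is omitted.

```python
import numpy as np

def alr(x: np.ndarray) -> np.ndarray:
    """Additive log-ratio transform of a composition (parts sum to 1),
    taking the last part as the common divisor."""
    return np.log(x[..., :-1] / x[..., -1:])

def alr_inv(y: np.ndarray) -> np.ndarray:
    """Inverse alr: map coordinates back onto the simplex."""
    parts = np.concatenate([np.exp(y), np.ones(y.shape[:-1] + (1,))], axis=-1)
    return parts / parts.sum(axis=-1, keepdims=True)

x = np.array([0.2, 0.3, 0.5])  # a 3-part composition
print(alr_inv(alr(x)))         # round trip recovers [0.2, 0.3, 0.5]
```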

Relevance: 20.00%

Publisher:

Abstract:

The main objective of this paper is to develop a methodology that takes into account the human factor extracted from the databases used by recommender systems, and that makes it possible to address their specific prediction and recommendation problems. We propose to extract a user's scale of human values from the user database in order to improve suitability in open environments such as recommender systems. For this purpose, the methodology is applied to the user's data after interaction with the system. The methodology is exemplified with a case study.

Relevance: 20.00%

Publisher:

Abstract:

INTRODUCTION: Monotherapy (MT) against HIV has undoubted theoretical advantages and a sound scientific basis. However, it remains controversial, and here we analyse the efficacy and safety of MT with ritonavir-boosted darunavir (DRV/r) in patients who received this treatment in our hospitals. MATERIALS AND METHODS: Observational retrospective study including patients from 10 Andalusian hospitals who received DRV/r as MT and were followed for a minimum of 12 months. We carried out a descriptive statistical analysis of the profile of patients prescribed MT and of the observed efficacy and safety, paying special attention to treatment failure and virological evolution. RESULTS: DRV/r was prescribed to 604 patients, of whom 41.1% had a CD4 nadir <200/mm³ and 33.1% had chronic hepatitis caused by HCV. Patients had received an average of five previous treatment lines, with a history of treatment failure on analogues in 33%, non-analogues in 22% and protease inhibitors (PIs) in 19.5%; 76.6% came from a previous PI-based regimen. Simplification was the main criterion for starting MT (81.5% of cases), followed by adverse effects (18.5%). MT was maintained in 84% of cases, with only 4.8% virological failure (VF) with viral load (VL) >200 copies/mL and an additional 3.6% lost due to VF with VL between 50 and 200 copies/mL. Thirty-three genotype tests were performed after failure, with no findings of resistance mutations to DRV/r or other PIs. Only 23.7% of patients presented blips during the period of exposure to MT. Of all VL determinations, 87% were <50 copies/mL and only 4.99% were >200 copies/mL. Although up to 14.9% of patients registered an adverse effect (AE) at some point, only 2.6% abandoned MT because of an AE and 1.2% by voluntary decision. Although mean total and LDL cholesterol increased by 10 mg/dL after 2 years of follow-up, HDL cholesterol also increased by 3 mg/dL, and triglycerides (-14 mg/dL) and GPT (-6 IU/mL) decreased. The mean CD4 lymphocyte count increased from 642 to 714/mm³ at 24 weeks. CONCLUSIONS: In a very broad series of patients from clinical practice, clinical trial data were confirmed: MT with DRV as a de-escalation strategy is very safe, is associated with a negligible rate of adverse effects and maintains good suppression of HIV replication. VF (whether at >50 or >200 copies/mL) remained below 10% and was in any case without consequences.

Relevance: 20.00%

Publisher:

Abstract:

Exposure to solar ultraviolet (UV) radiation is the main causative factor for skin cancer. UV exposure depends on environmental and individual factors, but individual exposure data remain scarce. While ground UV irradiance is monitored via different techniques, it is difficult to translate such observations into human UV exposure or dose because of confounding factors. A multi-disciplinary collaboration developed a model predicting the dose and distribution of UV exposure on the basis of ground irradiation and morphological data. Standard 3D computer graphics techniques were adapted to develop a simulation tool that estimates the solar exposure of a virtual manikin depicted as a triangle mesh surface. The amount of solar energy received by various body locations is computed separately for direct, diffuse and reflected radiation. Dosimetric measurements obtained in field conditions were used to assess the model's performance. The model predicted exposure to solar UV adequately, with a symmetric mean absolute percentage error of 13% and half of the predictions within 17% of the measurements. Using this tool, solar UV exposure patterns were investigated with respect to the relative contributions of direct, diffuse and reflected radiation. Exposure doses for various body parts and exposure scenarios of a standing individual were assessed using erythemally weighted UV ground irradiance data measured in 2009 at Payerne, Switzerland, as input. For most anatomical sites, mean daily doses were high (typically 6.2-14.6 Standard Erythemal Doses, SED) and exceeded recommended exposure values. Direct exposure was important during specific periods (e.g. midday during summer) but contributed moderately to the annual dose, ranging from 15 to 24% for vertical and horizontal body parts, respectively. Diffuse irradiation accounted for about 80% of the cumulative annual exposure dose.
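
For reference, here is one common definition of the symmetric mean absolute percentage error quoted above, as a short Python sketch; the paper's exact normalization may differ, and the example values are hypothetical.

```python
import numpy as np

def smape(predicted, measured) -> float:
    """Symmetric mean absolute percentage error, in percent."""
    p = np.asarray(predicted, dtype=float)
    m = np.asarray(measured, dtype=float)
    return 100.0 * np.mean(np.abs(p - m) / ((np.abs(p) + np.abs(m)) / 2.0))

# Hypothetical predicted vs. measured daily doses (SED):
print(smape([6.5, 10.2, 14.0], [6.2, 11.0, 14.6]))  # a few percent
```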

Relevance: 20.00%

Publisher:

Abstract:

OBJECTIVE: Home blood pressure (BP) monitoring is recommended by several clinical guidelines and has been shown to be feasible in elderly persons. Wrist manometers have recently been proposed for such home BP measurement, but their accuracy has not previously been assessed in elderly patients. METHODS: Forty-eight participants (33 women and 15 men, mean age 81.3±8.0 years) had their BP measured, in random order and in a sitting position, with a wrist device with position sensor and with an arm device. RESULTS: Average BP measurements were consistently lower with the wrist device than with the arm device for systolic BP (120.1±2.2 vs. 130.5±2.2 mmHg, P<0.001, means±SD) and diastolic BP (66.0±1.3 vs. 69.7±1.3 mmHg, P<0.001). Moreover, a difference of 10 mmHg or more between the arm and wrist devices was observed in 54.2% and 18.8% of systolic and diastolic measurements, respectively. CONCLUSION: Compared with the arm device, the wrist device with position sensor systematically underestimated systolic as well as diastolic BP. The magnitude of the difference is clinically significant and calls into question the use of wrist devices to monitor BP in elderly persons. This study points to the need to validate BP measuring devices in all age groups, including elderly persons.

Relevance: 20.00%

Publisher:

Abstract:

The behaviour of the harmonic infrared frequency of diatomic molecules subjected to moderate static uniform electric fields is analysed. The potential energy expression is developed as a function of a static uniform electric field, which leads to a formulation describing the frequency versus field strength curve. With the help of the first and second derivatives of the expressions obtained, which correspond to the first- and second-order Stark effects, it was possible to find the maxima of the frequency versus field strength curves for a series of molecules using a Newton-Raphson search. A method is proposed that requires only the calculation of a few energy derivatives at a particular value of the field strength. At the same time, the expression for the dependence of the interatomic distance on the electric field strength is derived, and the minimum of this curve is found for the same species. The derived expressions and numerical results are discussed and compared with other studies.
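
A minimal sketch of the Newton-Raphson search described above: the stationary point of the frequency-versus-field curve w(F) is located by iterating on its derivative. The callables stand in for the first- and second-order Stark derivatives; the toy quadratic model and its coefficients are hypothetical.

```python
def stationary_field(dw, d2w, f0: float, tol: float = 1e-10, max_iter: int = 50) -> float:
    """Newton-Raphson on w'(F): iterate F <- F - w'(F)/w''(F) until the
    update is below tol, giving a stationary point of w(F)."""
    f = f0
    for _ in range(max_iter):
        step = dw(f) / d2w(f)
        f -= step
        if abs(step) < tol:
            return f
    raise RuntimeError("Newton-Raphson did not converge")

# Toy model w(F) = w0 + a*F - b*F**2, so w'(F) = a - 2*b*F and w''(F) = -2*b.
a, b = 5.0, 40.0  # hypothetical Stark coefficients
print(stationary_field(lambda f: a - 2.0 * b * f, lambda f: -2.0 * b, f0=0.0))
# -> 0.0625 = a / (2*b), the field strength at which the frequency is maximal
```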

Relevance: 20.00%

Publisher:

Abstract:

Background: Excessive exposure to solar ultraviolet (UV) light is the main cause of most skin cancers in humans. Factors such as the increase of solar irradiation at ground level (anthropic pollution), the rise in standard of living (vacations in sunny areas) and, mostly, the development of outdoor activities have contributed to increased exposure. Thus, unsurprisingly, the incidence of skin cancers has increased over the last decades more than that of any other cancer. Melanoma is the most lethal cutaneous cancer, while cutaneous carcinomas are the most common cancer type worldwide. UV exposure depends on environmental as well as individual factors related to activity. The influence of individual factors on exposure among building workers was investigated in a previous study. Posture and orientation were found to account for at least 38% of the total variance of relative individual exposure. A high variance of short-term exposure was observed between different body locations, indicating the occurrence of intense, subacute exposures. It was also found that effective short-term exposure ranged between 0 and 200% of ambient irradiation, suggesting that ambient irradiation is a poor predictor of effective exposure. Various dosimetric techniques make it possible to assess individual effective exposure, but dosimetric measurements remain tedious and tend to be situation-specific. In fact, individual factors (exposure time, body posture and orientation in the sun) often limit the extrapolation of exposure results to similar activities conducted in other conditions. Objective: The research presented in this paper aims at developing and validating a predictive tool for effective individual exposure to solar UV. Methods: Existing computer graphics techniques (3D rendering) were adapted to reflect solar exposure conditions and to calculate short-term anatomical doses. A numerical model, represented as a 3D triangular mesh, is used to represent the exposed body. The amount of solar energy received by each triangle is calculated, taking into account irradiation intensity, incidence angle and possible shadowing from other body parts. The model takes into account the three components of solar irradiation (direct, diffuse and albedo) as well as the orientation and posture of the body. Field measurements were carried out using a forensic mannequin at the Payerne MeteoSwiss station. Short-term dosimetric measurements were performed at 7 anatomical locations for 5 body postures. Field results were compared to the predictions obtained from the numerical model. Results: The best match between prediction and measurements was obtained for upper body parts such as the shoulders (ratio modelled/measured: mean = 1.21, SD = 0.34) and neck (mean = 0.81, SD = 0.32). Small curved body parts such as the forehead (mean = 6.48, SD = 9.61) showed poorer agreement. The prediction is less accurate for complex postures such as kneeling (mean = 4.13, SD = 8.38) compared to standing up (mean = 0.85, SD = 0.48). The values obtained from the dosimeters and those computed from the model are globally consistent. Conclusion: Although further development and validation are required, these results suggest that effective exposure could be predicted for a given activity (work or leisure) under various ambient irradiation conditions. Using a generic modelling approach is of high interest in terms of implementation costs as well as predictive and retrospective capabilities.
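
To make the per-triangle computation concrete, here is a minimal Python sketch under an isotropic-sky assumption; the paper's actual radiative treatment, shadowing test and mesh handling may differ, and all names are our own.

```python
import numpy as np

def triangle_irradiance(normal, sun_dir, e_direct, e_diffuse, e_global,
                        albedo=0.2, shadowed=False) -> float:
    """Irradiance (W/m^2) on one mesh triangle, as the sum of the three
    components: direct (cosine of the incidence angle, zeroed if the triangle
    is shadowed by other body parts), isotropic sky diffuse, and
    ground-reflected (albedo) radiation."""
    n = np.asarray(normal, dtype=float)
    n /= np.linalg.norm(n)
    s = np.asarray(sun_dir, dtype=float)
    s /= np.linalg.norm(s)
    cos_incidence = 0.0 if shadowed else max(0.0, float(n @ s))
    cos_tilt = n[2]                       # +1 facing the sky, -1 facing the ground
    sky_view = (1.0 + cos_tilt) / 2.0     # fraction of the sky dome seen
    ground_view = (1.0 - cos_tilt) / 2.0  # fraction of the ground seen
    return (e_direct * cos_incidence
            + e_diffuse * sky_view
            + albedo * e_global * ground_view)

# A horizontal, unshadowed triangle under assumed irradiance components:
print(triangle_irradiance([0, 0, 1], [0.3, 0.2, 0.93], 600.0, 100.0, 700.0))
```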

Relevance: 20.00%

Publisher:

Abstract:

The neo-liberal capitalist ideology has come under heavy fire, with anecdotal evidence indicating a link between these same values and unethical behavior. Academic institutions reflect social values and act as socializing agents for the young. Can this explain the high and increasing rates of cheating that currently prevail in education? Our first chapter examines the question of whether self-enhancement values of power and achievement, the individual-level equivalent of neo-liberal capitalist values, predict positive attitudes towards cheating. Furthermore, we explore the mediating role of motivational factors. Results of four studies reveal that self-enhancement value endorsement predicts the adoption of performance-approach goals, a relationship mediated by introjected regulation, namely the desire for social approval, and that self-enhancement value endorsement also predicts the condoning of cheating, a relationship mediated by performance-approach goal adoption. However, self-transcendence values prescribed by a normatively salient source have the potential to reduce the link between self-enhancement value endorsement and attitudes towards cheating. Normative assessment constitutes a key tool used by academic institutions to socialize young people to accept the competitive, meritocratic nature of a society driven by a neo-liberal capitalist ideology. As such, the manifest function of grades is to motivate students to work hard and to buy into the competitive ethos. Does normative assessment fulfill these functions? Our second chapter explores the reward-intrinsic motivation question in the context of grading, arguably a high-stakes reward. In two experiments, the relative capacity of graded high performance, as compared to the task autonomy experienced in an ungraded task, to predict post-task intrinsic motivation is assessed. Results show that whilst graded task performance predicts post-task appreciation, it fails to predict ongoing motivation. However, perceived autonomy experienced in the non-graded condition predicts both post-task appreciation and ongoing motivation. Our third chapter asks whether normative assessment inspires a spirit of competition in students. Results of three experimental studies reveal that the expectation of a grade for a task, compared to no grade, induces greater adoption of performance-avoidance, but not performance-approach, goals. Experiment 3 provides an explanatory mechanism for this, showing that reduced autonomous motivation experienced in previous graded tasks mediates the relationship between grading and the adoption of performance-avoidance goals in a subsequent task. Taken together, the above results provide evidence of the deleterious effects of self-enhancement values, and of the associated practice of normative assessment in school, on student motivation, goals and ethics. We conclude by using value and motivation theory to explore solutions to this problem.

Relevance: 20.00%

Publisher:

Abstract:

BACKGROUND: Transgressive segregation describes the occurrence of novel phenotypes in hybrids, with extreme trait values not observed in either parental species. A previously experimentally untested prediction is that the amount of transgression increases with the genetic distance between hybridizing species. This follows from QTL studies suggesting that transgression is most commonly due to complementary gene action or epistasis, which become more frequent at larger genetic distances. This is because the number of QTLs fixed for alleles with opposing signs in different species should increase with time since speciation, provided that speciation is not driven by disruptive selection. We measured the amount of transgression occurring in hybrids of cichlid fish bred from species pairs with gradually increasing genetic distances and varying phenotypic similarity. Transgression in multi-trait shape phenotypes was quantified using landmark-based geometric morphometric methods. RESULTS: We found that genetic distance explained 52% and 78% of the variation in transgression frequency in F1 and F2 hybrids, respectively. Confirming theoretical predictions, transgression, when measured in F2 hybrids, increased linearly with genetic distance between hybridizing species. Phenotypic similarity of species, on the other hand, was not related to the amount of transgression. CONCLUSION: The commonness and ease with which novel phenotypes are produced in cichlid hybrids between unrelated species have important implications for the interaction of hybridization with adaptation and speciation. Hybridization may generate new genotypes with adaptive potential that did not reside as standing genetic variation in either parental population, potentially enhancing a population's responsiveness to selection. Our results make it conceivable that hybridization contributed to the rapid rates of phenotypic evolution in the large and rapid adaptive radiations of haplochromine cichlids.

Relevance: 20.00%

Publisher:

Abstract:

The radiation distribution function used by Domínguez and Jou [Phys. Rev. E 51, 158 (1995)] has recently been modified by Domínguez-Cascante and Faraudo [Phys. Rev. E 54, 6933 (1996)]. However, in these studies neither distribution was written in terms of directly measurable quantities. Here a solution to this problem is presented, and we also propose an experiment that may make it possible to determine the distribution function of nonequilibrium radiation experimentally. The results derived do not depend on a specific distribution function for the matter content of the system.

Relevance: 20.00%

Publisher:

Abstract:

We present new analytical data of major and trace elements for the geological MPI-DING glasses KL2-G, ML3B-G, StHs6/80-G, GOR128-G, GOR132-G, BM90/21-G, T1-G, and ATHO-G. Different analytical methods were used to obtain a large spectrum of major and trace element data, in particular, EPMA, SIMS, LA-ICPMS, and isotope dilution by TIMS and ICPMS. Altogether, more than 60 qualified geochemical laboratories worldwide contributed to the analyses, allowing us to present new reference and information values and their uncertainties (at the 95% confidence level) for up to 74 elements. We complied with the recommendations for the certification of geological reference materials by the International Association of Geoanalysts (IAG). The reference values were derived from the results of 16 independent techniques, including definitive (isotope dilution) and comparative bulk (e.g., INAA, ICPMS, SSMS) and microanalytical (e.g., LA-ICPMS, SIMS, EPMA) methods. Agreement between two or more independent methods and the use of definitive methods provided traceability to the fullest extent possible. We also present new and recently published data for the isotopic compositions of H, B, Li, O, Ca, Sr, Nd, Hf, and Pb. The results were mainly obtained by high-precision bulk techniques, such as TIMS and MC-ICPMS. In addition, LA-ICPMS and SIMS isotope data of B, Li, and Pb are presented.
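
As a simplified illustration of how a reference value and its 95% uncertainty can be derived from independent method means, here is a Python sketch using a plain t-interval on the grand mean; actual certification under the IAG recommendations involves additional outlier, bias and traceability checks omitted here, and the example data are hypothetical.

```python
import numpy as np
from scipy import stats

def reference_value(method_means):
    """Grand mean of independent method means and the half-width of its
    95% confidence interval (simple t-interval; a simplification of
    certification practice)."""
    x = np.asarray(method_means, dtype=float)
    n = x.size
    half_width = stats.t.ppf(0.975, n - 1) * x.std(ddof=1) / np.sqrt(n)
    return x.mean(), half_width

# Hypothetical method means for one trace element in one glass (µg/g):
value, u95 = reference_value([10.1, 9.8, 10.4, 10.0, 9.9])
print(f"{value:.2f} ± {u95:.2f} µg/g")  # ≈ 10.04 ± 0.29 µg/g
```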