103 results for Building simulation
Abstract:
Significant progress has been made with regard to the quantitative integration of geophysical and hydrological data at the local scale. However, extending the corresponding approaches to the scale of a field site represents a major, and as yet largely unresolved, challenge. To address this problem, we have developed a downscaling procedure based on a non-linear Bayesian sequential simulation approach. The main objective of this algorithm is to estimate the value of the sparsely sampled hydraulic conductivity at non-sampled locations based on its relation to the electrical conductivity logged at collocated wells and to surface resistivity measurements, which are available throughout the studied site. The in situ relationship between the hydraulic and electrical conductivities is described through a non-parametric multivariate kernel density function. A stochastic integration of low-resolution, large-scale electrical resistivity tomography (ERT) data with high-resolution, local-scale downhole measurements of the hydraulic and electrical conductivities is then applied. The overall viability of this downscaling approach is tested and validated by comparing flow and transport simulations through the original and the upscaled hydraulic conductivity fields. Our results indicate that the proposed procedure yields remarkably faithful estimates of the regional-scale hydraulic conductivity structure and correspondingly reliable predictions of the transport characteristics over relatively long distances.
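The non-parametric kernel step described above can be illustrated with a small sketch. The synthetic logs, variable names, and the conditional-mean helper below are all invented for illustration (they are not the authors' code or data); the idea is simply that a joint kernel density of collocated log-conductivities yields a conditional estimate of hydraulic conductivity given an electrical conductivity value.

```python
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(0)

# Hypothetical collocated borehole logs: log10 electrical conductivity
# (log_sig) and a correlated log10 hydraulic conductivity (log_K).
log_sig = rng.normal(-2.0, 0.4, 200)
log_K = 1.5 * log_sig - 1.0 + rng.normal(0.0, 0.2, 200)

# Non-parametric joint density of (log_sig, log_K), analogous to the
# multivariate kernel density function mentioned in the abstract.
kde = gaussian_kde(np.vstack([log_sig, log_K]))

def conditional_mean_logK(sig_value, grid=np.linspace(-6.0, 2.0, 400)):
    """E[log_K | log_sig = sig_value], estimated from the joint KDE."""
    pts = np.vstack([np.full_like(grid, sig_value), grid])
    dens = kde(pts)                      # joint density along the log_K grid
    return np.sum(grid * dens) / np.sum(dens)
```

In a sequential-simulation setting one would sample from this conditional density rather than taking its mean, but the conditioning mechanism is the same.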
Abstract:
In this work we present numerical simulations of continuous-flow left ventricular assist device implantation, with the aim of comparing differences in flow rates and pressure patterns depending on the location of the anastomosis and the rotational speed of the device. Although the descending aorta anastomosis approach is less invasive, since it requires neither a sternotomy nor cardiopulmonary bypass, its benefits are still controversial. Moreover, the device rotational speed should be chosen correctly to avoid anomalous flow rates and pressure distributions in specific locations of the cardiovascular tree. To assess the differences between these two approaches and between device rotational speeds in terms of flow rate and pressure waveforms, we set up numerical simulations of a network of one-dimensional models in which we account for the presence of an outflow cannula anastomosed to different locations of the aorta. We then use the resulting network to compare the results of the two cannulations for several stages of heart failure and different rotational speeds of the device. The inflow boundary data for the heart and the cannulas are obtained from a lumped-parameter model of the entire circulatory system with an assist device, which is validated against clinical data. The results show that ascending and descending aorta cannulations lead to similar waveforms and mean flow rates in all the considered cases. Moreover, regardless of the anastomosis region, the rotational speed of the device has an important impact on wave profiles; this effect is more pronounced at high RPM.
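As a rough illustration of the lumped-parameter side of such models (not the authors' network), a two-element Windkessel driven by a constant pump inflow, a crude stand-in for a continuous-flow device, can be integrated in a few lines. All parameter values and names below are hypothetical.

```python
import numpy as np

def windkessel_pressure(q_pump, R=1.0, C=1.5, p0=70.0, dt=1e-3, t_end=10.0):
    """Two-element Windkessel: C dP/dt = Q_in - P/R.
    q_pump: constant inflow in mL/s (mimicking a continuous-flow LVAD);
    R in mmHg*s/mL, C in mL/mmHg. Returns time and pressure arrays."""
    n = int(t_end / dt)
    p = np.empty(n)
    p[0] = p0
    for i in range(1, n):
        # explicit Euler step of the Windkessel ODE
        p[i] = p[i - 1] + dt * (q_pump - p[i - 1] / R) / C
    return np.arange(n) * dt, p

# With constant inflow, pressure relaxes toward the steady state P = Q * R.
t, p = windkessel_pressure(q_pump=90.0)
```

A full 1-D network model would replace this single compartment with distributed vessel segments coupled through junction conditions, but the compartment ODE above is the basic building block of the lumped closure.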
Abstract:
The identification of genetically homogeneous groups of individuals is a long-standing issue in population genetics. A recent Bayesian algorithm implemented in the software STRUCTURE allows the identification of such groups. However, the ability of this algorithm to detect the true number of clusters (K) in a sample of individuals when patterns of dispersal among populations are not homogeneous has not been tested. The goal of this study is to carry out such tests, using data generated under various dispersal scenarios with an individual-based model. We found that in most cases the estimated 'log probability of data' does not provide a correct estimate of the number of clusters, K. However, using an ad hoc statistic, DeltaK, based on the rate of change in the log probability of data between successive K values, we found that STRUCTURE accurately detects the uppermost hierarchical level of structure for the scenarios we tested. As might be expected, the results are sensitive to the type of genetic marker used (AFLP vs. microsatellite), the number of loci scored, the number of populations sampled, and the number of individuals typed in each sample.
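The DeltaK statistic described here can be computed directly from STRUCTURE's per-K log-probability summaries: it is the absolute second difference of the mean ln P(data | K), scaled by the standard deviation over replicate runs. A minimal sketch (function name and input layout are assumed, not part of the STRUCTURE distribution):

```python
import numpy as np

def delta_k(mean_lnP, sd_lnP):
    """DeltaK from STRUCTURE output.
    mean_lnP[k] and sd_lnP[k] are the mean and s.d. of ln P(data | K) over
    replicate runs for K = 1..len(mean_lnP). Returns DeltaK for the interior
    values K = 2..K_max-1 (the statistic is undefined at the endpoints)."""
    L = np.asarray(mean_lnP, dtype=float)
    s = np.asarray(sd_lnP, dtype=float)
    # |L(K+1) - 2 L(K) + L(K-1)|: rate of change of the rate of change
    second = np.abs(L[2:] - 2.0 * L[1:-1] + L[:-2])
    return second / s[1:-1]
```

The K with the largest DeltaK is then taken as the uppermost level of structure.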
Abstract:
The pharmacokinetic determinants of successful antibiotic prophylaxis of endocarditis are not precisely known. Differences in antibiotic half-lives between animals and humans preclude extrapolation of animal results to human situations. To overcome this limitation, we have mimicked in rats the amoxicillin kinetics observed in humans after a 3-g oral dose (as often used for prophylaxis of endocarditis) by delivering the drug through a computerized pump. Rats with catheter-induced vegetations were challenged with either of two strains of antibiotic-tolerant viridans group streptococci. Antibiotics were given either through the pump (to simulate the whole kinetic profile during prophylaxis in humans) or as an intravenous bolus which imitated only the peak level of amoxicillin (18 mg/liter) in human serum. Prophylaxis by intravenous bolus was inoculum dependent and afforded limited protection only in rats challenged with the minimum inoculum size infecting > or = 90% of untreated controls. In contrast, simulation of the human kinetics significantly protected animals challenged with 10 to 100 times the inoculum of either of the test organisms infecting > or = 90% of untreated controls. Thus, simulating the serum profile of amoxicillin prophylaxis in humans was more efficacious than merely imitating the transient peak level in rats. This confirms previous studies suggesting that the duration for which the serum amoxicillin level remains detectable (not only the magnitude of the peak) is an important parameter in successful prophylaxis of endocarditis. The results also suggest that single-dose prophylaxis with 3 g of amoxicillin in humans might be more effective than predicted by conventional animal models in which only peak levels of antibiotic in human serum were simulated.
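The human concentration profile that the pump reproduces can be sketched with a standard one-compartment model with first-order absorption (the Bateman equation). The parameter values below are illustrative placeholders, not the fitted amoxicillin kinetics used in the study.

```python
import math

def oral_concentration(t, dose_mg=3000.0, F=0.7, V_L=60.0, ka=1.5, ke=0.6):
    """Serum concentration (mg/L) at time t hours after an oral dose:
    C(t) = F*D*ka / (V*(ka - ke)) * (exp(-ke*t) - exp(-ka*t)).
    F (bioavailability), V_L (apparent volume), and the hourly rate
    constants ka, ke are hypothetical, for illustration only."""
    coef = F * dose_mg * ka / (V_L * (ka - ke))
    return coef * (math.exp(-ke * t) - math.exp(-ka * t))
```

The qualitative point of the abstract is visible in this curve: an IV bolus matches only the peak of C(t), whereas the pump reproduces the whole rise-and-decay profile, including the long detectable tail.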
Abstract:
Significant progress has been made with regard to the quantitative integration of geophysical and hydrological data at the local scale for the purpose of improving predictions of groundwater flow and solute transport. However, extending corresponding approaches to the regional scale still represents one of the major challenges in the domain of hydrogeophysics. To address this problem, we have developed a regional-scale data integration methodology based on a two-step Bayesian sequential simulation approach. Our objective is to generate high-resolution stochastic realizations of the regional-scale hydraulic conductivity field in the common case where there exist spatially exhaustive but poorly resolved measurements of a related geophysical parameter, as well as highly resolved but spatially sparse collocated measurements of this geophysical parameter and the hydraulic conductivity. To integrate this multi-scale, multi-parameter database, we first link the low- and high-resolution geophysical data via a stochastic downscaling procedure. This is followed by relating the downscaled geophysical data to the high-resolution hydraulic conductivity distribution. After outlining the general methodology of the approach, we demonstrate its application to a realistic synthetic example where we consider as data high-resolution measurements of the hydraulic and electrical conductivities at a small number of borehole locations, as well as spatially exhaustive, low-resolution estimates of the electrical conductivity obtained from surface-based electrical resistivity tomography. The different stochastic realizations of the hydraulic conductivity field obtained using our procedure are validated by comparing their solute transport behaviour with that of the underlying 'true' hydraulic conductivity field.
We find that, even in the presence of strong subsurface heterogeneity, our proposed procedure allows for the generation of faithful representations of the regional-scale hydraulic conductivity structure and reliable predictions of solute transport over long, regional-scale distances.
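The sequential-simulation machinery underlying such procedures can be illustrated, in heavily simplified form, by a 1-D sequential Gaussian simulation with simple kriging. This is a generic textbook stand-in, not the authors' two-step Bayesian algorithm: it assumes a standard-normal field with an exponential covariance, and all names and values are invented.

```python
import numpy as np

def sequential_gaussian_sim(x, data_x, data_v, corr_len=5.0, seed=0):
    """Sequential Gaussian simulation on grid locations x, conditioned on
    values data_v at locations data_x. Each grid node, visited in random
    order, draws from the Gaussian conditional given all points simulated
    or observed so far (simple kriging mean and variance)."""
    rng = np.random.default_rng(seed)
    cx = list(data_x)                        # conditioning locations (grows)
    cv = list(data_v)                        # conditioning values
    out = np.empty(len(x))
    for i in rng.permutation(len(x)):        # random visiting path
        d = np.abs(np.asarray(cx) - x[i])
        c = np.exp(-d / corr_len)            # covariance node <-> conditioning
        C = np.exp(-np.abs(np.subtract.outer(cx, cx)) / corr_len)
        # simple kriging weights (small jitter keeps C invertible)
        w = np.linalg.solve(C + 1e-8 * np.eye(len(cx)), c)
        mean = w @ cv
        var = max(1.0 - w @ c, 0.0)
        out[i] = rng.normal(mean, np.sqrt(var))
        cx.append(x[i])                      # simulated node becomes data
        cv.append(out[i])
    return out
```

In the abstract's setting, the conditional distribution at each node would come from the kernel-density link between the geophysical and hydraulic parameters rather than from a Gaussian covariance model, but the visit-condition-draw loop is the same.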
Abstract:
NanoImpactNet (NIN) is a multidisciplinary European Commission-funded network on the environmental, health and safety (EHS) impact of nanomaterials. The 24 founding scientific institutes are leading European research groups active in the fields of nanosafety, nanorisk assessment and nanotoxicology. This 4-year project is the new focal point for information exchange within the research community. Contact with other stakeholders is vital and their needs are being surveyed. NIN is communicating with hundreds of stakeholders: businesses; internet platforms; industry associations; regulators; policy makers; national ministries; international agencies; standard-setting bodies and NGOs concerned by labour rights, EHS or animal welfare. To improve this communication, internet research, a questionnaire distributed via partners and targeted phone calls were used to identify stakeholders' interests and needs. Knowledge gaps and the necessity for further data mentioned by representatives of all stakeholder groups in the targeted phone calls concerned:
• the potential toxic and safety hazards of nanomaterials throughout their lifecycles;
• the fate and persistence of nanoparticles in humans, animals and the environment;
• the risks associated with nanoparticle exposure;
• greater participation in the preparation of nomenclature, standards, methodologies, protocols and benchmarks;
• the development of best-practice guidelines;
• voluntary schemes on responsibility;
• databases of materials, research topics and themes, but also of expertise.
These findings suggested that stakeholders and NIN researchers share very similar knowledge needs, and that open communication and free movement of knowledge will benefit both researchers and industry. Subsequently, NIN organised a workshop focused on building a sustainable multi-stakeholder dialogue. Specific questions were put to different stakeholder groups to encourage discussion and open communication.
1. What information do stakeholders need from researchers, and why? The discussions of this question confirmed the needs identified in the targeted phone calls.
2. How should information be communicated? While it was agreed that reporting should be enhanced, commercial confidentiality and economic competition were identified as major obstacles. It was recognised that expertise in commercial law and economics would be needed for a well-informed treatment of this communication issue.
3. Can engineered nanomaterials be used safely? The idea that nanomaterials are probably safe because some of them have been produced 'for a long time' was questioned, since many materials in common use have been proved to be unsafe. The question of safety is also about whether the public has confidence. New legislation like REACH could help with this issue. Hazards do not materialise if exposure can be avoided or at least significantly reduced; thus, there is a need for information on what can be regarded as acceptable levels of exposure. Finally, it was noted that there is no such thing as a perfectly safe material, only boundaries, and at this moment we do not know where these boundaries lie. The matter of labelling products containing nanomaterials was raised, as in the public mind safety and labelling are connected. This may need to be addressed, since nanomaterials in food, drink and food packaging may be the first safety issue to attract public and media attention, which could affect nanotechnology as a whole.
4. Do we need more, or other, regulation? Any decision-making process should accommodate the changing level of uncertainty. To address the uncertainties, adaptations of frameworks such as REACH may be indicated for nanomaterials. Regulation is often needed even if voluntary measures are welcome, because it mitigates the effects of competition between industries; data cannot be collected on a voluntary basis, for example.
NIN will continue with an active stakeholder dialogue to further build on interdisciplinary relationships towards a healthy future with nanotechnology.
Abstract:
Introduction: Exposure to environmental tobacco smoke (ETS) is a major environmental risk factor. Indoor contaminants come from a variety of sources, including inadequate ventilation, volatile organic compounds (VOCs), biological agents, combustion products, and ETS. Because ETS is one of the most frequent causes of IAQ complaints, and because of the high mortality associated with passive smoking, in June 2004 the University of Geneva decided to ban smoking inside the so-called "Uni-Mail" building, the biggest recently constructed human-science building of a Swiss university, and the ordinance was applied beginning in October 2004. This report presents the findings on the IAQ of the "Uni-Mail" building before and after the smoking ban, using levels of nicotine, suspended dust, condensate and PAHs in air as tracers to assess passive tobacco exposure for non-smokers inside the building. Methods: Respirable suspended particles (RSP): a real-time aerosol monitor (model DataRAM) was placed at sampling post 1 on the ground floor. Condensate: any organic matter collected on the glass-fibre filters was extracted with MeOH, and the total absorbance of the MeOH extract was measured at a wavelength of 447 nm. Nicotine: nicotine was collected on cartridges containing XAD-4 at a fixed flow of 0.5 L/min; the analytical method used for its determination is based on gas chromatography with a nitrogen-selective detector (GC-NPD). Results: Figure 1 shows the box plot density display of the three parameters (dust, condensate and nicotine in air, in μg/m3) before and after the smoking ban for all 7 sampling posts. Conclusion: Before the smoking ban, concentrations of respirable particles (RSP) were markedly elevated (daily average 320 μg/m3, with peaks of more than 1000 μg/m3) compared with surrounding-air values of 22 to 30 μg/m3. The nicotine level was also clearly higher (average 5.53 μg/m3, range 1.5 to 17.9 μg/m3).
Once the smoking ban inside the building was applied, a clear improvement in pollutant concentrations was noted. The dust concentration fell roughly threefold (average 130 μg/m3, range 40 to 160 μg/m3) and the nicotine concentration roughly tenfold (average 0.53 μg/m3, range 0 to 1.69 μg/m3) compared with the values found before the ban. The outdoor RSP concentration was 22 μg/m3, about 10 times lower. Nicotine appears to be the best ETS tracer, being free of interference and independent of location or season.
Abstract:
Excessive exposure to solar UV light is the main cause of skin cancers in humans. UV exposure depends on environmental as well as individual factors related to activity. Although outdoor occupational activities contribute significantly to the individual dose received, data on effective exposure are scarce and limited to a few occupations. A study was undertaken in order to assess effective short-term exposure among building workers and to characterize the influence of individual and local factors on exposure. The effective exposure of construction workers in a mountainous area in the southern part of Switzerland was investigated through short-term dosimetry (97 dosimeters). Three altitudes, of about 500, 1500 and 2500 m, were considered. Individual measurements over 20 working periods were performed using Spore film dosimeters at five body locations. The postural activity of workers was concomitantly recorded and static UV measurements were also performed. Effective exposure among building workers was high and exceeded occupational recommendations for all individuals at at least one body location. The mean daily UV dose was 11.9 SED (0.0-31.3 SED) on the plain, 21.4 SED (6.6-46.8 SED) at mid-mountain altitude and 28.6 SED (0.0-91.1 SED) at high-mountain altitude. Measured doses exhibited high variability between workers and anatomical locations, stressing the role of local exposure conditions and individual factors. Short-term effective exposure ranged between 0 and 200% of ambient irradiation, indicating the occurrence of intense, subacute exposures. A predictive irradiation model was developed to investigate the role of individual factors. Posture and orientation were found to account for at least 38% of the total variance of relative individual exposure, and to account for more of the total variance of effective daily exposures than altitude. Targeted sensitization actions through professional information channels and specific prevention messages are recommended.
Outdoor workers at altitude should also benefit from preventive medical examinations.
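For readers unfamiliar with the dose unit used above, conversion from erythemally weighted irradiance to Standard Erythemal Doses follows directly from the definition 1 SED = 100 J/m2 of erythemally weighted UV. A minimal helper (function name and sampling layout assumed):

```python
def dose_in_sed(erythemal_irradiance_w_m2, dt_s):
    """Integrate a time series of erythemally weighted irradiance (W/m2,
    one sample every dt_s seconds) into a dose in SED.
    By definition, 1 SED = 100 J/m2 of erythemally weighted UV."""
    joules_per_m2 = sum(erythemal_irradiance_w_m2) * dt_s
    return joules_per_m2 / 100.0
```

For example, a constant erythemal irradiance of 0.1 W/m2 over an 8-hour shift corresponds to 28.8 SED, the same order as the high-mountain daily means reported above.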
Abstract:
The model developed at the Institut universitaire de médecine sociale et préventive in Lausanne uses a computer program to simulate admissions to and discharges from general-care hospitals. The simulation is based on data routinely collected in hospitals; in particular, it accounts for certain daily and seasonal variations, for the number of admissions, and for the hospital's case mix, i.e. the distribution of cases across clinical groups and patient age.
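A toy version of such an admission/discharge simulation might look like the sketch below. The admission rates, weekly and seasonal modulation factors, and case-mix parameters are all invented for illustration; the model described in the abstract is calibrated on routine hospital data instead.

```python
import numpy as np

def simulate_census(days=365, base_rate=20.0, seed=1):
    """Simulate daily bed occupancy: Poisson admissions with weekly and
    seasonal modulation; each patient draws a length of stay from a
    hypothetical case-mix group. Returns the daily census (occupied beds)."""
    rng = np.random.default_rng(seed)
    # hypothetical case mix: (probability, mean length of stay in days)
    case_mix = [(0.6, 3.0), (0.3, 8.0), (0.1, 20.0)]
    probs = [p for p, _ in case_mix]
    census = np.zeros(days)
    for d in range(days):
        weekly = 0.8 if d % 7 >= 5 else 1.1          # fewer weekend admissions
        seasonal = 1.0 + 0.15 * np.sin(2 * np.pi * d / 365)
        n_adm = rng.poisson(base_rate * weekly * seasonal)
        groups = rng.choice(len(case_mix), size=n_adm, p=probs)
        for g in groups:
            los = max(1, int(rng.exponential(case_mix[g][1])))
            census[d:d + los] += 1                    # patient occupies a bed
    return census
```

Running the function yields a census series whose level reflects admissions times mean length of stay, which is the quantity hospital-capacity planning is usually interested in.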
Abstract:
The simultaneous recording of scalp electroencephalography (EEG) and functional magnetic resonance imaging (fMRI) can provide unique insights into the dynamics of human brain function, and the increased functional sensitivity offered by ultra-high field fMRI opens exciting perspectives for the future of this multimodal approach. However, simultaneous recordings are susceptible to various types of artifacts, many of which scale with magnetic field strength and can seriously compromise both EEG and fMRI data quality in recordings above 3T. The aim of the present study was to implement and characterize an optimized setup for simultaneous EEG-fMRI in humans at 7T. The effects of EEG cable length and geometry for signal transmission between the cap and amplifiers were assessed in a phantom model, with specific attention to noise contributions from the MR scanner coldheads. Cable shortening (down to 12 cm from cap to amplifiers) and bundling effectively reduced environment noise by up to 84% in average power and 91% in inter-channel power variability. Subject safety was assessed and confirmed via numerical simulations of RF power distribution and temperature measurements on a phantom model, building on the limited existing literature at ultra-high field. MRI data degradation effects due to the EEG system were characterized via B0 and B1(+) field mapping on a human volunteer, demonstrating significant, although not prohibitive, B1 disruption effects. With the optimized setup, simultaneous EEG-fMRI acquisitions were performed on 5 healthy volunteers undergoing two visual paradigms: an eyes-open/eyes-closed task, and a visual evoked potential (VEP) paradigm using reversing-checkerboard stimulation. EEG data exhibited clear occipital alpha modulation and average VEPs, respectively, with concomitant BOLD signal changes.
On a single-trial level, alpha power variations could be observed with relative confidence on all trials; VEP detection was more limited, although statistically significant responses could be detected in more than 50% of trials for every subject. Overall, we conclude that the proposed setup is well suited for simultaneous EEG-fMRI at 7T.
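The two noise summary metrics quoted in this abstract (percent reduction in average power and in inter-channel power variability) can be computed from a pair of multichannel recordings. A minimal sketch, with the (channels x samples) array layout and function name assumed:

```python
import numpy as np

def noise_reduction_metrics(before, after):
    """Percent reduction in average noise power and in inter-channel power
    variability between two (channels x samples) recordings made before and
    after a hardware change. Inputs are assumed to be zero-mean noise."""
    p_before = np.mean(before ** 2, axis=1)     # per-channel mean power
    p_after = np.mean(after ** 2, axis=1)
    avg_red = 100.0 * (1.0 - p_after.mean() / p_before.mean())
    var_red = 100.0 * (1.0 - p_after.std() / p_before.std())
    return avg_red, var_red
```

Note that a uniform scaling of noise amplitude by a factor a reduces both metrics by 100*(1 - a**2) percent; for example, a = 0.4 gives an 84% power reduction, matching the order of improvement reported here for cable shortening.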