980 results for Close-approach maneuvers


Relevance:

30.00%

Publisher:

Abstract:

HIV virulence, i.e. the time of progression to AIDS, varies greatly among patients. As for other rapidly evolving pathogens of humans, it is difficult to know if this variance is controlled by the genotype of the host or that of the virus because the transmission chain is usually unknown. We apply the phylogenetic comparative approach (PCA) to estimate the heritability of a trait from one infection to the next, which indicates the control of the virus genotype over this trait. The idea is to use viral RNA sequences obtained from patients infected by HIV-1 subtype B to build a phylogeny, which approximately reflects the transmission chain. Heritability is measured statistically as the propensity for patients close in the phylogeny to exhibit similar infection trait values. The approach reveals that up to half of the variance in set-point viral load, a trait associated with virulence, can be heritable. Our estimate is significant and robust to noise in the phylogeny. We also check for the consistency of our approach by showing that a trait related to drug resistance is almost entirely heritable. Finally, we show the importance of taking into account the transmission chain when estimating correlations between infection traits. The fact that HIV virulence is, at least partially, heritable from one infection to the next has clinical and epidemiological implications. The difference between earlier studies and ours comes from the quality of our dataset and from the power of the PCA, which can be applied to large datasets and accounts for within-host evolution. The PCA opens new perspectives for approaches linking clinical data and evolutionary biology because it can be extended to study other traits or other infectious diseases.
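The heritability idea at the core of the phylogenetic comparative approach can be conveyed with a toy simulation (our own sketch, not the authors' method; all names and parameter values are illustrative): if each recipient's trait is partly inherited from the donor's, regressing recipient values on donor values along a simulated transmission chain recovers the heritable fraction.

```python
import random

def simulate_chain(n, h, seed=1):
    """Simulate a linear transmission chain where each new infection's
    trait (e.g. set-point viral load) is partly inherited from its donor."""
    rng = random.Random(seed)
    traits = [rng.gauss(0, 1)]
    for _ in range(n - 1):
        # recipient trait = heritable part + host/environmental noise
        traits.append(h * traits[-1] + rng.gauss(0, (1 - h ** 2) ** 0.5))
    return traits

def donor_recipient_slope(traits):
    """OLS slope of recipient trait on donor trait: a crude heritability proxy."""
    donors, recips = traits[:-1], traits[1:]
    n = len(donors)
    mx, my = sum(donors) / n, sum(recips) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(donors, recips))
    var = sum((x - mx) ** 2 for x in donors)
    return cov / var

traits = simulate_chain(5000, h=0.7)
print(round(donor_recipient_slope(traits), 2))  # close to the simulated h of 0.7
```

The authors' PCA generalizes this donor-recipient regression to a full phylogeny and accounts for within-host evolution; the sketch only conveys why phylogenetic proximity implies trait similarity under heritability.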


SMARTDIAB is a platform designed to support the monitoring, management, and treatment of patients with type 1 diabetes mellitus (T1DM) by combining state-of-the-art approaches in the fields of database (DB) technologies, communications, simulation algorithms, and data mining. SMARTDIAB consists mainly of two units: 1) the patient unit (PU); and 2) the patient management unit (PMU), which communicate with each other for data exchange. The PMU can be accessed by the PU through the internet using devices such as PCs/laptops with direct internet access or mobile phones via a Wi-Fi/General Packet Radio Service access network. The PU consists of an insulin pump for subcutaneous insulin infusion to the patient and a continuous glucose measurement system. These devices run a user-friendly application that gathers patient-related information and transmits it to the PMU. The PMU consists of a diabetes data management system (DDMS), a decision support system (DSS) that provides risk assessment for long-term diabetes complications, and an insulin infusion advisory system (IIAS), all of which reside on a Web server. The DDMS can be accessed by both medical personnel and patients, with appropriate security access rights and front-end interfaces. Apart from being used for data storage/retrieval, the DDMS also provides advanced tools for the intelligent processing of the patient's data, supporting the physician in decision making regarding the patient's treatment. The IIAS closes the loop between the insulin pump and the continuous glucose monitoring system by providing the pump with the appropriate insulin infusion rate to keep the patient's glucose levels within predefined limits. The pilot version of SMARTDIAB has already been implemented, and the platform's evaluation in a clinical environment is in progress.
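As a purely hypothetical illustration of what one closed-loop advisory step might look like (this is not the actual IIAS algorithm; every name, gain, and limit below is invented), a minimal proportional controller nudges the infusion rate toward keeping glucose near a target, clipped to pump-safe limits:

```python
def advise_infusion_rate(glucose_mgdl, basal_u_per_h=1.0,
                         target_mgdl=110.0, gain=0.02,
                         min_rate=0.0, max_rate=5.0):
    """Toy proportional controller: raise the rate above basal when glucose
    is above target, lower it when below, clipped to pump-safe limits.
    Illustrative only -- not the SMARTDIAB IIAS algorithm."""
    rate = basal_u_per_h + gain * (glucose_mgdl - target_mgdl)
    return max(min_rate, min(max_rate, rate))

print(advise_infusion_rate(180))  # above target -> rate above basal
print(advise_infusion_rate(70))   # below target -> rate below basal
```

A real advisory system would of course add insulin-on-board tracking, predictive models, and safety constraints before any rate reaches the pump.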


One of the major challenges for a mission to the Jovian system is the radiation tolerance of the spacecraft (S/C) and the payload. Moreover, being able to achieve science observations with high signal to noise ratios (SNR), while passing through the high flux radiation zones, requires additional ingenuity on the part of the instrument provider. Consequently, the radiation mitigation is closely intertwined with the payload, spacecraft and trajectory design, and requires a systems-level approach. This paper presents a design for the Io Volcano Observer (IVO), a Discovery mission concept that makes multiple close encounters with Io while orbiting Jupiter. The mission aims to answer key outstanding questions about Io, especially the nature of its intense active volcanism and the internal processes that drive it. The payload includes narrow-angle and wide-angle cameras (NAC and WAC), dual fluxgate magnetometers (FGM), a thermal mapper (ThM), dual ion and neutral mass spectrometers (INMS), and dual plasma ion analyzers (PIA). The radiation mitigation is implemented by drawing upon experiences from designs and studies for missions such as the Radiation Belt Storm Probes (RBSP) and Jupiter Europa Orbiter (JEO). At the core of the radiation mitigation is IVO's inclined and highly elliptical orbit, which leads to rapid passes through the most intense radiation near Io, minimizing the total ionizing dose (177 krads behind 100 mils of Aluminum with radiation design margin (RDM) of 2 after 7 encounters). The payload and the spacecraft are designed specifically to accommodate the fast flyby velocities (e.g. the spacecraft is radioisotope powered, remaining small and agile without any flexible appendages). The science instruments, which collect the majority of the high-priority data when close to Io and thus near the peak flux, also have to mitigate transient noise in their detectors. 
The cameras use a combination of shielding and CMOS detectors with extremely fast readout to minimize noise. INMS microchannel plate detectors and PIA channel electron multipliers require additional shielding. The FGM is not sensitive to noise induced by energetic particles and the ThM microbolometer detector is nearly insensitive. Detailed SNR calculations are presented. To facilitate targeting agility, all of the spacecraft components are shielded separately since this approach is more mass efficient than using a radiation vault. IVO uses proven radiation-hardened parts (rated at 100 krad behind equivalent shielding of 280 mils of Aluminum with RDM of 2) and is expected to have ample mass margin to increase shielding if needed.
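The quoted dose figures follow the usual radiation-design-margin bookkeeping: the predicted mission dose is multiplied by the RDM before being compared with part ratings at equivalent shielding depth. A minimal sketch of that arithmetic (the 177 krad and RDM of 2 come from the text above; the predicted-dose and rating comparisons are hypothetical):

```python
def design_dose_krad(predicted_dose_krad, rdm=2.0):
    """Radiation design margin: parts must survive the predicted mission
    dose multiplied by the RDM."""
    return predicted_dose_krad * rdm

def meets_rating(design_dose, part_rating_krad):
    """True if the part's rated dose (behind equivalent shielding)
    covers the margined design dose. Ratings are only comparable at
    the same shielding depth."""
    return part_rating_krad >= design_dose

# The abstract quotes 177 krad behind 100 mils Al *including* an RDM of 2,
# i.e. a predicted dose of about 88.5 krad before margin.
print(design_dose_krad(88.5, rdm=2.0))  # 177.0
```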


Assessments of environmental and territorial justice are similar in that both examine whether the empirical relations between the spatial arrangement of undesirable hazards (or desirable public goods and services) and socio-demographic groups are consistent with notions of social justice, evaluating both the spatial distribution of benefits and burdens (outcome equity) and the process that produces observed differences (process equity). Using proximity to major highways in NYC as a case study, we review methodological issues pertinent to both fields and discuss the choice and computation of exposure measures, but focus primarily on measures of inequity. We present inequity measures computed from the empirically estimated joint distribution of exposure and demographics and compare them to traditional measures such as linear regression, logistic regression, and Theil's entropy index. We find that measures computed from the full joint distribution provide more unified, transparent, and intuitive operational definitions of inequity, and we show how the approach can be used to structure siting and decommissioning decisions.
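Of the traditional measures mentioned, Theil's entropy index has a compact closed form, T = (1/N) Σ (x_i/μ) ln(x_i/μ); a quick sketch over hypothetical exposure values (e.g. an inverse-distance proximity score per group):

```python
import math

def theil_index(exposures):
    """Theil's entropy index: zero when exposure is perfectly equal,
    growing with inequality. Requires strictly positive values."""
    n = len(exposures)
    mu = sum(exposures) / n
    return sum((x / mu) * math.log(x / mu) for x in exposures) / n

print(theil_index([1.0, 1.0, 1.0, 1.0]))      # 0.0 -- perfectly equal exposure
print(theil_index([0.1, 0.5, 1.0, 6.0]) > 0)  # True -- unequal exposure
```

The joint-distribution measures favored in the study carry more information than this single summary number, which is part of the comparison the abstract describes.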


This paper presents a novel technique to provide computerized fluoroscopy with zero-dose image updates for computer-assisted, fluoroscopy-based closed reduction and osteosynthesis of diaphyseal femoral fractures. With this technique, repositioning of bone fragments during closed fracture reduction leads to image updates in each acquired imaging plane, which is equivalent to using several fluoroscopes simultaneously from different directions but without any X-ray radiation. Combined with existing methods for restoring leg length and antetorsion, it facilitates the whole fracture reduction and osteosynthesis procedure and may greatly reduce the X-ray exposure of both the patient and the surgical team. In this paper, we present the approach for achieving such a technique and experimental results with plastic bones.
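The zero-dose update idea can be reduced to a 2D sketch (our own illustration with hypothetical names, not the paper's implementation): once the new pose of a tracked bone fragment is known, its contour in the stored fluoroscopic image is moved by the same rigid transform, so no new X-ray is needed.

```python
import math

def update_overlay(points, angle_rad, tx, ty):
    """Apply a tracked in-plane rigid motion (rotation + translation) to the
    2D contour points of a bone fragment in a stored fluoroscopic image,
    simulating a zero-dose image update."""
    c, s = math.cos(angle_rad), math.sin(angle_rad)
    return [(c * x - s * y + tx, s * x + c * y + ty) for x, y in points]

contour = [(0.0, 0.0), (10.0, 0.0), (10.0, 5.0)]
moved = update_overlay(contour, math.pi / 2, 2.0, 0.0)
```

The real system works per imaging plane with 3D-tracked fragments projected into each stored view; the 2D version above only conveys the geometric principle.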


Medical errors originating in health care facilities are a significant source of preventable morbidity, mortality, and healthcare costs. Voluntary error report systems that collect information on the causes and contributing factors of medical errors, regardless of the resulting harm, may be useful for developing effective harm prevention strategies. Some patient safety experts question the utility of data from errors that did not lead to harm to the patient, also called near misses. A near miss (a.k.a. close call) is an unplanned event that did not result in injury to the patient; only a fortunate break in the chain of events prevented injury. We use data from a large voluntary reporting system of 836,174 medication errors from 1999 to 2005 to provide evidence that the causes and contributing factors of errors that result in harm are similar to those of near misses. We develop Bayesian hierarchical models for estimating the log odds of selecting a given cause (or contributing factor) of error given that harm has occurred and the log odds of selecting the same cause given that harm did not occur. The posterior distribution of the correlation between these two vectors of log odds is used as a measure of the evidence supporting the use of data from near misses, and their causes and contributing factors, to prevent medical errors. In addition, we identify the causes and contributing factors with the highest or lowest log-odds ratio of harm versus no harm; these should also be a focus in the design of prevention strategies. This paper provides important evidence on the utility of data from near misses, which constitute the vast majority of errors in our data.
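The comparison at the heart of the analysis can be mimicked with elementary statistics (a simplified sketch, not the authors' Bayesian hierarchical model; the counts below are invented): compute the log odds of each cause separately for harm and no-harm errors, then correlate the two vectors.

```python
import math

def log_odds(selected, total):
    """Log odds that a given cause is cited, with a 0.5 continuity
    correction to avoid division by zero."""
    return math.log((selected + 0.5) / (total - selected + 0.5))

def pearson(u, v):
    """Plain Pearson correlation of two equal-length vectors."""
    n = len(u)
    mu, mv = sum(u) / n, sum(v) / n
    cov = sum((a - mu) * (b - mv) for a, b in zip(u, v))
    su = math.sqrt(sum((a - mu) ** 2 for a in u))
    sv = math.sqrt(sum((b - mv) ** 2 for b in v))
    return cov / (su * sv)

# Hypothetical counts: times each of 5 causes was cited among harm / no-harm errors
harm_counts,    harm_total    = [40, 10, 25, 5, 20], 100
no_harm_counts, no_harm_total = [400, 90, 260, 45, 205], 1000

lo_harm    = [log_odds(c, harm_total) for c in harm_counts]
lo_no_harm = [log_odds(c, no_harm_total) for c in no_harm_counts]
print(round(pearson(lo_harm, lo_no_harm), 2))  # close to 1: near-identical cause profiles
```

A correlation near 1, as in this constructed example, is the pattern the paper reports: near misses share the cause profile of harmful errors, so their data is useful for prevention.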


The objective of modern transmission electron microscopy (TEM) in the life sciences is to observe biological structures in a state as close as possible to that of the living organism. TEM samples have to be thin and must be examined in vacuum; therefore only solid samples can be investigated. The most common way to prepare samples for TEM is to subject them to chemical fixation, staining, dehydration, and embedding in a resin before investigation; all of these steps introduce considerable artifacts. An alternative is to immobilize samples by cooling. High pressure freezing is so far the only approach that can vitrify (solidify water without ice crystal formation) bulk biological samples up to about 200 micrometers thick, and it leads to improved ultrastructural preservation. After high pressure freezing, samples have to undergo follow-up procedures such as freeze-substitution and embedding; alternatively, they can be cut into frozen hydrated sections and analyzed in a cryo-TEM. High pressure freezing is also a good and practicable approach for immunocytochemistry.


The report examines the relationship between day care institutions, schools, and so-called "parents unfamiliar with education", as well as the relationships among the institutions themselves. Within Danish public and professional discourse, concepts like "parents unfamiliar with education" usually refer to environments, parents, or families with either no experience of education beyond basic school (folkeskole) or only very limited experience of it. The "grand old man" of Danish educational research, Prof. Em. Erik Jørgen Hansen, defines the concept as follows: parents who are distant from or not familiar with education are parents without a tradition of education, and for that reason they are unable to contribute constructively to backing up their own children during their education. Many teachers and pedagogues are not used to that term; they prefer concepts like "socially exposed" or "socially disadvantaged" parents, social classes, or strata. The report does not focus only on parents who are incapable of supporting their children's school achievement, since a low level of education is usually connected with social disadvantage. Such parents are often unable to understand and meet the demands the school makes when their children enter it; they lack the necessary competencies for action. At present, much attention is being paid by the Ministries of Education and Social Affairs (the latter recently renamed the Ministry of Welfare) to creating equal opportunities for all children. Many kinds of expert bodies (directorates, councils, researchers, etc.) have been more than eager to promote recommendations aimed at an ambitious goal: by 2015, 95% of all young people should complete a full education (classes 10-12). Research results point to the importance of increased parental participation. In other words, the agenda is set for "parents' education".
It seems necessary to underline that Danish welfare policy has been changing rather radically. The classic model understood welfare as social insurance and/or social redistribution, based on social solidarity. The modern model frames welfare as social service and/or social investment. This means that citizens are changing roles: from user and/or citizen to consumer and/or investor. The Danish state, in line with decisions taken by the government, is investing in a national future shaped by global competition. The new models of welfare, "service" and "investment", imply profound changes in established concepts of family life, the relationship between parents and children, and so on. The investment model, for example, points to a new configuration of the relationship between social rights and the rights of freedom. The service model has shown the weakness that access to qualified services in the fields of health or education is becoming increasingly dependent on private purchasing power. The weakness of the investment model is that it represents a kind of "winner takes all", since a political majority is enabled to set agendas in societal fields formerly protected by the tripartite power structure and by citizens' rights of freedom. The outcome of the Danish development seems to be the establishment of a politically governed public service industry that, on the one hand, is capable of competing under market conditions and, on the other, can be governed by contracts. This represents a new, close linking of politics, economy, and professional work. Attempts to control education, pedagogy, and thereby the population are not a recent invention; European history offers several such experiments. What is genuinely new is the linking of political priorities to the exercise of public activities through economic incentives.
By defining visible goals for public servants, by introducing the measurement of achievements and effects, and by implementing a new wage policy dependent on achievements and/or effects, a new system of accountability is manufactured. The consequences are already perceptible: the government decides on special interventions concerning parents, children, or youngsters; public servants at the municipal level are instructed to carry out their services by following a manual; and parents are no longer protected by privacy. Protection of privacy and of minorities is no longer a valid argument against further interventions in people's lives (health, food, school, etc.). Citizens are becoming objects of investment, which also implies that people invest in their own health, education, and family: investments in changes of lifestyle and in the development of competences go hand in hand. The programmes mentioned below are conditioned by this shift.


The electrochemical reactivity and structural properties of the electrogenic bacterium Geobacter sulfurreducens (Gs) were studied to explore heterogeneous electron transfer at the bacteria/electrode interface using electrochemical and in-situ spectroscopic techniques. The redox behavior of Gs adsorbed on a gold electrode modified with a ω-functionalized self-assembled monolayer (SAM) of alkanethiols depends strongly on the terminal group, which interacts directly with the outermost cytochromes embedded in the outer membrane of the Gs cells. The redox potential of bacterial cells bound electrostatically to a carboxyl-terminated SAM is close to that observed for bacteria attached to a bare gold electrode, revealing high electronic coupling at the cell/SAM interface. The redox potentials of bacterial cells adsorbed on amino- and pyridyl-terminated SAMs are significantly different, suggesting that the outermost cytochromes change their conformation upon adsorption on these SAMs. No redox activity of Gs was found with CH3-, N(CH3)3+-, and OH-terminated SAMs. Complementary in-situ spectroscopic studies on bacteria/SAM/Au electrode assemblies were carried out to monitor structural changes of the bacterial cells upon polarization. Spectro-electrochemical techniques revealed the electrochemical turnover of the oxidized and reduced states of outer membrane cytochromes (OMCs) in Gs, providing evidence that the OMCs are responsible for the direct electron transfer to metal electrodes, such as gold or silver, during electricity production. Furthermore, we observed spectroscopic signatures of the native structure of the OMCs and no conformational change during the oxidation/reduction process of the microorganisms. These findings indicate that the carboxyl anchoring group provides biocompatible conditions for the outermost cytochromes of Gs, facilitating heterogeneous electron transfer at the microorganism/electrode interface.


Pharmaceuticals are ubiquitous in surface waters as a consequence of discharges from municipal wastewater treatment plants. However, few studies have assessed the bioavailability of pharmaceuticals to fish in natural waters. In the present study, passive samplers and rainbow trout were experimentally deployed next to three municipal wastewater treatment plants in Finland to evaluate the degree of animal exposure. Pharmaceuticals from several therapeutic classes (in total 15) were analyzed by liquid chromatography-tandem mass spectrometry in extracts of passive samplers and in bile and blood plasma of rainbow trout held at polluted sites for 10 d. Each approach indicated the highest exposure near wastewater treatment plant A and the lowest near that of plant C. Diclofenac, naproxen, and ibuprofen were found in rainbow trout, and their concentrations in bile were 10 to 400 times higher than in plasma. The phase I metabolite hydroxydiclofenac was also detected in bile. Hence, bile proved to be an excellent sample matrix for the exposure assessment of fish. Most of the monitored pharmaceuticals were found in passive samplers, implying that they may overestimate the actual exposure of fish in receiving waters. Two biomarkers, hepatic vitellogenin and cytochrome P4501A, did not reveal clear effects on fish, although a small induction of vitellogenin mRNA was observed in trout caged near wastewater treatment plants B and C.


When observers are presented with two visual targets appearing in the same position in close temporal proximity, a marked reduction in detection performance for the second target has often been reported, the so-called attentional blink (AB) phenomenon. Several studies found a decrement of P300 amplitudes during the attentional blink period similar to that observed in detection performance for the second target. However, whether the parallel courses of second-target performance and the corresponding P300 amplitudes result from the same underlying mechanisms has remained unclear. The aim of our study was therefore to investigate whether the mechanisms underlying the AB can be assessed by fixed-links modeling and whether this kind of assessment would reveal the same, or at least related, processes in the behavioral and electrophysiological data. On both levels of observation, three highly similar processes could be identified: an increasing, a decreasing, and a u-shaped trend. Corresponding processes from the behavioral and electrophysiological data were substantially correlated, with the two u-shaped trends showing the strongest association with each other. Our results provide evidence for the assumption that the same mechanisms underlie attentional blink task performance at the electrophysiological and behavioral levels as assessed by fixed-links models.
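A crude stand-in for the fixed trends used in fixed-links modeling (our own sketch, not the authors' model): the increasing/decreasing component of a lag curve shows up in its first differences, and the u-shaped component in its second differences.

```python
def trend_components(y):
    """Crude decomposition of a short lag curve: the mean first difference
    captures a linear (increasing/decreasing) trend, and half the mean
    second difference captures a quadratic (u-shaped) trend. These stand in
    for the fixed trends of a fixed-links model, which estimates latent
    increasing, decreasing and u-shaped factors jointly."""
    first = [y[i + 1] - y[i] for i in range(len(y) - 1)]
    second = [first[i + 1] - first[i] for i in range(len(first) - 1)]
    return sum(first) / len(first), 0.5 * sum(second) / len(second)

# A typical AB lag curve: T2 accuracy dips shortly after T1, then recovers
lag_curve = [0.9, 0.6, 0.45, 0.5, 0.7, 0.88]
linear, quadratic = trend_components(lag_curve)
print(quadratic > 0)  # True: the dip-and-recovery appears as a u-shaped term
```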


We present a novel approach to the inference of spectral functions from Euclidean time correlator data that makes close contact with modern Bayesian concepts. Our method differs significantly from the maximum entropy method (MEM). A new set of axioms is postulated for the prior probability, leading to an improved expression that is devoid of the asymptotically flat directions present in the Shannon-Jaynes entropy. Hyperparameters are integrated out explicitly, liberating us from the Gaussian approximations underlying the evidence approach of the maximum entropy method. We present a realistic test of our method in the context of the nonperturbative extraction of the heavy quark potential. Based on hard-thermal-loop correlator mock data, we establish firm requirements on the number of data points and their accuracy for a successful extraction of the potential from lattice QCD. Finally, we reinvestigate quenched lattice QCD correlators from a previous study and provide an improved potential estimation at T ≈ 2.33 T_C.
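The flavor of such a prior-regularized reconstruction can be conveyed with a toy objective (our own sketch, not the paper's implementation): a χ² term plus a regulator α Σ Δω (ρ/m − 1 − ln(ρ/m)) that vanishes at the default model ρ = m, minimized in log-space so ρ stays positive.

```python
import math

def objective(rho, D, K, alpha, m, dw):
    """chi^2/2 plus the regulator alpha * dw * sum(rho/m - 1 - ln(rho/m)),
    which is zero (and minimal) at rho = m."""
    model = [sum(K[t][w] * rho[w] * dw for w in range(len(rho)))
             for t in range(len(D))]
    chi2 = sum((mod - d) ** 2 for mod, d in zip(model, D))
    prior = alpha * dw * sum(r / m - 1.0 - math.log(r / m) for r in rho)
    return 0.5 * chi2 + prior

def descend(rho, D, K, alpha, m, dw, steps=100, eps=1e-5):
    """Finite-difference gradient descent in log(rho) (keeps rho > 0),
    with backtracking so the objective never increases."""
    f = objective(rho, D, K, alpha, m, dw)
    for _ in range(steps):
        grad = []
        for w in range(len(rho)):
            bumped = list(rho)
            bumped[w] *= math.exp(eps)
            grad.append((objective(bumped, D, K, alpha, m, dw) - f) / eps)
        step = 0.1
        while step > 1e-10:
            trial = [r * math.exp(-step * g) for r, g in zip(rho, grad)]
            ft = objective(trial, D, K, alpha, m, dw)
            if ft < f:
                rho, f = trial, ft
                break
            step *= 0.5
    return rho, f

# Mock setup: kernel K(tau, omega) = exp(-omega * tau), single-peak spectrum
taus = [0.1 * (t + 1) for t in range(10)]
omegas = [0.5 * (w + 1) for w in range(20)]
dw, m, alpha = 0.5, 1.0, 0.1
K = [[math.exp(-o * t) for o in omegas] for t in taus]
rho_true = [1.0 + 4.0 * math.exp(-(o - 3.0) ** 2) for o in omegas]
D = [sum(K[t][w] * rho_true[w] * dw for w in range(len(omegas)))
     for t in range(len(taus))]

rho0 = [m] * len(omegas)   # start from the default model
f0 = objective(rho0, D, K, alpha, m, dw)
rho_fit, f_fit = descend(rho0, D, K, alpha, m, dw)
print(f_fit < f0)  # True: the fit strictly improves on the default model
```

The paper's method additionally integrates out the hyperparameter and uses realistic lattice kernels and error-weighted χ²; the toy only illustrates how this prior, unlike the Shannon-Jaynes entropy, penalizes every deviation from the default model without flat directions.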


The design of efficient hydrological risk mitigation strategies and their subsequent implementation relies on a careful vulnerability analysis of the elements exposed. Recently, extensive research efforts have been undertaken to develop and refine empirical relationships linking the structural vulnerability of buildings to the impact forces of the hazard processes. These empirical vulnerability functions allow the expected direct losses of a hazard scenario to be estimated from a spatially explicit representation of the process patterns and of the elements at risk, classified into defined typological categories. However, due to the underlying empiricism of such vulnerability functions, the physics of the damage-generating mechanisms for a well-defined element at risk, with its particular geometry and structural characteristics, remains concealed; as such, the applicability of the empirical approach to planning hazard-proof residential buildings is limited. We therefore propose a conceptual assessment scheme to close this gap. This scheme encompasses distinct analytical steps: modelling (a) the process intensity, (b) the impact on the exposed element at risk, and (c) the physical response of the building envelope. These results then provide the input data for the subsequent damage evaluation and economic damage valuation. Such a dynamic assessment supports all relevant planning activities aimed at minimising losses and can be implemented in the operational risk assessment procedure.
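The three analytical steps can be chained as a minimal pipeline (all formulas, densities, and thresholds below are hypothetical placeholders; the paper's actual process and response models are far more elaborate):

```python
def process_intensity(flow_depth_m, flow_velocity_ms, density=1100.0, g=9.81):
    """Step (a): hazard intensity as hydrostatic + dynamic pressure (Pa).
    Density above clear water crudely accounts for sediment load."""
    return density * g * flow_depth_m + 0.5 * density * flow_velocity_ms ** 2

def impact_on_building(pressure_pa, exposed_wall_area_m2):
    """Step (b): total impact force (N) on the exposed envelope."""
    return pressure_pa * exposed_wall_area_m2

def envelope_response(force_n, failure_threshold_n):
    """Step (c): a damage index in [0, 1] for the building envelope,
    feeding the subsequent damage evaluation and economic valuation."""
    return min(1.0, force_n / failure_threshold_n)

# Hypothetical scenarios against a 12 m^2 wall rated at 600 kN
mild = envelope_response(impact_on_building(process_intensity(1.0, 2.0), 12.0), 6.0e5)
severe = envelope_response(impact_on_building(process_intensity(1.5, 4.0), 12.0), 6.0e5)
print(severe > mild)  # True: a more intense process yields a larger damage index
```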


OBJECTIVE To analyze the closure, persistence or reopening of the maxillary midline diastema after frenectomy in patients with and without subsequent orthodontic treatment. METHOD AND MATERIALS All patients undergoing frenectomy with a CO2 laser were included in this retrospective study during the period of September 2002 to June 2011. Age and sex, the dimension of the diastema, eruption status of the maxillary canines, and the presence of an orthodontic treatment were recorded at the day of frenectomy and during follow-up. RESULTS Of the 59 patients fulfilling the inclusion criteria, 31 (52.5%) had an active orthodontic therapy, while 27 (45.8%) had a frenectomy without orthodontic treatment. For one patient, information concerning orthodontic treatment was not available. In the first follow-up (2 to 12 weeks), only four diastemas closed after frenectomy and orthodontic treatment, and none after frenectomy alone. In the second follow-up (4 to 19 months), statistically significantly (P = .002) more diastemas (n = 20) closed with frenectomy and orthodontic treatment than with frenectomy alone (n = 3). At the long-term (21 to 121 months) follow-up, only four patients had a persisting diastema, and in three patients orthodontic treatment was ongoing. CONCLUSION Closure of the maxillary midline diastema with a prominent frenum is more predictable with frenectomy and concomitant orthodontic treatment than with frenectomy alone. This study demonstrates the importance of an interdisciplinary approach to treat maxillary midline diastemas, ideally including general practitioners, oral surgeons, periodontists, and orthodontists.
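The reported group difference can be checked with a standard 2x2 test; below is a stdlib sketch of Fisher's exact test on a table reconstructed from the counts in the abstract (20 of 31 closures with orthodontics vs 3 of 27 without). The paper's own test statistic and its exact P value of .002 may differ from this reconstruction.

```python
from math import comb

def fisher_exact_two_sided(a, b, c, d):
    """Two-sided Fisher's exact test for a 2x2 table [[a, b], [c, d]]:
    sum the hypergeometric probabilities of all tables with the same
    margins that are no more likely than the observed one."""
    row1, row2, col1, n = a + b, c + d, a + c, a + b + c + d
    def p_table(x):  # P(first cell = x) under fixed margins
        return comb(row1, x) * comb(row2, col1 - x) / comb(n, col1)
    p_obs = p_table(a)
    lo, hi = max(0, col1 - row2), min(col1, row1)
    return sum(p_table(x) for x in range(lo, hi + 1)
               if p_table(x) <= p_obs * (1 + 1e-9))

# Second follow-up: closed / not closed, with vs without orthodontics
p = fisher_exact_two_sided(20, 11, 3, 24)
print(p < 0.01)  # True: closure rates differ significantly between groups
```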


We present a novel approach for the reconstruction of spectra from Euclidean correlator data that makes close contact with modern Bayesian concepts. It is based upon an axiomatically justified dimensionless prior distribution which, in the case of a constant prior function m(ω), imprints only smoothness on the reconstructed spectrum. In addition, we are able to analytically integrate out the only relevant overall hyperparameter α in the prior, removing the necessity for the Gaussian approximations found, e.g., in the Maximum Entropy Method. Using a quasi-Newton minimizer and high-precision arithmetic, we are then able to find the unique global extremum of P[ρ|D] in the full N_ω ≫ N_τ dimensional search space. The method yields gradually improving reconstruction results as the quality of the supplied input data increases, without introducing the artificial peak structures often encountered in the MEM. To support these statements, we present mock data analyses for the case of zero-width delta peaks and for more realistic scenarios based on the perturbative Euclidean Wilson loop as well as the Wilson line correlator in Coulomb gauge.