45 results for "Three-state Potts model"

in BORIS: Bern Open Repository and Information System (Bern, Switzerland)


Relevance: 100.00%

Abstract:

Smart homes for the aging population have recently started attracting the attention of the research community. The "health state" of smart homes comprises many different levels: starting with the physical health of citizens, it also includes longer-term health norms and outcomes, as well as positive behavior change. One of the problems of interest is to monitor the activities of daily living (ADL) of the elderly, aiming at their protection and well-being. For this purpose, we installed passive infrared (PIR) sensors to detect motion in a specific area inside a smart apartment and used them to collect a set of ADL. In a novel approach, we describe a technology that allows the ground truth collected in one smart home to train activity recognition systems for other smart homes. We asked the users to label all instances of all ADL only once and subsequently applied data mining techniques to cluster in-home sensor firings; each cluster would therefore represent the instances of the same activity. Once the clusters were associated with their corresponding activities, our system was able to recognize future activities. To improve the activity recognition accuracy, our system preprocessed raw sensor data by identifying overlapping activities. To evaluate the recognition performance on a 200-day dataset, we implemented three different active learning classification algorithms and compared their performance: naive Bayesian (NB), support vector machine (SVM) and random forest (RF). Based on our results, the RF classifier recognized activities with an average specificity of 96.53%, a sensitivity of 68.49%, a precision of 74.41% and an F-measure of 71.33%, outperforming both the NB and SVM classifiers. Further clustering markedly improved the results of the RF classifier. An activity recognition system based on PIR sensors in conjunction with a clustering classification approach was able to detect ADL from datasets collected from different homes.
Thus, our PIR-based smart home technology could improve care and provide valuable information to better understand the functioning of our societies, as well as to inform both individual and collective action in a smart city scenario.
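The clustering of in-home sensor firings described above can be illustrated with a minimal sketch. The gap threshold, sensor names and the simple signature-overlap labeling below are hypothetical illustrations, not the paper's actual data mining pipeline:

```python
# Sketch: group PIR sensor firings into activity instances by temporal gaps,
# then label each cluster with the activity whose labeled "signature"
# (set of sensors involved) overlaps it most.

def cluster_firings(events, gap=300):
    """events: list of (timestamp_seconds, sensor_id), sorted by time.
    A silence longer than `gap` seconds starts a new activity instance."""
    clusters = []
    for t, sensor in events:
        if clusters and t - clusters[-1][-1][0] <= gap:
            clusters[-1].append((t, sensor))
        else:
            clusters.append([(t, sensor)])
    return clusters

def label_clusters(clusters, signatures):
    """signatures: {activity_name: set_of_sensors} from one labeled instance."""
    labels = []
    for c in clusters:
        fired = {s for _, s in c}
        # pick the activity whose sensor signature has the highest Jaccard overlap
        best = max(signatures, key=lambda a: len(fired & signatures[a]) /
                                             len(fired | signatures[a]))
        labels.append(best)
    return labels

events = [(0, "kitchen_pir"), (40, "stove_pir"), (900, "bath_pir"), (950, "bath_pir")]
clusters = cluster_firings(events, gap=300)
labels = label_clusters(clusters, {"cooking": {"kitchen_pir", "stove_pir"},
                                   "bathing": {"bath_pir"}})
print(len(clusters), labels)  # 2 ['cooking', 'bathing']
```

The gap-based grouping stands in for the unspecified clustering technique; any clustering of firings by time and sensor co-occurrence would slot into the same two-step structure.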

Relevance: 100.00%

Abstract:

We studied charge transport through core-substituted naphthalenediimide (NDI) single-molecule junctions using the electrochemical STM-based break-junction technique in combination with DFT calculations. Conductance switching among three well-defined states was demonstrated by electrochemically controlling the redox state of the pendent diimide unit of the molecule in an ionic liquid. The electrical conductances of the dianion and neutral states differ by more than one order of magnitude. The potential-dependence of the charge-transport characteristics of the NDI molecules was confirmed by DFT calculations, which account for electrochemical double-layer effects on the conductance of the NDI junctions. This study suggests that integration of a pendant redox unit with strong coupling to a molecular backbone enables the tuning of charge transport through single-molecule devices by controlling their redox states.

Relevance: 100.00%

Abstract:

Osteoarticular allograft transplantation is a popular treatment method in wide surgical resections with large defects, and for this reason hospitals are building bone data banks. Performing the optimal allograft selection from a bone bank is crucial to the surgical outcome and patient recovery. However, current approaches are very time-consuming, hindering an efficient selection. We present an automatic method based on registration of femur bones to overcome this limitation. We introduce a new regularization term for the log-domain demons algorithm that replaces the standard Gaussian smoothing with a femur-specific polyaffine model. The polyaffine femur model is constructed with two affine (femoral head and condyles) and one rigid (shaft) transformation. Our main contribution in this paper is to show that the demons algorithm can be improved in specific cases with an appropriate model. We are not trying to find the most optimal polyaffine model of the femur, but the simplest model with a minimal number of parameters. There is no need to optimize over different numbers of regions, boundaries and choices of weights, since this fine-tuning is done automatically by a final demons relaxation step with Gaussian smoothing. The newly developed approach provides a clear, anatomically motivated modeling contribution through the specific three-component transformation model, and shows a performance improvement (in terms of anatomically meaningful correspondences) on 146 CT images of femurs compared with a standard multiresolution demons algorithm. In addition, this simple model improves the robustness of the demons while preserving its accuracy. The ground truth consists of manual measurements performed by medical experts.
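The polyaffine idea of fusing a few region-wise transformations with smooth spatial weights can be sketched in one dimension. The Gaussian weights, region anchors and toy transforms below are illustrative assumptions, not the paper's actual femur model:

```python
import math

# Sketch: blend three 1D affine maps (stand-ins for the head, shaft and
# condyle transforms) with normalized Gaussian weights so the composite
# deformation is smooth across region boundaries.

transforms = [lambda x: 1.02 * x + 0.5,   # "femoral head" affine (toy)
              lambda x: x + 0.1,          # "shaft" rigid, translation only (toy)
              lambda x: 0.98 * x - 0.4]   # "condyles" affine (toy)
anchors = [0.0, 5.0, 10.0]                # region centres along the bone axis
sigma = 1.5

def polyaffine(x):
    w = [math.exp(-0.5 * ((x - a) / sigma) ** 2) for a in anchors]
    total = sum(w)
    return sum(wi / total * t(x) for wi, t in zip(w, transforms))

# Deep inside a region, the blend approximately follows that region's transform...
print(abs(polyaffine(0.0) - transforms[0](0.0)) < 1e-2)  # True
# ...and between regions it interpolates smoothly between neighbours.
print(transforms[0](2.5), polyaffine(2.5), transforms[1](2.5))
```

The normalized-weight blend is the key mechanism: far fewer parameters than a free deformation field, yet no hard seams between the head, shaft and condyle regions.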

Relevance: 100.00%

Abstract:

Proxy records and results of a three-dimensional climate model show that European summer temperatures roughly a millennium ago were comparable to those of the last 25 years of the 20th century, supporting the existence of a summer "Medieval Warm Period" in Europe. These two relatively mild periods were separated by a rather cold era, often referred to as the "Little Ice Age". Our modelling results suggest that the warm summer conditions of the early second millennium, compared with the climate background state of the 13th–18th centuries, are due to a large extent to the long-term cooling induced by changes in land use in Europe. During the last 200 years, the effect of increasing greenhouse gas concentrations, partly levelled off by that of sulphate aerosols, has dominated the summer climate history of Europe. This induces a clear warming during the last 200 years, allowing summer temperatures during the last 25 years to return to the values simulated for the early second millennium. Volcanic and solar forcing plays a weaker role in this comparison between the last 25 years of the 20th century and the early second millennium. Our hypothesis appears consistent with proxy records, but the modelling results have to be weighed against the existing uncertainties in the external forcing factors, in particular those related to land-use changes, and against the uncertainty in the regional climate sensitivity. Evidence for winter is more equivocal than for summer. The forced response in the model displays a clear temperature maximum at the end of the 20th century. However, the uncertainties are too large to state that this period is the warmest of the past millennium in Europe during winter.

Relevance: 100.00%

Abstract:

A major barrier to widespread clinical implementation of Monte Carlo dose calculation is the difficulty of characterizing the radiation source within a generalized source model. This work aims to develop a generalized three-component source model (target, primary collimator, flattening filter) for 6- and 18-MV photon beams that matches full phase-space data (PSD). Subsource-by-subsource comparison of dose distributions, using either the source PSD or the source model as input, allows accurate source characterization and has the potential to ease the commissioning procedure, since it is possible to obtain information about which subsource needs to be tuned. This source model is unique in that, compared with previous source models, it retains additional correlations among PS variables, which improves accuracy at nonstandard source-to-surface distances (SSDs). In our study, three-dimensional (3D) dose calculations were performed for SSDs ranging from 50 to 200 cm and for field sizes from 1 x 1 to 30 x 30 cm2, as well as a 10 x 10 cm2 field 5 cm off axis in each direction. The 3D dose distributions, using either the full PSD or the source model as input, were compared in terms of dose difference and distance to agreement. With this model, over 99% of the voxels agreed within +/-1% or 1 mm for the target, within 2% or 2 mm for the primary collimator, and within +/-2.5% or 2 mm for the flattening filter in all cases studied. For the dose distributions, 99% of the dose voxels agreed within 1% or 1 mm when the combined source model (including a charged particle source) and the full PSD were used as input. The accurate and general characterization of each photon source and the knowledge of the subsource dose distributions should facilitate source model commissioning by allowing the histogram distributions representing the subsources to be scaled and tuned.
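The dose-difference or distance-to-agreement (DTA) acceptance test used in the comparison can be sketched for a 1D dose profile. The tolerances and toy profiles below are illustrative assumptions, not the study's data:

```python
# Sketch: a voxel passes if its dose difference is within `dd` (fraction of
# the maximum dose), OR a reference point with a matching dose lies within
# `dta` millimetres. This is the composite criterion, not a full gamma index.

def voxel_passes(eval_dose, ref_dose, positions, i, dd=0.01, dta=1.0):
    dmax = max(ref_dose)
    if abs(eval_dose[i] - ref_dose[i]) <= dd * dmax:
        return True  # dose-difference criterion satisfied
    # distance-to-agreement: look for a nearby reference point with same dose
    for j, d in enumerate(ref_dose):
        if abs(d - eval_dose[i]) <= dd * dmax and abs(positions[j] - positions[i]) <= dta:
            return True
    return False

positions = [0.0, 0.5, 1.0, 1.5, 2.0]          # mm
ref_dose  = [1.00, 0.90, 0.50, 0.10, 0.02]     # steep penumbra (normalized)
eval_dose = [1.00, 0.50, 0.90, 0.10, 0.02]     # same profile shifted ~0.5 mm

pass_rate = sum(voxel_passes(eval_dose, ref_dose, positions, i)
                for i in range(len(positions))) / len(positions)
print(pass_rate)  # 1.0: the shifted penumbra passes via the DTA criterion
```

The composite test is forgiving in steep-gradient regions (where small spatial shifts cause large dose differences) while remaining strict in flat regions, which is why it is the standard way to compare such distributions.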

Relevance: 100.00%

Abstract:

Correct predictions of future blood glucose levels in individuals with Type 1 Diabetes (T1D) can be used to provide early warning of upcoming hypo-/hyperglycemic events and thus to improve the patient's safety. To increase prediction accuracy and efficiency, various approaches have been proposed which combine multiple predictors to produce superior results compared with single predictors. Three methods for model fusion are presented and comparatively assessed. Data from 23 T1D subjects under sensor-augmented pump (SAP) therapy were used in two adaptive data-driven models (an autoregressive model with output correction, cARX, and a recurrent neural network, RNN). Data fusion techniques based on (i) Dempster-Shafer Evidential Theory (DST), (ii) Genetic Algorithms (GA), and (iii) Genetic Programming (GP) were used to merge the complementary performances of the prediction models. The fused output is used in a warning algorithm to issue alarms of upcoming hypo-/hyperglycemic events. The fusion schemes showed improved performance, with lower root mean square errors, lower time lags, and higher correlation. In the warning algorithm, a median of 0.25% daily false alarms (DFA) and 100% correct alarms (CA) were obtained for both event types. The detection times (DT) before occurrence of events were 13.0 and 12.1 min for hypo- and hyperglycemic events, respectively. Compared with the cARX and RNN models, and with a linear fusion of the two, the proposed fusion schemes represent a significant improvement.
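As a crude stand-in for the GA/GP fusion search, a minimal sketch can grid-search a convex weight that combines two glucose predictors so that validation RMSE is minimized. The toy prediction series below are hypothetical:

```python
# Sketch: fuse two predictors as y = w*p1 + (1-w)*p2, choosing w to minimize
# RMSE on held-out data. A grid search stands in for the paper's
# genetic-algorithm / genetic-programming fusion.

def rmse(pred, truth):
    return (sum((p - t) ** 2 for p, t in zip(pred, truth)) / len(truth)) ** 0.5

def best_weight(p1, p2, truth, steps=100):
    candidates = [i / steps for i in range(steps + 1)]
    return min(candidates,
               key=lambda w: rmse([w * a + (1 - w) * b for a, b in zip(p1, p2)],
                                  truth))

truth = [100, 120, 150, 180, 160]   # mg/dL, hypothetical CGM references
p1    = [102, 118, 152, 179, 161]   # accurate predictor (e.g. cARX-like)
p2    = [120, 140, 130, 200, 140]   # noisier predictor (e.g. RNN-like)
w = best_weight(p1, p2, truth)
print(w)  # near 1: the fusion leans heavily on the better predictor
```

Real fusion schemes (DST, GA, GP) adapt these weights over time and per glycemic region; the fixed convex combination here only shows the objective being optimized.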

Relevance: 100.00%

Abstract:

BACKGROUND In an effort to reduce firearm mortality rates in the USA, US states have enacted a range of firearm laws to either strengthen or deregulate the existing main federal gun control law, the Brady Law. We set out to determine the independent association of different firearm laws with overall firearm mortality, homicide firearm mortality, and suicide firearm mortality across all US states. We also projected the potential reduction of firearm mortality if the three most strongly associated firearm laws were enacted at the federal level. METHODS We constructed a cross-sectional, state-level dataset from Nov 1, 2014, to May 15, 2015, using counts of firearm-related deaths in each US state for the years 2008-10 (stratified by intent [homicide and suicide]) from the US Centers for Disease Control and Prevention's Web-based Injury Statistics Query and Reporting System, data about 25 firearm state laws implemented in 2009, and state-specific characteristics such as firearm ownership for 2013, firearm export rates, and non-firearm homicide rates for 2009, and unemployment rates for 2010. Our primary outcome measure was overall firearm-related mortality per 100 000 people in the USA in 2010. We used Poisson regression with robust variances to derive incidence rate ratios (IRRs) and 95% CIs. FINDINGS 31 672 firearm-related deaths occurred in 2010 in the USA (10·1 per 100 000 people; mean state-specific count 631·5 [SD 629·1]). Of 25 firearm laws, nine were associated with reduced firearm mortality, nine were associated with increased firearm mortality, and seven had an inconclusive association. After adjustment for relevant covariates, the three state laws most strongly associated with reduced overall firearm mortality were universal background checks for firearm purchase (multivariable IRR 0·39 [95% CI 0·23-0·67]; p=0·001), ammunition background checks (0·18 [0·09-0·36]; p<0·0001), and identification requirement for firearms (0·16 [0·09-0·29]; p<0·0001). 
Projected federal-level implementation of universal background checks for firearm purchase could reduce national firearm mortality from 10·35 to 4·46 deaths per 100 000 people, background checks for ammunition purchase could reduce it to 1·99 per 100 000, and firearm identification to 1·81 per 100 000. INTERPRETATION Very few of the existing state-specific firearm laws are associated with reduced firearm mortality, and this evidence underscores the importance of focusing on relevant and effective firearms legislation. Implementation of universal background checks for the purchase of firearms or ammunition, and firearm identification nationally could substantially reduce firearm mortality in the USA. FUNDING None.
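The rate and incidence-rate-ratio arithmetic behind such projections can be sketched crudely. The paper's projections come from a multivariable Poisson model, so this unadjusted illustration will not reproduce its exact figures, and the population figure below is an assumption:

```python
# Sketch: crude firearm mortality rate per 100 000 people, and a naive
# projection under an incidence rate ratio (IRR). Unlike the paper's adjusted
# model, this simply multiplies the baseline rate by the IRR.

def rate_per_100k(deaths, population):
    return deaths / population * 100_000

us_pop_2010 = 309_300_000            # assumed approximate 2010 US population
crude = rate_per_100k(31_672, us_pop_2010)
print(round(crude, 1))               # close to the reported 10.1 per 100 000

# Naively applying the universal-background-check IRR of 0.39 to the 10.35
# baseline; the paper's covariate-adjusted projection (4.46) differs.
print(round(10.35 * 0.39, 2))
```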

Relevance: 100.00%

Abstract:

We use long instrumental temperature series together with available field reconstructions of sea-level pressure (SLP) and three-dimensional climate model simulations to analyze relations between temperature anomalies and atmospheric circulation patterns over much of Europe and the Mediterranean for the late winter/early spring (January–April, JFMA) season. A Canonical Correlation Analysis (CCA) investigates interannual to interdecadal covariability between a new gridded SLP field reconstruction and seven long instrumental temperature series covering the past 250 years. We then present and discuss prominent atmospheric circulation patterns related to anomalously warm and cold JFMA conditions within different European areas spanning the period 1760–2007. Next, using a data assimilation technique, we link gridded SLP data with a climate model (EC-Bilt-Clio) for a better dynamical understanding of the relationship between large-scale circulation and European climate. We thus present an alternative approach to reconstructing climate for the pre-instrumental period based on the assimilated model simulations. Furthermore, we present an independent method to extend the dynamic circulation analysis for anomalously cold European JFMA conditions back to the sixteenth century. To this end, we use documentary records that are spatially representative of the long instrumental records and derive, through modern analogs, large-scale SLP, surface temperature and precipitation fields. The skill of the analog method is tested in the virtual world of two three-dimensional climate simulations (ECHO-G and HadCM3). This endeavor offers new possibilities both to constrain climate models into a reconstruction mode (through the assimilation approach) and to better assess documentary data in a quantitative way.
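The modern-analog step can be sketched as a nearest-neighbour search: for a pre-instrumental year, find the instrumental-period year whose station pattern is closest, then borrow that year's large-scale fields. The toy station anomalies and years below are hypothetical:

```python
# Sketch: choose the modern analog year minimizing Euclidean distance between
# a target documentary/station anomaly vector and an instrumental library.

def best_analog(target, library):
    """library: {year: vector of station anomalies}; returns the closest year."""
    def dist(vec):
        return sum((a - b) ** 2 for a, b in zip(target, vec)) ** 0.5
    return min(library, key=lambda year: dist(library[year]))

# Hypothetical JFMA temperature anomalies at three stations (degrees C)
library = {1830: [-1.8, -2.1, -0.9],
           1887: [0.4, 0.2, 0.7],
           1963: [-2.0, -1.9, -1.1]}
cold_1573 = [-2.1, -1.8, -1.0]   # documentary-derived anomaly pattern (toy)
print(best_analog(cold_1573, library))  # 1963
```

In the actual method the selected analog year then supplies full SLP, temperature and precipitation fields, which is what makes the sparse documentary evidence usable for large-scale circulation analysis.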

Relevance: 100.00%

Abstract:

Automatic scan planning for magnetic resonance imaging of the knee aims at defining an oriented bounding box around the knee joint from sparse scout images in order to choose the optimal field of view for the diagnostic images and limit acquisition time. We propose a fast and fully automatic method to perform this task based on the standard clinical scout imaging protocol. The method is based on sequential Chamfer matching of 2D scout feature images with a three-dimensional mean model of femur and tibia. Subsequently, the joint plane separating femur and tibia, which contains both menisci, can be automatically detected using an information-augmented active shape model on the diagnostic images. This can assist the clinicians in quickly defining slices with standardized and reproducible orientation, thus increasing diagnostic accuracy and also comparability of serial examinations. The method has been evaluated on 42 knee MR images. It has the potential to be incorporated into existing systems because it does not change the current acquisition protocol.
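Chamfer matching scores a template against an edge image via a distance transform. A brute-force 2D sketch (with a toy edge map and template, not the paper's scout features or 3D mean bone model) illustrates the idea:

```python
# Sketch: brute-force chamfer matching. The score of a template placement is
# the mean distance from each template point to the nearest edge pixel, read
# from a precomputed distance map; the best placement minimizes that score.

def distance_map(edges, w, h):
    """edges: set of (x, y) edge pixels; naive O(w*h*|edges|) transform."""
    return {(x, y): min(((x - ex) ** 2 + (y - ey) ** 2) ** 0.5
                        for ex, ey in edges)
            for x in range(w) for y in range(h)}

def chamfer_score(template, dmap, dx, dy):
    pts = [(x + dx, y + dy) for x, y in template]
    return sum(dmap.get(p, 1e9) for p in pts) / len(pts)

edges = {(3, 3), (4, 3), (5, 3), (5, 4), (5, 5)}     # an "L"-shaped contour
template = [(0, 0), (1, 0), (2, 0), (2, 1), (2, 2)]  # same shape at the origin
dmap = distance_map(edges, 8, 8)

scores = {(dx, dy): chamfer_score(template, dmap, dx, dy)
          for dx in range(6) for dy in range(6)}
best = min(scores, key=scores.get)
print(best, scores[best])  # perfect alignment at offset (3, 3), score 0.0
```

Production systems compute the distance transform in linear time and search over rotation and scale as well; only the translation search is shown here.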

Relevance: 100.00%

Abstract:

Consecrated in 1297 as the church of St. Catherine's monastery, founded four years earlier, the Gothic Church of St. Catherine was largely destroyed in a devastating bombing raid on January 2nd, 1945. To counteract the process of disintegration, the department of geo-information and the lower monument protection authority of the City of Nuremberg decided to commission a three-dimensional building model of the Church of St. Catherine. A heterogeneous set of data was used for the preparation of a parametric architectural model. Indeed, the modeling of historic buildings can profit from the so-called BIM method (Building Information Modeling), as the necessary structuring of the basic data turns it into very sustainable information. The resulting model is perfectly suited to give present-day observers a vivid impression of the interior and exterior of this former mendicant order's church.

Relevance: 100.00%

Abstract:

A first result of the search for ν_μ → ν_e oscillations in the OPERA experiment, located at the Gran Sasso Underground Laboratory, is presented. The experiment looked for the appearance of ν_e in the CNGS neutrino beam using the data collected in 2008 and 2009. The data are compatible with the non-oscillation hypothesis in the three-flavour mixing model. A further analysis of the same data constrains the non-standard oscillation parameters θ_new and Δm²_new suggested by the LSND and MiniBooNE experiments. For large Δm²_new values (>0.1 eV²), the OPERA 90% C.L. upper limit on sin²(2θ_new) based on a Bayesian statistical method reaches the value 7.2 × 10⁻³.
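The Bayesian upper-limit construction can be illustrated generically for a Poisson counting experiment with a flat prior on the signal strength; this is a textbook sketch, not OPERA's actual likelihood or limit:

```python
import math

# Sketch: 90% credible upper limit on a Poisson signal mean s, given expected
# background b and n observed events, using a flat prior on s >= 0.
# The limit is the s at which the posterior CDF reaches the credibility level.

def upper_limit(n, b, cl=0.90, s_max=50.0, steps=200_000):
    ds = s_max / steps
    # unnormalized posterior: Poisson likelihood for mean b + s, flat prior
    post = [math.exp(-(b + i * ds)) * (b + i * ds) ** n for i in range(steps)]
    total = sum(post)
    acc = 0.0
    for i, p in enumerate(post):
        acc += p
        if acc >= cl * total:
            return i * ds

# With zero observed events and zero background, the posterior is e^{-s},
# so the 90% upper limit is -ln(0.1) = ln(10), about 2.30.
print(round(upper_limit(0, 0.0), 2))
```

Extracting a limit on sin²(2θ_new) would additionally require the oscillation probability and the experiment's exposure; only the statistical step is sketched here.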

Relevance: 100.00%

Abstract:

In the southern part of Korup National Park, Cameroon, the mast fruiting tree Microberlinia bisulcata occurs as a codominant in groves of ectomycorrhizal Caesalpiniaceae within a mosaic of otherwise species-rich lowland rain forest. To estimate the amount of carbon and nutrients invested in reproduction during a mast fruiting event, and the consequential seed and seedling survival, three related field studies were made in 1995. These provided a complete seed and seedling budget for the cohort. Seed production was estimated by counting woody pods on the forest floor. Trees produced on average 26,000 (range 0-92,000) seeds/tree, with a dry mass of 16.6 kg/tree. Seeds were contained in woody pods of mass 307 kg/tree. Dry mass production of pods and seeds was 1034 kg ha(-1), equivalent to over half (55%) of annual leaf litterfall for this species, and contained 13% of the nitrogen and 21% of the phosphorus in annual leaf litterfall. Seed and young-seedling mortality was investigated with open quadrats and cages to exclude vertebrate predators, at two distances from the parent tree. The proportion of seeds on the forest floor which disappeared in the first 6 wk after dispersal was 84%, of which 26.5% was due to likely vertebrate removal, 36% to rotting, and 21.5% to other causes. Vertebrate predation was greater close to the stem than 5 m beyond the crown (41 vs 12% of seeds disappearing) where the seed shadow was less dense. Previous studies have demonstrated an association between mast years at Korup and high dry-season radiation before flowering, and have shown lower leaf-litterfall phosphorus concentrations following mast fruiting. The emerging hypothesis is that mast fruiting is primarily imposed by energy limitation for fruit production, but phosphorus supply and vertebrate predation are regulating factors. Recording the survival of naturally-regenerating M. bisulcata seedlings (6-wk stage) showed that 21% of seedlings survived to 31 mo. 
A simple three-stage recruitment model was constructed. Mortality rates were initially high and peaked again in each of the next two dry seasons, with smaller peaks in the two intervening wet seasons, the latter coinciding with annual troughs in radiation. The very poor recruitment of M. bisulcata trees in Korup, demonstrated in previous investigations, appears to be due not to a limitation in seed or young-seedling supply, but rather to factors operating at the established-seedling stage.
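The seed and seedling budget above can be checked with simple arithmetic; the assumption below that all seeds surviving the first six weeks become seedlings is an illustrative simplification, not the study's full budget:

```python
# Sketch: cohort budget for a tree producing 26,000 seeds. The 6-week loss
# fractions are the quoted ones; treating every 6-week survivor as a seedling
# is a simplifying assumption for illustration.

seeds = 26_000
losses_6wk = {"vertebrate removal": 0.265, "rotting": 0.36, "other": 0.215}
assert abs(sum(losses_6wk.values()) - 0.84) < 1e-9   # the quoted 84% total

survivors_6wk = seeds * (1 - sum(losses_6wk.values()))
established = survivors_6wk * 0.21    # 21% of seedlings survive to 31 months
print(round(survivors_6wk), round(established))  # 4160 874
```

Under this simplification, fewer than 900 established seedlings remain per average tree from 26,000 seeds, which makes the conclusion that recruitment is limited after, not before, the established-seedling stage concrete.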


Relevance: 100.00%

Abstract:

Background Complete-pelvis segmentation in antero-posterior pelvic radiographs is required to create a patient-specific three-dimensional pelvis model for surgical planning and postoperative assessment in image-free navigation of total hip arthroplasty (THA). Methods A fast and robust framework for accurately segmenting the complete pelvis is presented, consisting of two consecutive modules. In the first module, a three-stage method was developed to delineate the left hemipelvis based on statistical appearance and shape models. To handle complex pelvic structures, anatomy-specific information processing techniques were employed. As the input to the second module, the delineated left hemipelvis was then reflected about an estimated symmetry line of the radiograph to initialize the right-hemipelvis segmentation. The right hemipelvis was segmented by the same three-stage method. Results Two experiments, conducted on 143 and 40 AP radiographs respectively, demonstrated a mean segmentation accuracy of 1.61±0.68 mm. A clinical study investigating the postoperative assessment of acetabular cup orientation based on the proposed framework revealed an average accuracy of 1.2°±0.9° and 1.6°±1.4° for anteversion and inclination, respectively. Delineation of each radiograph takes less than one minute. Conclusions Although further validation is needed, these preliminary results suggest the clinical applicability of the proposed framework for image-free THA.
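The reflection step that initializes the right hemipelvis from the delineated left one can be sketched for 2D contour points; the vertical symmetry line and toy landmark coordinates below are assumptions:

```python
# Sketch: mirror left-hemipelvis landmarks about an estimated vertical
# symmetry line x = c to initialize the right-hemipelvis segmentation.

def reflect(points, c):
    """Reflect (x, y) points about the vertical line x = c."""
    return [(2 * c - x, y) for x, y in points]

left_landmarks = [(120.0, 80.0), (135.5, 150.0), (110.0, 210.0)]  # toy pixels
symmetry_x = 256.0          # e.g. estimated mid-line of a 512-pixel-wide image
right_init = reflect(left_landmarks, symmetry_x)
print(right_init)  # [(392.0, 80.0), (376.5, 150.0), (402.0, 210.0)]
```

In the actual pipeline the symmetry line is estimated from the radiograph rather than assumed, and the mirrored contour only seeds the same three-stage statistical-model segmentation, which then refines it.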