631 results for algorithmic skeletons
Abstract:
Sequential studies of osteopenic bone disease in small animals require the availability of non-invasive, accurate and precise methods to assess bone mineral content (BMC) and bone mineral density (BMD). Dual-energy X-ray absorptiometry (DXA), which is currently used in humans for this purpose, can also be applied to small animals by means of adapted software. Precision and accuracy of DXA were evaluated in 10 rats weighing 50-265 g. The rats were anesthetized with a mixture of ketamine-xylazine administered intraperitoneally. Each rat was scanned six times consecutively in the antero-posterior incidence after repositioning, using the rat whole-body software for determination of whole-body BMC and BMD (Hologic QDR 1000, software version 5.52). Scan duration was 10-20 min depending on rat size. After the last measurement, rats were sacrificed and soft tissues were removed by dermestid beetles. Skeletons were then scanned in vitro (ultra-high-resolution software, version 4.47). Bones were subsequently ashed and dissolved in hydrochloric acid, and total body calcium was directly assayed by atomic absorption spectrophotometry (TBCa[chem]). Total body calcium was also calculated from the DXA whole-body in vivo measurement (TBCa[DXA]) and from the ultra-high-resolution measurement (TBCa[UH]) under the assumption that calcium accounts for 40.5% of the BMC expressed as hydroxyapatite. Precision error for whole-body BMC and BMD (mean +/- S.D.) was 1.3% and 1.5%, respectively. Simple regression analysis between TBCa[DXA] or TBCa[UH] and TBCa[chem] revealed tight correlations (r = 0.991 and 0.996, respectively), with slopes and intercepts significantly different from 1 and 0, respectively. (ABSTRACT TRUNCATED AT 250 WORDS)
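The precision figures above are coefficients of variation over repeated scans, and the TBCa comparisons are simple linear regressions. A minimal sketch of both computations (the scan values below are hypothetical illustrations, not the paper's data):

```python
from statistics import mean, stdev

def precision_cv(measurements):
    """Precision error as the coefficient of variation (%) of repeated scans."""
    return 100.0 * stdev(measurements) / mean(measurements)

def linfit(x, y):
    """Ordinary least-squares slope and intercept, e.g. TBCa[DXA] vs TBCa[chem]."""
    mx, my = mean(x), mean(y)
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    slope = sxy / sxx
    return slope, my - slope * mx

# Six hypothetical repeated whole-body BMC scans (g) of one rat:
scans = [2.51, 2.48, 2.53, 2.50, 2.46, 2.52]
cv = precision_cv(scans)  # ~1.04%, the same order as the reported 1.3%
```

A slope near 1 and intercept near 0 from `linfit` would indicate unbiased agreement with the chemical assay; the paper reports slopes and intercepts significantly different from 1 and 0, i.e. a systematic bias despite tight correlation.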
Abstract:
Mixed Reality (MR) aims to link virtual entities with the real world and has many applications in domains such as the military and medicine [JBL+00, NFB07]. In many MR systems, and more precisely in augmented scenes, the application needs to render the virtual part accurately at the right time. To achieve this, such systems acquire data related to the real world from a set of sensors before rendering virtual entities. A suitable system architecture should minimize delays to keep the overall system delay (also called end-to-end latency) within the requirements for real-time performance. In this context, we propose a compositional modeling framework for MR software architectures in order to specify, simulate and formally validate the time constraints of such systems. Our approach is first based on a functional decomposition of such systems into generic components. The obtained elements, as well as their typical interactions, give rise to generic representations in terms of timed automata. A whole system is then obtained as a composition of such defined components. To write specifications, a textual language named MIRELA (MIxed REality LAnguage) is proposed along with the corresponding compilation tools. The generated output contains timed automata in UPPAAL format for simulation and verification of time constraints. These automata may also be used to generate source-code skeletons for an implementation on an MR platform. The approach is illustrated first on a small example. A realistic case study is also developed; it is modeled by several timed automata synchronizing through channels and including a large number of time constraints. Both systems have been simulated in UPPAAL and checked against the required behavioral properties.
Abstract:
Geometrical dependencies are investigated for an analytical representation of the probability density function (pdf) of the travel time between a random point and a known or another random point in the Tchebyshev metric. In the most common case, a rectangular service area, the pdf of this random variable depends directly on the position of the server. Two approaches are introduced for the exact analytical calculation of the pdf: an ad-hoc approach, useful for solving a specific case "by hand"; and superposition, an algorithmic approach for the general case. The main concept of each approach is explained, and a short comparison is given to demonstrate their correctness.
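As an illustration of the rectangular-service-area case, any analytical pdf/CDF can be cross-checked by Monte Carlo simulation. The sketch below (not the paper's ad-hoc or superposition method) estimates the travel-time CDF in the Tchebyshev metric for a server fixed at the centre of a unit square, where F(d) = (2d)² holds exactly for d ≤ 0.5:

```python
import random

def chebyshev(p, q):
    """Tchebyshev (L-infinity) distance between two points."""
    return max(abs(p[0] - q[0]), abs(p[1] - q[1]))

def empirical_cdf(server, d, n=100_000, seed=0):
    """Monte Carlo estimate of P(travel time <= d) for a uniform random
    demand point in the unit square."""
    rng = random.Random(seed)
    hits = sum(chebyshev(server, (rng.random(), rng.random())) <= d
               for _ in range(n))
    return hits / n

# Server at the centre: {max(|x-0.5|, |y-0.5|) <= d} is a square fully inside
# the service area for d <= 0.5, so F(d) = (2d)**2, e.g. F(0.25) = 0.25.
estimate = empirical_cdf((0.5, 0.5), 0.25)
```

Moving the server off-centre clips the growing square against the service-area boundary, which is exactly why the pdf depends on the server's position.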
Abstract:
The charge transport properties of a catechol-type dithiol-terminated oligo-phenylene-ethynylene were investigated by cyclic voltammetry (CV) and by the scanning tunnelling microscopy break junction technique (STM-BJ). Single-molecule charge transport experiments demonstrated the existence of high and low conductance regions. The junction conductance is rather weakly dependent on the redox state of the bridging molecule. However, a distinct dependence of the junction formation probability and of the relative stretching distances of the catechol- and quinone-type molecular junctions is observed. Substitution of the central catechol ring with alkoxy-moieties and the combination with a topological analysis of possible π-electron pathways through the respective molecular skeletons lead to a working hypothesis, which could rationalize the experimentally observed conductance characteristics of the redox-active nanojunctions.
Abstract:
We describe a system for performing SLA-driven management and orchestration of distributed infrastructures composed of services supporting mobile computing use cases. In particular, we focus on a Follow-Me Cloud scenario in which we consider mobile users accessing cloud-enabled services. We combine an SLA-driven approach to infrastructure optimization with forecast-based performance-degradation preventive actions and pattern detection for supporting mobile cloud infrastructure management. We present our system's information model and architecture, including the algorithmic support and the proposed scenarios for system evaluation.
Abstract:
"Medicine: Perspectives in History and Art" (Robert E. Greenspan) Eight Practical Lessons from Osler That Will Better Your Life (Bryan Boutwell) History of the American Mental Hospital: From networking to not working & Back (Ed Fann) Ambiguities and Amputations: Methods, mishaps, and the surgical quest to cure breast cancer (Student Essay Contest Winner) (Matt Luedke) An Automated, Algorithmic, Retrospective Analysis of the Growing Influence of Statistics in Medicine (Student Essay Contest Winner) (Ryan Rochat) What’s Special about William Osler? (Charles S. Bryan) The Virtuous Physician: Lessons from Medical Biography (Charles S. Bryan) Legacy: 50 Years of Loving Care – The History of Texas Children’s Hospital, 1954-2004 (Betsy Parish) The Education of a University President: Edgar Odell Lovett of Rice University (John B. Boles) Artists and Illness: The Effect of Illness on an Artist’s Work (David Bybee)
Abstract:
Dynamic systems, especially in real-life applications, are often determined by inter-/intra-variability, uncertainties and time-varying components. Physiological systems are probably the most representative example in which population variability, vital signal measurement noise and uncertain dynamics render their explicit representation and optimization a rather difficult task. Systems characterized by such challenges often require the use of adaptive algorithmic solutions able to perform an iterative structural and/or parametrical update process towards optimized behavior. Adaptive optimization presents the advantages of (i) individualization through learning of basic system characteristics, (ii) ability to follow time-varying dynamics and (iii) low computational cost. In this chapter, the use of online adaptive algorithms is investigated in two basic research areas related to diabetes management: (i) real-time glucose regulation and (ii) real-time prediction of hypo-/hyperglycemia. The applicability of these methods is illustrated through the design and development of an adaptive glucose control algorithm based on reinforcement learning and optimal control and an adaptive, personalized early-warning system for the recognition and alarm generation against hypo- and hyperglycemic events.
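As a hedged illustration of such online adaptive algorithms, the sketch below implements a generic normalized-LMS one-step-ahead predictor with a threshold alarm. It is a stand-in for, not a reproduction of, the reinforcement-learning controller and early-warning system developed in the chapter; the thresholds and step size are illustrative assumptions:

```python
def nlms_predict(series, order=3, mu=0.5, eps=1e-8):
    """One-step-ahead normalized-LMS adaptive predictor: the weight vector
    is updated iteratively from the prediction error (low computational cost,
    tracks time-varying dynamics)."""
    w = [0.0] * order
    preds = []
    for t in range(order, len(series)):
        x = series[t - order:t]                       # recent glucose samples
        y_hat = sum(wi * xi for wi, xi in zip(w, x))  # prediction
        preds.append(y_hat)
        e = series[t] - y_hat                         # prediction error
        norm = sum(xi * xi for xi in x) + eps
        w = [wi + mu * e * xi / norm for wi, xi in zip(w, x)]
    return preds, w

def glucose_alarm(pred_mg_dl, low=70.0, high=180.0):
    """Hypothetical early-warning thresholds (mg/dL) for hypo-/hyperglycemia."""
    if pred_mg_dl < low:
        return "hypoglycemia"
    if pred_mg_dl > high:
        return "hyperglycemia"
    return "ok"
```

Feeding the predictor's output into the alarm function mirrors the chapter's split between real-time prediction and alarm generation.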
Abstract:
The International Surface Temperature Initiative (ISTI) is striving towards substantively improving our ability to robustly understand historical land surface air temperature change at all scales. A key recently completed first step has been collating all available records into a comprehensive open access, traceable and version-controlled databank. The crucial next step is to maximise the value of the collated data through a robust international framework of benchmarking and assessment for product intercomparison and uncertainty estimation. We focus on uncertainties arising from the presence of inhomogeneities in monthly mean land surface temperature data and the varied methodological choices made by various groups in building homogeneous temperature products. The central facet of the benchmarking process is the creation of global-scale synthetic analogues to the real-world database where both the "true" series and inhomogeneities are known (a luxury the real-world data do not afford us). Hence, algorithmic strengths and weaknesses can be meaningfully quantified and conditional inferences made about the real-world climate system. Here we discuss the necessary framework for developing an international homogenisation benchmarking system on the global scale for monthly mean temperatures. The value of this framework is critically dependent upon the number of groups taking part and so we strongly advocate involvement in the benchmarking exercise from as many data analyst groups as possible to make the best use of this substantial effort.
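The benchmarking idea can be sketched in miniature: generate a synthetic series with a known ("true") step inhomogeneity, run a detection algorithm on it, and score the result against the truth. The series parameters and the naive detector below are illustrative assumptions, not the ISTI benchmark design:

```python
import random

def synthetic_series(n=240, break_at=120, shift=-1.0, sigma=0.5, seed=1):
    """Monthly temperature anomalies with one known step inhomogeneity
    (the 'truth' a real-world record never affords us)."""
    rng = random.Random(seed)
    return [rng.gauss(0.0, sigma) + (shift if t >= break_at else 0.0)
            for t in range(n)]

def detect_break(series, window=24):
    """Naive changepoint detector: the position with the largest jump
    between the means of adjacent before/after windows."""
    best_t, best_jump = None, 0.0
    for t in range(window, len(series) - window):
        left = sum(series[t - window:t]) / window
        right = sum(series[t:t + window]) / window
        if abs(right - left) > best_jump:
            best_t, best_jump = t, abs(right - left)
    return best_t, best_jump

# Score the detector against the known truth (break at t=120, size 1.0):
t_hat, size_hat = detect_break(synthetic_series())
```

Because the break location and size are known by construction, detection error and size bias can be quantified exactly, which is what makes conditional inferences about algorithm skill on the real-world data possible.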
Abstract:
The excavation site Reigoldswil is located at 550 m above sea level on the Jura chain hillside in north-western Switzerland. The mountains divide the Rhine valley from an agriculturally rich region. The origin of the village lies in the early medieval period. Until now, the skeletons of one cemetery have been studied morphologically. Around 216 individuals were excavated from under the foundation walls of a church and in the open field. They date from the 7th/8th up to the 10th century. The striking part is the high proportion of subadult (0-18 years) individuals, 58% (n=126). One of these children, an approximately 1.5-year-old toddler from the 7th century, was buried in a stone cist. Its bones show morphological traces such as porotic lesions of the greater wings of the sphenoid, the squama, the mandible and the scapula, as well as new bone formation on both femora and tibiae. These signs could be an indicator of Möller-Barlow disease (Ortner 2003, Brickley and Ives 2008, Stark in press). As scurvy is associated with an insufficient intake of vitamin C, malnutrition must be assumed. A reason might be the geographic location and/or a harsh climate with crop failures and famine that the first settlers had to face. Besides the morphological diagnosis, amino acids of the bone collagen have been analyzed (Kramis et al.). Further examinations, such as radiocarbon dating and stable isotope ratios (C, N, O, S) to specify nutrition, are planned.
Abstract:
The nematode Caenorhabditis elegans is a well-known model organism used to investigate fundamental questions in biology. Motility assays of this small roundworm are designed to study the relationships between genes and behavior. Commonly, motility analysis is used to classify nematode movements and characterize them quantitatively. Over the past years, C. elegans' motility has been studied across a wide range of environments, including crawling on substrates, swimming in fluids, and locomoting through microfluidic substrates. However, each environment often requires customized image processing tools relying on heuristic parameter tuning. In the present study, we propose a novel Multi-Environment Model Estimation (MEME) framework for automated image segmentation that is versatile across various environments. The MEME platform is constructed around the concept of Mixture of Gaussian (MOG) models, where statistical models for both the background environment and the nematode appearance are explicitly learned and used to accurately segment a target nematode. Our method is designed to simplify the burden often imposed on users; here, only a single image which includes a nematode in its environment must be provided for model learning. In addition, our platform enables the extraction of nematode ‘skeletons’ for straightforward motility quantification. We test our algorithm on various locomotive environments and compare performances with an intensity-based thresholding method. Overall, MEME outperforms the threshold-based approach for the overwhelming majority of cases examined. Ultimately, MEME provides researchers with an attractive platform for C. elegans' segmentation and ‘skeletonizing’ across a wide range of motility assays.
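The core of the MOG idea, classifying each pixel by whichever learned Gaussian (background or nematode) explains its intensity better, can be sketched as follows. The intensity means and standard deviations here are hypothetical placeholders for the models MEME learns from a single user-provided image:

```python
import math

def gauss_pdf(x, mu, sigma):
    """Likelihood of intensity x under a 1-D Gaussian N(mu, sigma**2)."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def segment(image, bg=(200.0, 10.0), worm=(60.0, 15.0)):
    """Label each pixel 1 (nematode) or 0 (background) by comparing the
    likelihoods under the two learned intensity models."""
    return [[1 if gauss_pdf(p, *worm) > gauss_pdf(p, *bg) else 0 for p in row]
            for row in image]

# Toy 1x3 grayscale image: bright background, one dark worm pixel.
labels = segment([[210.0, 55.0, 205.0]])  # -> [[0, 1, 0]]
```

In the full framework each class is a mixture of several Gaussians (and the background model varies per environment), which is what removes the need for per-environment heuristic threshold tuning; the resulting binary mask is what gets thinned into the nematode 'skeleton' for motility quantification.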
Abstract:
Oscillations between high and low values of the membrane potential (UP and DOWN states, respectively) are a ubiquitous feature of cortical neurons during slow wave sleep and anesthesia. Nevertheless, only a surprisingly small number of quantitative studies have dealt with this phenomenon's implications for computation. Here we present a novel theory that explains on a detailed mathematical level the computational benefits of UP states. The theory is based on random sampling by means of interspike intervals (ISIs) of the exponential integrate-and-fire (EIF) model neuron, such that each spike is considered a sample whose analog value corresponds to the spike's preceding ISI. As we show, the EIF's exponential sodium current, which kicks in when balancing a noisy membrane potential around values close to the firing threshold, leads to a particularly simple, approximative relationship between the neuron's ISI distribution and input current. Approximation quality depends on the frequency spectrum of the current and improves as the voltage baseline is raised towards threshold. Thus, the conceptually simpler leaky integrate-and-fire neuron, which lacks such an additional current boost, performs consistently worse than the EIF and does not improve when the voltage baseline is increased. For the EIF, in contrast, the presented mechanism is particularly effective in the high-conductance regime, which is a hallmark feature of UP states. Our theoretical results are confirmed by accompanying simulations, which were conducted for input currents of varying spectral composition. Moreover, we provide analytical estimations of the range of ISI distributions the EIF neuron can sample from at a given approximation level. Such samples may be considered by any algorithmic procedure that is based on random sampling, such as Markov chain Monte Carlo or message-passing methods.
Finally, we explain how spike-based random sampling relates to existing computational theories about UP states during slow wave sleep and present possible extensions of the model in the context of spike-frequency adaptation.
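ISI samples of the kind the theory relies on can be generated with a minimal noisy-EIF simulation (Euler-Maruyama integration; all parameter values below are illustrative assumptions, not those used in the paper):

```python
import math
import random

def eif_isis(I=1.8, T=2000.0, dt=0.01, gL=0.1, EL=-65.0, VT=-50.0,
             dT=2.0, Vr=-65.0, Vpeak=-30.0, sigma=0.5, seed=0):
    """Simulate a noisy exponential integrate-and-fire neuron and return
    its interspike intervals (ms). dV = (-gL*(V-EL) + gL*dT*exp((V-VT)/dT)
    + I) dt + sigma dW; a spike is registered when V crosses Vpeak, after
    which V is reset to Vr."""
    rng = random.Random(seed)
    V, t, last_spike, isis = EL, 0.0, None, []
    noise_scale = sigma * math.sqrt(dt)
    while t < T:
        drift = -gL * (V - EL) + gL * dT * math.exp((V - VT) / dT) + I
        V += drift * dt + noise_scale * rng.gauss(0.0, 1.0)
        t += dt
        if V >= Vpeak:                      # spike: record ISI and reset
            if last_spike is not None:
                isis.append(t - last_spike)
            last_spike, V = t, Vr
    return isis

isi_samples = eif_isis()  # analog sample values, one per spike
```

The resulting `isi_samples` list is exactly the kind of spike-based random sample stream that a downstream procedure (e.g. Markov chain Monte Carlo) would consume.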
Abstract:
Lesions consistent with skeletal tuberculosis were found in 13 individuals from an early medieval skeletal sample from Courroux (Switzerland). One case of Pott's disease as well as lytic lesions in vertebrae and joints, rib lesions, and endocranial new bone formation were identified. Three individuals with lesions and one without were tested for the presence of MTBC aDNA, and in two cases, evidence for MTBC aDNA was detected. Our results suggest the presence of tuberculosis in the analyzed material, which is in accordance with other osteological and biomolecular research reporting a high prevalence of tuberculosis in medieval skeletons.
Abstract:
The oxygen isotopic composition and Mg/Ca ratios in the skeletons of long-lived coralline algae record ambient seawater temperature over time. Similarly, the carbon isotopic composition in the skeletons records δ¹³C values of ambient seawater dissolved inorganic carbon. Here, we measured δ¹³C in the coralline alga Clathromorphum nereostratum to test the feasibility of reconstructing the intrusion of anthropogenic CO₂ into the northern North Pacific Ocean and Bering Sea. The δ¹³C was measured in the high-Mg calcite skeleton of three C. nereostratum specimens from two islands 500 km apart in the Aleutian archipelago. In the records spanning 1887 to 2003, the average decadal rate of decline in δ¹³C values increased from 0.03‰ yr⁻¹ in the 1960s to 0.095‰ yr⁻¹ in the 1990s, which was higher than expected from the δ¹³C Suess effect alone. Deeper water in this region exhibits higher concentrations of CO₂ and low δ¹³C values. Transport of deeper water into surface water (i.e., upwelling) increases when the Aleutian Low is intensified. We hypothesized that the acceleration of the δ¹³C decline may result from increased upwelling from the 1960s to the 1990s, which in turn was driven by the increased intensity of the Aleutian Low. Detrended δ¹³C records also varied on 4-7 year and bidecadal timescales, supporting an atmospheric teleconnection of tropical climate patterns to the northern North Pacific Ocean and Bering Sea manifested as changes in upwelling.
Abstract:
Monthly δ¹⁸O records of 2 coral colonies (Porites cf. lutea and P. cf. nodifera) from different localities (Aqaba and Eilat) in the northern Gulf of Aqaba, Red Sea, were calibrated against recorded sea surface temperatures (SST) between 1988 and 2000. The results show high correlation coefficients between SST and δ¹⁸O. Seasonal variations of coral δ¹⁸O in both locations could explain 91% of the recorded SST variability. Different δ¹⁸O/SST relations were obtained from the two colonies and within the same colonies, indicating that δ¹⁸O from coral skeletons is subject to an extension-rate effect. Significant δ¹⁸O depletions are associated with high extension rates and higher values with low extension rates. The relation between coral skeletal δ¹⁸O and extension rate is not linear and can be described by a simple exponential model. An inverse relationship extends over extension rates from 1 to 5 mm/yr, while for more rapidly growing corals and portions of colonies the relation is constant and the extension rate does not appear to have a significant effect. We recommend that δ¹⁸O values be obtained from fast-growing corals or from portions in which the isotopic disequilibrium is fairly constant (extension rate >5 mm/yr). The results show that interspecific differences in corals may produce a significant δ¹⁸O profile offset between 2 colonies that is independent of environmental and extension-rate effects. We conclude that the rate of skeletal extension and the species of coral involved have an important influence on coral δ¹⁸O and must be considered when using δ¹⁸O records for paleoclimatic reconstructions.
Abstract:
Heavy and light minerals were examined in 29 samples from Sites 494, 498, 499, 500, and 495 on the Deep Sea Drilling Project Leg 67 Middle America Trench transect; these sites represent lower slope, trench, and oceanic crust environments off Guatemala. All samples are Quaternary except those from Hole 494A (Pliocene) and Hole 498A (Miocene). Heavy-mineral assemblages of the Quaternary sediments are characterized by an immature pyroxene-amphibole suite with small quantities of olivine and epidote. The Miocene sediments yielded an assemblage dominated by epidote and pyroxene but lacking olivine; the absence of olivine is attributed to selective removal of the most unstable components by intrastratal solution. Light-mineral assemblages of all samples are predominantly characterized by volcanic glass and plagioclase feldspar. The feldspar compositions are compatible with andesitic source rocks and frequently exhibit oscillatory zoning. The heavy- and light-mineral associations of these sediments suggest a proximal volcanic source, most probably the Neogene highland volcanic province of Guatemala. Sand-sized components from Site 495 are mainly biogenic skeletons and volcanic glass and, in one instance (Section 495-5-3), euhedral crystals of gypsum.