31 results for Space-time analysis
in BORIS: Bern Open Repository and Information System - Bern - Switzerland
Abstract:
In the context of expensive numerical experiments, a promising way to alleviate the computational cost is to use partially converged simulations instead of exact solutions. The gain in computational time comes at the price of precision in the response. This work addresses the issue of fitting a Gaussian process model to partially converged simulation data for further use in prediction. The main challenge is the adequate approximation of the error due to partial convergence, which is correlated in both the design-variable and time directions. Here, we propose fitting a Gaussian process in the joint space of design parameters and computational time. The model is constructed by building a nonstationary covariance kernel that accurately reflects the actual structure of the error. Practical solutions are proposed for the parameter estimation issues associated with the proposed model. The method is applied to a computational fluid dynamics test case and shows significant improvement in prediction compared to a classical kriging model.
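As a point of reference (not the paper's nonstationary kernel, which is built to match the convergence-error structure), the sketch below shows what fitting a Gaussian process in the joint (design parameter, computational time) space can look like with an off-the-shelf anisotropic stationary kernel; all data, length scales and the test design are illustrative placeholders.

```python
# Minimal sketch: Gaussian process fitted in the joint (design variable, solver time)
# space with scikit-learn; a separable anisotropic RBF kernel stands in for the
# paper's nonstationary covariance.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(0)

# Hypothetical training data: each row is (design variable x, convergence time t),
# and y is the partially converged simulation output recorded at that time.
X = rng.uniform([0.0, 1.0], [1.0, 10.0], size=(40, 2))
y = np.sin(3.0 * X[:, 0]) + 0.5 / X[:, 1] + 0.01 * rng.standard_normal(40)

# One length scale for the design direction, one for time; the white-noise term
# absorbs residual solver noise.
kernel = RBF(length_scale=[0.3, 3.0]) + WhiteKernel(noise_level=1e-4)
gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(X, y)

# Predict the response of a new design at a large (nearly converged) time budget.
x_new = np.array([[0.42, 10.0]])
mean, std = gp.predict(x_new, return_std=True)
print(mean, std)
```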
Abstract:
OBJECTIVES The aim of this prospective cohort trial was to perform a cost/time analysis for implant-supported single-unit reconstructions in the digital workflow compared to the conventional pathway. MATERIALS AND METHODS A total of 20 patients were included for rehabilitation with 2 × 20 implant crowns in a crossover study design and treated consecutively, each with customized titanium abutments plus CAD/CAM zirconia suprastructures (test: digital) and with standardized titanium abutments plus PFM crowns (control: conventional). Starting with prosthetic treatment, clinical and laboratory work steps were analysed, including costs in Swiss Francs (CHF), productivity rates and cost minimization for first-line therapy. Statistical calculations were performed with the Wilcoxon signed-rank test. RESULTS Both protocols worked successfully for all test and control reconstructions. Direct treatment costs were significantly lower for the digital workflow (1815.35 CHF) than for the conventional pathway (2119.65 CHF) [P = 0.0004]. In the subprocess evaluation, total laboratory costs were 941.95 CHF for the test group and 1245.65 CHF for the control group, respectively [P = 0.003]. The clinical dental productivity rate amounted to 29.64 CHF/min (digital) and 24.37 CHF/min (conventional) [P = 0.002]. Overall, the cost minimization analysis exhibited an 18% cost reduction with the digital process. CONCLUSION The digital workflow was more efficient than the established conventional pathway for implant-supported crowns in this investigation.
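For readers unfamiliar with the statistics used, the sketch below illustrates, with placeholder numbers rather than the study's per-patient data, how paired workflow costs can be compared with the Wilcoxon signed-rank test and how a productivity rate in CHF/min is formed.

```python
# Illustrative sketch only (all numbers are placeholders): paired cost comparison
# with the Wilcoxon signed-rank test and a simple productivity-rate calculation.
import numpy as np
from scipy.stats import wilcoxon

digital_chf      = np.array([1790.0, 1820.5, 1805.0, 1840.0, 1798.5])
conventional_chf = np.array([2100.0, 2150.5, 2095.0, 2160.0, 2110.5])

stat, p_value = wilcoxon(digital_chf, conventional_chf)
print(f"Wilcoxon signed-rank: W={stat:.1f}, p={p_value:.4f}")

# Productivity rate = treatment revenue per minute of clinical time.
revenue_chf, clinical_time_min = 1815.35, 61.3   # clinical time is a placeholder value
print(f"Productivity: {revenue_chf / clinical_time_min:.2f} CHF/min")
```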
Abstract:
In situ and simultaneous measurement of the three most abundant isotopologues of methane using mid-infrared laser absorption spectroscopy is demonstrated. A field-deployable, autonomous platform is realized by coupling a compact quantum cascade laser absorption spectrometer (QCLAS) to a preconcentration unit, called trace gas extractor (TREX). This unit enhances CH4 mole fractions by a factor of up to 500 above ambient levels and quantitatively separates interfering trace gases such as N2O and CO2. The analytical precision of the QCLAS isotope measurement on the preconcentrated methane (750 ppm, parts per million, µmole mole−1) is 0.1 and 0.5 ‰ for δ13C- and δD-CH4 at 10 min averaging time. Based on repeated measurements of compressed air during a 2-week intercomparison campaign, the repeatability of the TREX–QCLAS was determined to be 0.19 and 1.9 ‰ for δ13C- and δD-CH4, respectively. In this intercomparison campaign the new in situ technique is compared to isotope-ratio mass spectrometry (IRMS) based on glass flask and bag sampling, and to real-time CH4 isotope analysis by two commercially available laser spectrometers. Both laser-based analyzers were limited to methane mole fraction and δ13C-CH4 analysis, and only one of them, a cavity ring-down spectrometer, was capable of delivering meaningful data for the isotopic composition. After correcting for scale offsets, the average differences between TREX–QCLAS data and bag/flask sampling–IRMS values are within the extended WMO compatibility goals of 0.2 and 5 ‰ for δ13C- and δD-CH4, respectively. This also demonstrates the potential to improve interlaboratory compatibility based on the analysis of a reference air sample with an accurately determined isotopic composition.
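As a reminder of the delta notation used above, a delta value expresses a sample's isotope ratio relative to an international standard, in per mil (‰); a minimal sketch follows, with illustrative sample ratios and commonly quoted standard ratios.

```python
# Sketch of the delta notation: delta = (R_sample / R_standard - 1) * 1000 in per mil.
# Standard ratios below are commonly quoted values for VPDB (13C/12C) and VSMOW (D/H);
# the sample ratios are illustrative placeholders.
R_VPDB_13C = 0.0111802   # 13C/12C of the VPDB standard (commonly quoted value)
R_VSMOW_D  = 155.76e-6   # D/H of the VSMOW standard

def delta_permil(r_sample: float, r_standard: float) -> float:
    """delta = (R_sample / R_standard - 1) * 1000, in per mil."""
    return (r_sample / r_standard - 1.0) * 1000.0

print(delta_permil(0.0106547, R_VPDB_13C))   # a delta-13C value near -47 per mil
print(delta_permil(141.3e-6, R_VSMOW_D))     # a delta-D value near -93 per mil
```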
Abstract:
The aetiology of childhood cancers remains largely unknown. It has been hypothesized that infections may be involved and that mini-epidemics thereof could result in space-time clustering of incident cases. Most previous studies support spatio-temporal clustering for leukaemia, while results for other diagnostic groups remain mixed. Few studies have corrected for uneven regional population shifts, which can lead to spurious detection of clustering. We examined whether there is space-time clustering of childhood cancers in Switzerland, identifying cases diagnosed at age <16 years between 1985 and 2010 from the Swiss Childhood Cancer Registry. Knox tests were performed on geocoded residence at birth and diagnosis separately for leukaemia, acute lymphoid leukaemia (ALL), lymphomas, tumours of the central nervous system, neuroblastomas and soft tissue sarcomas. We used Baker's Max statistic to correct for multiple testing and randomly sampled time-, sex- and age-matched controls from the resident population to correct for uneven regional population shifts. We observed space-time clustering of childhood leukaemia at birth (Baker's Max p = 0.045) but not at diagnosis (p = 0.98). Clustering was strongest for a spatial lag of <1 km and a temporal lag of <2 years (observed/expected close pairs: 124/98; Knox test p = 0.003). A similar clustering pattern was observed for ALL, though overall evidence was weaker (Baker's Max p = 0.13). Little evidence of clustering was found for other diagnostic groups (p > 0.2). Our study suggests that childhood leukaemia tends to cluster in space-time due to an aetiological factor present in early life.
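The Knox test underlying these results counts pairs of cases that are close in both space and time and compares that count to its permutation distribution; a minimal sketch, using the 1 km / 2 year lags quoted above as assumed thresholds and Monte Carlo permutation of the dates, could look as follows.

```python
# Minimal sketch of a Knox test with Monte Carlo permutation of event times.
# Thresholds (1 km, 2 years) are assumptions taken from the reported lags.
import numpy as np

def knox_test(xy_km, t_years, space_lag=1.0, time_lag=2.0, n_perm=999, seed=0):
    rng = np.random.default_rng(seed)
    n = len(t_years)
    # Pairwise spatial distances; keep the upper triangle only (each pair once).
    d_space = np.linalg.norm(xy_km[:, None, :] - xy_km[None, :, :], axis=-1)
    iu = np.triu_indices(n, k=1)
    close_space = d_space[iu] < space_lag

    def close_pairs(times):
        d_time = np.abs(times[:, None] - times[None, :])[iu]
        return int(np.sum(close_space & (d_time < time_lag)))

    observed = close_pairs(t_years)
    # Permute the dates to break any space-time association.
    perm_counts = [close_pairs(rng.permutation(t_years)) for _ in range(n_perm)]
    p = (1 + sum(c >= observed for c in perm_counts)) / (n_perm + 1)
    return observed, p
```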
Abstract:
SUMMARY There is interest in the potential of companion animal surveillance to provide data to improve pet health and to provide early warning of environmental hazards to people. We implemented a companion animal surveillance system in Calgary, Alberta and the surrounding communities. Informatics technologies automatically extracted electronic medical records from participating veterinary practices and identified cases of enteric syndrome in the warehoused records. The data were analysed using time-series analyses and a retrospective space-time permutation scan statistic. We identified a seasonal pattern in reports of enteric syndromes in companion animals and four statistically significant clusters of enteric syndrome cases. The cases within each cluster were examined, and information about the animals involved (species, age, sex), their vaccination history, possible exposure or risk-behaviour history, disease severity, and the aetiological diagnosis was collected. We then assessed whether the cases within each cluster were unusual and whether they represented an animal or public health threat. There was often insufficient information recorded in the medical record to characterize the clusters by aetiology or exposures. Space-time analysis of companion animal enteric syndrome cases found evidence of clustering. Collection of more epidemiologically relevant data would enhance the utility of practice-based companion animal surveillance.
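For orientation, a much simplified sketch of the space-time permutation scan statistic evaluated for a single candidate cluster (cylinder) is given below; the full method scans many cylinders and assesses significance by Monte Carlo permutation, which is omitted here, and the zone-by-day counts are a toy table.

```python
# Simplified sketch: expected count in one space-time cylinder from the marginals,
# plus the Poisson generalized likelihood ratio used to rank candidate clusters.
import numpy as np

def cylinder_statistic(counts, zones, days):
    """counts[z, d] = enteric-syndrome cases reported in zone z on day d."""
    C = counts.sum()
    c_A = counts[np.ix_(zones, days)].sum()                      # observed in cylinder
    mu_A = counts[zones, :].sum() * counts[:, days].sum() / C    # expected from marginals
    if mu_A == 0 or c_A <= mu_A:
        return c_A, mu_A, 0.0
    log_glr = (c_A * np.log(c_A / mu_A)
               + (C - c_A) * np.log((C - c_A) / (C - mu_A)))
    return c_A, mu_A, log_glr

counts = np.random.default_rng(1).poisson(0.3, size=(12, 90))    # toy zone-by-day table
print(cylinder_statistic(counts, zones=[2, 3], days=list(range(30, 44))))
```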
Abstract:
We construct the theory of dissipative hydrodynamics of uncharged fluids living on embedded space-time surfaces to first order in a derivative expansion for codimension-1 surfaces (including fluid membranes), and the theory of non-dissipative hydrodynamics to second order in a derivative expansion for codimension higher than one, under the assumption of no angular momenta in the directions transverse to the surface. This construction includes the elastic degrees of freedom, and hence the corresponding transport coefficients, that account for transverse fluctuations of the geometry in which the fluid lives. Requiring the second law of thermodynamics to be satisfied leads us to conclude that for codimension-1 surfaces the stress-energy tensor is characterized by 2 hydrodynamic and 1 elastic independent transport coefficients to first order in the expansion, while for codimension higher than one, and for non-dissipative flows, the stress-energy tensor is characterized by 7 hydrodynamic and 3 elastic independent transport coefficients to second order in the expansion. Furthermore, the constraints imposed between the stress-energy tensor, the bending moment and the entropy current of the fluid by these extra non-dissipative contributions are fully captured by equilibrium partition functions. This analysis constrains the Young modulus, which can be measured from gravity by elastically perturbing black branes.
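For orientation, the two first-order hydrodynamic transport coefficients in such constructions are conventionally the shear and bulk viscosities familiar from relativistic hydrodynamics; schematically, and omitting the elastic contribution counted above,

\[
T^{ab} = \epsilon\, u^a u^b + P\, \Delta^{ab} - 2\eta\, \sigma^{ab} - \zeta\, \theta\, \Delta^{ab},
\qquad \Delta^{ab} = \gamma^{ab} + u^a u^b,
\]

where \(u^a\) is the fluid velocity on the surface with induced metric \(\gamma^{ab}\), \(\sigma^{ab}\) the shear tensor, \(\theta\) the expansion, and \(\eta\), \(\zeta\) the shear and bulk viscosities.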
Abstract:
We investigate parallel algorithms for the solution of the Navier–Stokes equations in space-time. For periodic solutions, the discretized problem can be written as a large non-linear system of equations. This system of equations is solved by a Newton iteration. The Newton correction is computed using a preconditioned GMRES solver. The parallel performance of the algorithm is illustrated.
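A minimal sketch of the solver structure described here (a Newton iteration whose linear correction is computed by matrix-free, preconditioned GMRES) is shown below; the residual F and the diagonal preconditioner are placeholders standing in for the discretized space-time Navier-Stokes system and the paper's preconditioner.

```python
# Sketch of a Newton-GMRES (Newton-Krylov) iteration with a matrix-free
# Jacobian-vector product and a simple diagonal preconditioner.
import numpy as np
from scipy.sparse.linalg import LinearOperator, gmres

def F(u):                      # placeholder residual of the discretized nonlinear system
    return u**3 + 2.0 * u - 1.0

def newton_gmres(u0, tol=1e-10, max_newton=20):
    u = u0.copy()
    n = u.size
    for _ in range(max_newton):
        r = F(u)
        if np.linalg.norm(r) < tol:
            break
        # Matrix-free Jacobian-vector product by finite differences.
        eps = 1e-7
        Jv = LinearOperator((n, n), matvec=lambda v: (F(u + eps * v) - r) / eps)
        # Diagonal (Jacobi) preconditioner as a stand-in for a problem-specific one.
        diag = 3.0 * u**2 + 2.0
        M = LinearOperator((n, n), matvec=lambda v: v / diag)
        du, _ = gmres(Jv, -r, M=M)
        u += du                # Newton correction
    return u

print(newton_gmres(np.zeros(8)))
```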
Abstract:
Advances in food transformation have dramatically increased the diversity of products on the market and, consequently, exposed consumers to a complex spectrum of bioactive nutrients whose potential risks and benefits have mostly not been confidently demonstrated. Therefore, tools are needed to efficiently screen products for selected physiological properties before they enter the market. NutriChip is an interdisciplinary modular project funded by the Swiss programme Nano-Tera, which groups scientists from several areas of research with the aim of developing analytical strategies that will enable functional screening of foods. The project focuses on postprandial inflammatory stress, which potentially contributes to the development of chronic inflammatory diseases. The first module of the NutriChip project is composed of three in vitro biochemical steps that mimic the digestion process, intestinal absorption, and subsequent modulation of immune cells by the bioavailable nutrients. The second module is a miniaturised form of the first module (gut-on-a-chip) that integrates a microfluidic-based cell co-culture system and super-resolution imaging technologies to provide a physiologically relevant fluid flow environment and allows sensitive real-time analysis of the products screened in vitro. The third module aims at validating the in vitro screening model by assessing the nutritional properties of selected food products in humans. Because of the immunomodulatory properties of milk as well as its amenability to technological transformation, dairy products have been selected as model foods. The NutriChip project reflects the opening of food and nutrition sciences to state-of-the-art technologies, a key step in the translation of transdisciplinary knowledge into nutritional advice.
Abstract:
Current concepts of synaptic fine-structure are derived from electron microscopic studies of tissue fixed chemically with aldehydes. However, chemical fixation with glutaraldehyde and paraformaldehyde and subsequent dehydration in ethanol result in uncontrolled tissue shrinkage. While electron microscopy allows for the unequivocal identification of synaptic contacts, it cannot be used for real-time analysis of structural changes at synapses. For the latter purpose, advanced fluorescence microscopy techniques have to be applied, which, however, do not allow for the identification of synaptic contacts. Here, two approaches are described that may overcome, at least in part, some of these drawbacks in the study of synapses. Focusing on a characteristic, easily identifiable synapse, the mossy fiber synapse in the hippocampus, we first describe high-pressure freezing of fresh tissue as a method that may be applied to study subtle changes in synaptic ultrastructure associated with functional synaptic plasticity. Next, we propose to label presynaptic mossy fiber terminals and postsynaptic complex spines on CA3 pyramidal neurons with different fluorescent dyes to allow for the real-time monitoring of these synapses in living tissue over extended periods of time. We expect these approaches to lead to new insights into the structure and function of central synapses.
Abstract:
We present the first-order corrected dynamics of fluid branes carrying higher-form charge by obtaining the general form of their equations of motion to pole-dipole order in the absence of external forces. Assuming linear response theory, we characterize the corresponding effective theory of stationary bent charged (an)isotropic fluid branes in terms of two sets of response coefficients, the Young modulus and the piezoelectric moduli. We subsequently find large classes of examples in gravity of this effective theory, by constructing stationary strained charged black brane solutions to first order in a derivative expansion. Using solution generating techniques and bent neutral black branes as a seed solution, we obtain a class of charged black brane geometries carrying smeared Maxwell charge in Einstein-Maxwell-dilaton gravity. In the specific case of ten-dimensional space-time we furthermore use T-duality to generate bent black branes with higher-form charge, including smeared D-branes of type II string theory. By subsequently measuring the bending moment and the electric dipole moment which these geometries acquire due to the strain, we uncover that their form is captured by classical electroelasticity theory. In particular, we find that the Young modulus and the piezoelectric moduli of our strained charged black brane solutions are parameterized by a total of 4 response coefficients, both for the isotropic as well as anisotropic cases.
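For context, the classical electroelasticity relations that the measured response is compared against can be written, in the standard stress-charge form of linear piezoelectricity, as

\[
\sigma_{ij} = C_{ijkl}\,\varepsilon_{kl} - e_{kij}\,E_k,
\qquad
D_i = e_{ikl}\,\varepsilon_{kl} + \epsilon_{ik}\,E_k,
\]

with \(C_{ijkl}\) the elastic (Young-type) moduli, \(e_{kij}\) the piezoelectric moduli, \(\varepsilon_{kl}\) the strain, and \(E_k\), \(D_i\) the electric field and displacement; the brane analogues discussed above relate the bending moment and electric dipole moment to the bending strain in a structurally similar way.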
Abstract:
Hydrodynamics can be consistently formulated on surfaces of arbitrary co-dimension in a background space-time, providing the effective theory describing long-wavelength perturbations of black branes. When the co-dimension is non-zero, the system acquires fluid-elastic properties and constitutes what is called a fluid brane. Applying an effective action approach, the most general form of the free energy quadratic in the extrinsic curvature and extrinsic twist potential of stationary fluid brane configurations is constructed to second order in a derivative expansion. This construction generalizes the Helfrich-Canham bending energy for fluid membranes studied in theoretical biology to the case in which the fluid is rotating. It is found that stationary fluid brane configurations are characterized by a set of 3 elastic response coefficients, 3 hydrodynamic response coefficients and 1 spin response coefficient for co-dimension greater than one. Moreover, the elastic degrees of freedom present in the system are coupled to the hydrodynamic degrees of freedom. For co-dimension-1 surfaces we find a family of stationary fluid branes with 8 independent parameters. It is further shown that elastic and spin corrections to (non-)extremal brane effective actions can be accounted for by a multipole expansion of the stress-energy tensor, thereby establishing a relation between the different formalisms of Carter, Capovilla-Guven and Vasilic-Vojinovic and between gravity and the effective description of stationary fluid branes. Finally, it is shown that the Young modulus found in the literature for black branes falls into the class predicted by this approach - a relation which is then used to make a proposal for the second-order effective action of stationary blackfolds and to find the corrected horizon angular velocity of thin black rings.
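For reference, the Helfrich-Canham bending energy for a static fluid membrane, which the construction above generalizes to rotating fluids, is commonly written as

\[
F = \int \mathrm{d}A \left[ \sigma + \frac{\kappa}{2}\,(2H - c_0)^2 + \bar{\kappa}\, K \right],
\]

with \(\sigma\) the surface tension, \(H\) the mean curvature, \(K\) the Gaussian curvature, \(c_0\) a spontaneous curvature, and \(\kappa\), \(\bar{\kappa}\) the bending and saddle-splay moduli.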
Abstract:
In the course of language acquisition, learners have to deal with the task of producing narrative texts that are coherent across a range of conceptual domains (space, time, entities), both within and across utterances. The organization of information is analyzed in this study, on the basis of retellings of a silent film, in terms of the devices used in the coordination and subordination of events within the narrative sequence. The focus on subordination reflects a core grammatical difference between Italian and French, as Italian is a null-subject language while French is not. The implications of this contrast for information structure include differences in topic management within the sequence of events. The present study investigates to what extent Italian-French bilingual speakers acquire the patterns of monolingual speakers of Italian. It compares how early and late bilinguals of these two languages proceed when linking information in narratives in Italian.
Abstract:
We propose a method that robustly combines color and feature buffers to denoise Monte Carlo renderings. On one hand, feature buffers, such as per-pixel normals, textures, or depth, are effective in determining denoising filters because features are highly correlated with the rendered image. Filters based solely on features, however, are prone to blurring image details that are not well represented by the features. On the other hand, color buffers represent all details, but they may be less effective for determining filters because they are contaminated by the very noise that is to be removed. We propose to obtain filters using a combination of color and feature buffers in an NL-means and cross-bilateral filtering framework. We determine a robust weighting of colors and features using a SURE-based error estimate. We show significant improvements in subjective and quantitative errors compared to the previous state of the art. We also demonstrate adaptive sampling and space-time filtering for animations.
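A minimal sketch of the underlying idea follows: a joint (cross-bilateral) filter whose per-pixel weights combine a noisy-color distance with feature-buffer distances, so that features steer the filter while colors retain detail. The paper's SURE-based weighting between the two is not reproduced here, and the buffer names, window radius and bandwidths are illustrative.

```python
# Sketch of a joint/cross-bilateral filter over color, normal and depth buffers.
import numpy as np

def joint_bilateral(color, normals, depth, radius=5,
                    sigma_c=0.2, sigma_n=0.3, sigma_d=0.05, sigma_s=3.0):
    h, w, _ = color.shape
    out = np.zeros_like(color)
    for y in range(h):
        for x in range(w):
            y0, y1 = max(0, y - radius), min(h, y + radius + 1)
            x0, x1 = max(0, x - radius), min(w, x + radius + 1)
            dc = color[y0:y1, x0:x1] - color[y, x]          # noisy color difference
            dn = normals[y0:y1, x0:x1] - normals[y, x]      # feature: surface normals
            dd = depth[y0:y1, x0:x1] - depth[y, x]          # feature: depth
            yy, xx = np.mgrid[y0:y1, x0:x1]                 # spatial falloff
            w_px = np.exp(-(dc**2).sum(-1) / sigma_c**2
                          - (dn**2).sum(-1) / sigma_n**2
                          - dd**2 / sigma_d**2
                          - ((yy - y)**2 + (xx - x)**2) / sigma_s**2)
            out[y, x] = (w_px[..., None] * color[y0:y1, x0:x1]).sum((0, 1)) / w_px.sum()
    return out
```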