942 results for Data Driven Modeling
Abstract:
The field of library assessment continues to grow. The annual Library Assessment Trends Report provides a brief synopsis of the most important trends in library assessment. It is hoped these brief reports will facilitate the Dean of the Library's understanding of assessment trends. These reports provide information that supports data-driven decisions. Additionally, the reports are an outreach method that supports a greater institutional understanding of library assessment. Library assessment supports strategic planning, improved processes, and a greater understanding of our users' needs.
Abstract:
Similar to other health care processes, referrals are susceptible to breakdowns. These breakdowns in the referral process can lead to poor continuity of care, slow diagnostic processes, delays and repetition of tests, patient and provider dissatisfaction, and a loss of confidence in providers. These facts and the necessity for a deeper understanding of referrals in health care served as the motivation to conduct a comprehensive study of referrals. The research began with the real problem and need to understand referral communication as a means to improve patient care. Despite previous efforts to explain referrals and the dynamics and interrelations of the variables that influence them, there is no common, contemporary, and accepted definition of what a referral is in the health care context. The research agenda was guided by the need to explore referrals as an abstract concept by: 1) developing a conceptual definition of referrals, 2) developing a model of referrals, and finally 3) proposing a comprehensive research framework. This dissertation has resulted in a standard conceptual definition of referrals and a model of referrals. In addition, a mixed-method framework to evaluate referrals was proposed, and finally a data-driven model was developed to predict whether a referral would be approved or denied by a specialty service. The three manuscripts included in this dissertation present the basis for studying and assessing referrals using a common framework, which should enable an easier comparative research agenda to improve referrals while taking into account the context in which referrals occur.
Abstract:
Methane is an important greenhouse gas, responsible for about 20% of the warming induced by long-lived greenhouse gases since pre-industrial times. By reacting with hydroxyl radicals, methane reduces the oxidizing capacity of the atmosphere and generates ozone in the troposphere. Although most sources and sinks of methane have been identified, their relative contributions to atmospheric methane levels are highly uncertain. As such, the factors responsible for the observed stabilization of atmospheric methane levels in the early 2000s, and the renewed rise after 2006, remain unclear. Here, we construct decadal budgets for methane sources and sinks between 1980 and 2010, using a combination of atmospheric measurements and results from chemical transport models, ecosystem models, climate chemistry models and inventories of anthropogenic emissions. The resultant budgets suggest that data-driven approaches and ecosystem models overestimate total natural emissions. We build three contrasting emission scenarios, which differ in fossil fuel and microbial emissions, to explain the decadal variability in atmospheric methane levels detected, here and in previous studies, since 1985. Although uncertainties in emission trends do not allow definitive conclusions to be drawn, we show that the observed stabilization of methane levels between 1999 and 2006 can potentially be explained by decreasing-to-stable fossil fuel emissions, combined with stable-to-increasing microbial emissions. We show that a rise in natural wetland emissions and fossil fuel emissions probably accounts for the renewed increase in global methane levels after 2006, although the relative contribution of these two sources remains uncertain.
Abstract:
The paper argues for a distinction between sensory- and conceptual-information storage in the human information-processing system. Conceptual information is characterized as meaningful and symbolic, while sensory information may exist in modality-bound form. Furthermore, it is assumed that sensory information does not contribute to conscious remembering and can be used only in data-driven process repetitions, which can be accompanied by a kind of vague or intuitive feeling. Accordingly, purely top-down and willingly controlled processing, such as free recall, should not have any access to sensory data. Empirical results from different research areas and from two experiments conducted by the authors are presented in this article to support these theoretical distinctions. The experiments were designed to separate a sensory-motor and a conceptual component in memory for two-digit numbers and two-letter items, when parts of the numbers or items were imaged or drawn on a tablet. The results of free recall and recognition are discussed in a theoretical framework which distinguishes sensory and conceptual information in memory.
Abstract:
The ATLAS experiment at the LHC has measured the production cross section of events with two isolated photons in the final state, in proton-proton collisions at √s = 7 TeV. The full data set collected in 2011, corresponding to an integrated luminosity of 4.9 fb⁻¹, is used. The amount of background, from hadronic jets and isolated electrons, is estimated with data-driven techniques and subtracted. The total cross section, for two isolated photons with transverse energies above 25 GeV and 22 GeV respectively, within the acceptance of the electromagnetic calorimeter (|η| < 1.37 or 1.52 < |η| < 2.37) and with an angular separation ΔR > 0.4, is 44.0 (+3.2 / −4.2) pb. The differential cross sections as a function of the di-photon invariant mass, transverse momentum, azimuthal separation, and cosine of the polar angle of the largest transverse-energy photon in the Collins-Soper di-photon rest frame are also measured. The results are compared to the predictions of leading-order parton-shower and next-to-leading-order and next-to-next-to-leading-order parton-level generators.
Abstract:
While most healthy elderly are able to manage their everyday activities, studies have shown that there are both stable and declining abilities during healthy aging. For example, there is evidence that semantic memory processes that involve controlled retrieval mechanisms decline, whereas the automatic functioning of the semantic network remains intact. In contrast, patients with Alzheimer's disease (AD) suffer from severe episodic and semantic memory impairments that aggravate their daily functioning. While the hallmark symptom of episodic memory decline in AD is well investigated, the underlying mechanisms of semantic memory deterioration remain unclear. By disentangling the semantic memory impairments in AD, the present thesis aimed to improve early diagnosis and to find a biomarker for dementia. To this end, a study on healthy aging and a study with dementia patients were conducted to investigate automatic and controlled semantic word retrieval. Besides AD patients, a group of participants diagnosed with semantic dementia (SD), who show isolated semantic memory loss, was assessed. Automatic and controlled semantic word retrieval was measured with standard neuropsychological tests and by means of event-related potentials (ERP) recorded during the performance of a semantic priming (SP) paradigm. Special focus was directed to the N400, or N400-LPC (late positive component) complex, an ERP component sensitive to semantic word retrieval. In both studies, data-driven topographical analyses were applied. Furthermore, in the patient study, the individual baseline cerebral blood flow (CBF) was combined with the N400 topography of each participant in order to relate altered functional electrophysiology to the pathophysiology of dementia.
Results of the aging study revealed that automatic semantic word retrieval remains stable during healthy aging: the N400-LPC complex showed a topography comparable to that of the young participants. Both patient groups showed automatic SP to some extent, but strikingly, the ERP topographies were altered compared to healthy controls. Most importantly, the N400 was identified as a putative marker for dementia. In particular, the degree of topographical N400 similarity was demonstrated to separate healthy elderly from demented patients. Furthermore, the marker was significantly related to baseline CBF reduction in brain areas relevant for semantic word retrieval. Summing up, the first major finding of the present thesis was that all groups showed semantic priming, but that the N400 topography differed significantly between healthy and demented elderly. The second major contribution was the identification of the N400 similarity as a putative marker for dementia. To conclude, the present thesis added evidence of preserved automatic processing during healthy aging. Moreover, a possible marker was presented which might contribute to improved diagnosis and consequently lead to more effective treatment of dementia, and which has to be developed further.
Abstract:
The largest uncertainties in the Standard Model calculation of the anomalous magnetic moment of the muon (g − 2)μ come from hadronic contributions. In particular, it can be expected that in a few years the subleading hadronic light-by-light (HLbL) contribution will dominate the theory uncertainty. We present a dispersive description of the HLbL tensor. This new, model-independent approach opens up an avenue towards a data-driven determination of the HLbL contribution to the (g − 2)μ.
Abstract:
In this paper we make a further step towards a dispersive description of the hadronic light-by-light (HLbL) tensor, which should ultimately lead to a data-driven evaluation of its contribution to (g − 2)μ. We first provide a Lorentz decomposition of the HLbL tensor performed according to the general recipe by Bardeen, Tung, and Tarrach, generalizing and extending our previous approach, which was constructed in terms of a basis of helicity amplitudes. Such a tensor decomposition has several advantages: the role of gauge invariance and crossing symmetry becomes fully transparent; the scalar coefficient functions are free of kinematic singularities and zeros, and thus fulfill a Mandelstam double-dispersive representation; and the explicit relation for the HLbL contribution to (g − 2)μ in terms of the coefficient functions simplifies substantially. We demonstrate explicitly that the dispersive approach defines both the pion-pole and the pion-loop contribution unambiguously and in a model-independent way. The pion loop, dispersively defined as pion-box topology, is proven to coincide exactly with the one-loop scalar QED amplitude, multiplied by the appropriate pion vector form factors.
Abstract:
The largest uncertainties in the Standard Model calculation of the anomalous magnetic moment of the muon (g − 2)μ come from hadronic contributions. In particular, it can be expected that in a few years the subleading hadronic light-by-light (HLbL) contribution will dominate the theory uncertainty. We present a dispersive description of the HLbL tensor, which is based on unitarity, analyticity, crossing symmetry, and gauge invariance. Such a model-independent approach opens up an avenue towards a data-driven determination of the HLbL contribution to the (g − 2)μ.
Abstract:
A crucial link in preserving and protecting the future of our communities resides in maintaining the health and well-being of our youth. While every member of the community holds an opinion regarding where monies for prevention and intervention are best utilized, the data to support such opinions are often scarce. In an effort to generate data-driven indices for community planning and action, the United Way of Comal County, Texas partnered with the University of Texas - Houston Health Science Center, School of Public Health to accomplish a county-specific needs assessment. A community-based participatory research approach utilizing the Mobilization for Action through Planning and Partnership (MAPP) format developed by the National Association of City and County Health Officials (NACCHO) was implemented to engage community members in identifying and addressing community priorities. The single greatest area of consensus and concern identified by community members was the health and well-being of the youth population. Thus, a youth survey targeting these specific areas of community concern was designed, coordinated, and administered to all 9th-11th grade students in the county. Twenty percent of the 3,698 completed surveys (72% response rate) were randomly selected for analysis. These 740 surveys were coded and scanned into an electronic survey database. Statistical analysis provided youth-reported data on the status of the multiple issues affecting the health and well-being of the community's youth. These data will be reported back to community stakeholders, as part of the larger Comal County Needs Assessment, for the purposes of community planning and action. Survey data will provide community planners with an awareness of the high-risk behaviors and habit patterns among their youth. This knowledge will permit more effective targeting of the means for encouraging healthy behaviors and preventing the spread of disease.
Further, the community-oriented, population-based nature of this effort will answer questions raised by the community and provide an effective launching pad for the development and implementation of targeted, preventive health strategies.
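The sampling figures quoted above can be checked with simple arithmetic; a minimal sketch (the number of students surveyed is our own back-of-envelope inference from the stated 72% response rate, not a figure reported in the abstract):

```python
# Consistency check of the survey sampling figures from the abstract.
completed_surveys = 3698    # surveys returned (as reported)
response_rate = 0.72        # reported response rate
sampled_fraction = 0.20     # fraction randomly selected for analysis

# inferred number of students who received the survey (not stated in the abstract)
students_surveyed = round(completed_surveys / response_rate)

# surveys selected for analysis; matches the 740 reported in the abstract
surveys_analyzed = round(completed_surveys * sampled_fraction)
```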
Abstract:
Temperature-dependent population growth of the diamondback moth (DBM) Plutella xylostella (L.), a prolific insect pest of crucifer vegetables, was studied under six constant temperatures in the laboratory. The objective of the study was to predict the impacts of temperature changes on DBM populations at high-resolution scales along altitudinal gradients and under climate change scenarios. Non-linear functions were fitted to the data for modeling the development, mortality, longevity and oviposition of the pest. The best-fitted functions for each life stage were compiled to estimate the life table parameters of the species by stochastic simulations. To quantify the impacts on the pest, three indices (establishment, generation and activity) were computed using the estimates of life table parameters and temperature data obtained at local scale (current scenario 2013) and downscaled climate change data (future scenario 2055) from the AFRICLIM database. To measure and represent the impacts of temperature change on the pest along altitude, the indices were mapped along the altitudinal gradients of Kilimanjaro and the Taita Hills, in Tanzania and Kenya, respectively. The potential impact of the changes between climate scenarios 2013 and 2055 was assessed. The data files included in this database were utilized in the above analysis to develop temperature-dependent phenology models of Plutella xylostella and to assess its current and future distribution along eastern African Afromontanes.
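The fitting step described above can be sketched in miniature. The abstract does not name the non-linear functions used; the Briere-1 function below is one commonly used development-rate model for insects, the thermal limits and observed rates are invented placeholders (not DBM data), and the grid search is a simple stand-in for a proper non-linear least-squares fit:

```python
import math

def briere1(T, a, T0=7.0, Tmax=35.0):
    """Briere-1 development rate (1/day); zero outside the thermal limits."""
    if T <= T0 or T >= Tmax:
        return 0.0
    return a * T * (T - T0) * math.sqrt(Tmax - T)

# hypothetical mean development rates at six constant temperatures (deg C)
obs = {10: 0.02, 15: 0.05, 20: 0.09, 25: 0.13, 28: 0.14, 32: 0.08}

def sse(a):
    """Sum of squared errors of the fitted curve for scale parameter a."""
    return sum((briere1(T, a) - r) ** 2 for T, r in obs.items())

# coarse grid search over the scale parameter a
a_best = min((i * 1e-6 for i in range(1, 500)), key=sse)
```

The fitted rate curves for each life stage would then feed the stochastic life-table simulations mentioned in the abstract.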
Abstract:
Coral reefs represent major accumulations of calcium carbonate (CaCO3). The particularly labyrinthine network of reefs in Torres Strait, north of the Great Barrier Reef (GBR), has been examined in order to estimate its gross CaCO3 productivity. The approach involved a two-step procedure: first, characterising and classifying the morphology of reefs based on a classification scheme widely employed on the GBR, and then estimating gross CaCO3 productivity rates across the region using a regional census-based approach. This was undertaken by independently verifying published rates of coral reef community gross production for use in Torres Strait, based on site-specific ecological and morphological data. A total of 606 reef platforms were mapped and classified using classification trees. Despite the complexity of the maze of reefs in Torres Strait, there are broad morphological similarities with reefs in the GBR. The spatial distribution and dimensions of reef types across both regions are underpinned by similar geological processes, Holocene sea-level history and exposure to the same wind/wave energy regime, resulting in comparable geomorphic zonation. However, the strong tidal currents flowing through Torres Strait and the relatively shallow and narrow dimensions of the shelf exert a control on local morphology and the spatial distribution of the reef platforms. A total of 8.7 million tonnes of CaCO3 per year, at an average rate of 3.7 kg CaCO3 m⁻² yr⁻¹ (G), was estimated for the studied area. Extrapolated production rates based on detailed and regional census-based approaches for geomorphic zones across Torres Strait were comparable to those reported elsewhere, particularly values for the GBR based on alkalinity-reduction methods. However, differences in mapping methodologies and the impact of reduced calcification due to global trends in coral reef ecological decline and changing oceanic physical conditions warrant further research.
The novel method proposed in this study to characterise the geomorphology of reef types based on classification trees provides an objective and repeatable data-driven approach that, combined with regional census-based approaches, has the potential to be adapted and transferred to other coral reef regions, depicting a more accurate picture of the interactions between reef ecology and geomorphology.
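The regional totals quoted above can be related by simple arithmetic: total gross production equals the mean area-normalized rate times the productive reef area. A back-of-envelope sketch (the implied area is our own inference from the two quoted figures, not a value stated in the study):

```python
# Consistency check: total production = mean rate x productive reef area.
total_kg_per_yr = 8.7e6 * 1000   # 8.7 million tonnes CaCO3 -> kg per year
mean_rate = 3.7                  # kg CaCO3 per m^2 per year (G)

# implied productive reef area, converted from m^2 to km^2
implied_area_km2 = total_kg_per_yr / mean_rate / 1e6   # ~2.35e3 km^2
```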
Abstract:
We introduce two probabilistic, data-driven models that predict a ship's speed and the situations in which a ship is likely to get stuck in ice, based on the joint effect of ice features such as the thickness and concentration of level ice, ice ridges, rafted ice, and ice compression. To develop the models, two datasets were utilized. First, data from the Automatic Identification System about the performance of a selected ship were used. Second, a numerical ice model, HELMI, developed at the Finnish Meteorological Institute, provided information about the ice field. The relations between the ice conditions and ship movements were established using Bayesian learning algorithms. The case study presented in this paper considers a single, unassisted trip of an ice-strengthened bulk carrier between two Finnish ports in challenging ice conditions that varied in time and space. The obtained results show good predictive power: on average 80% accuracy for predicting the ship's speed within specified bins, and above 90% for predicting cases where a ship may get stuck in ice. We expect this new approach to facilitate safe and effective route selection in ice-covered waters where ship performance is reflected in the objective function.
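The Bayesian learning step can be illustrated with a minimal naive-Bayes sketch. The discretized ice features, speed bins, and training pairs below are invented stand-ins, not the authors' AIS/HELMI data, and naive Bayes is only one simple member of the Bayesian model family the paper refers to:

```python
from collections import Counter, defaultdict

# toy training pairs: (discretized ice features, observed speed bin)
train = [
    ({"thickness": "thin",  "ridging": "low"},  "fast"),
    ({"thickness": "thin",  "ridging": "low"},  "fast"),
    ({"thickness": "thin",  "ridging": "high"}, "medium"),
    ({"thickness": "thick", "ridging": "low"},  "medium"),
    ({"thickness": "thick", "ridging": "high"}, "stuck"),
    ({"thickness": "thick", "ridging": "high"}, "stuck"),
]

prior = Counter(label for _, label in train)
cond = defaultdict(Counter)          # cond[(feature, value)][label] = count
for feats, label in train:
    for f, v in feats.items():
        cond[(f, v)][label] += 1

def predict(feats, alpha=1.0, n_values=2):
    """Posterior-maximizing speed bin with Laplace smoothing.

    n_values is the number of distinct values per feature (2 in this toy set).
    """
    scores = {}
    for label, n in prior.items():
        p = n / len(train)
        for f, v in feats.items():
            p *= (cond[(f, v)][label] + alpha) / (n + alpha * n_values)
        scores[label] = p
    return max(scores, key=scores.get)
```

For example, `predict({"thickness": "thick", "ridging": "high"})` returns the "stuck" bin under this toy data.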
Abstract:
The ⁸⁷Sr/⁸⁶Sr ratios and Sr concentrations in sediment and pore fluids are used to evaluate the rates of calcite recrystallization at ODP Site 807A on the Ontong Java Plateau, an 800-meter-thick section of carbonate ooze and chalk. A numerical model is used to evaluate the pore fluid chemistry and Sr isotopes in an accumulating section. The deduced calcite recrystallization rate is 2% per million years (%/Myr) near the top of the section and decreases systematically in older parts of the section, such that the rate is close to 0.1/age (in years). The deduced recrystallization rates have important implications for the interpretation of Ca and Mg concentration profiles in the pore fluids. The effect of calcite recrystallization on pore fluid chemistry is described by the reaction length, L, which varies by element and depends on the concentrations in the pore fluid and the solid. When L is small compared to the thickness of the sedimentary section, the pore fluid concentration is controlled by equilibrium or steady-state exchange with the solid phase, except within a distance L of the sediment-water interface. When L is large relative to the thickness of sediment, the pore fluid concentration is mostly controlled by the boundary conditions and diffusion. The values of L for Ca, Sr, and Mg are of order 15, 150, and 1500 meters, respectively. L_Sr is derived from isotopic data and modeling, and allows us to infer the values of L_Ca and L_Mg. The small value of L_Ca indicates that pore fluid Ca concentrations, which gradually increase down section, must be equilibrium values that are maintained by solution-precipitation exchange with calcite and do not reflect Ca sources within or below the sediment column. The pore fluid Ca measurements and measured alkalinity allow us to calculate the in situ pH in the pore fluids, which decreases from 7.6 near the sediment-water interface to 7.1 ± 0.1 at 400-800 mbsf.
While the calculated pH values are in agreement with some of the values measured during ODP Leg 130, most of the measurements are artifacts. The large value of L_Mg indicates that the pore fluid Mg concentrations at 807A are not controlled by calcite-fluid equilibrium but instead are determined by the changing Mg concentration of seawater during deposition, modified by aqueous diffusion in the pore fluids. We use the pore fluid Mg concentration profile at Site 807A to retrieve a global record for seawater Mg over the past 35 Myr, which shows that seawater Mg has increased rapidly over the past 10 Myr, rather than gradually over the past 60 Myr. This observation suggests that the Cenozoic rise in seawater Mg is controlled by continental weathering inputs rather than by exchange with oceanic crust. The relationship between reaction rate and age is strikingly similar in silicates and carbonates, which suggests that reaction affinity is not the primary determinant of silicate dissolution rates in nature.
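The reaction-length argument above amounts to comparing L with the section thickness; a minimal sketch (the reaction lengths and the ~800 m thickness are the values quoted in the abstract, but the ratio cutoffs used to label the regimes are illustrative choices of our own):

```python
# Classify each element's pore-fluid control by comparing its reaction
# length L with the ~800 m section thickness at Site 807A.
SECTION_THICKNESS_M = 800.0
reaction_length_m = {"Ca": 15.0, "Sr": 150.0, "Mg": 1500.0}

def regime(L, H=SECTION_THICKNESS_M):
    """Label the controlling process using illustrative ratio cutoffs."""
    r = L / H
    if r < 0.1:
        return "equilibrium-controlled"   # exchange with the solid dominates
    if r > 1.0:
        return "diffusion-controlled"     # boundary conditions and diffusion
    return "mixed"

regimes = {element: regime(L) for element, L in reaction_length_m.items()}
```

This reproduces the abstract's reasoning: Ca profiles are set by exchange with calcite, Mg profiles by seawater history and diffusion, with Sr intermediate.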
Abstract:
The analysis of research data plays a key role in data-driven areas of science. A wide variety of mixed research data sets exists, and scientists aim to derive or validate hypotheses to find undiscovered knowledge. Many analysis techniques identify relations across an entire dataset only, which may mask the characteristic behavior of different subgroups in the data. Like automatic subspace clustering, our approach aims at identifying interesting subgroups and attribute sets. We present a visual-interactive system that supports scientists in exploring interesting relations between aggregated bins of multivariate attributes in mixed data sets. The abstraction of data into bins enables the application of statistical dependency tests as the measure of interestingness. An overview matrix view shows all attributes, ranked with respect to the interestingness of their bins. As a complement, a node-link view reveals multivariate bin relations by positioning dependent bins close to each other. The system supports information drill-down based on both expert knowledge and algorithmic support. Finally, visual-interactive subset clustering assigns multivariate bin relations to groups. A list-based cluster result representation enables the scientist to communicate multivariate findings at a glance. We demonstrate the applicability of the system with two case studies from the earth observation domain and the prostate cancer research domain. In both cases, the system enabled us to identify the most interesting multivariate bin relations, to validate already published results, and, moreover, to discover unexpected relations.
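The binned dependency measure described above can be sketched with a Pearson chi-square statistic over a contingency table of two binned attributes; the abstract does not specify which dependency test the system implements, and the bin counts below are invented:

```python
def chi_square(table):
    """Pearson chi-square statistic for a 2-D contingency table of bin counts."""
    row_tot = [sum(row) for row in table]
    col_tot = [sum(col) for col in zip(*table)]
    n = sum(row_tot)
    stat = 0.0
    for i, row in enumerate(table):
        for j, observed in enumerate(row):
            expected = row_tot[i] * col_tot[j] / n   # under independence
            stat += (observed - expected) ** 2 / expected
    return stat

# bins of two attributes: no association vs. a strong association
independent = [[25, 25], [25, 25]]   # statistic = 0
dependent = [[40, 10], [10, 40]]     # large statistic -> interesting relation
```

Ranking attribute pairs by such a statistic (or its p-value) is one natural way to order the matrix view by interestingness.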