895 results for "Customer-value based approach"


Relevance: 100.00%

Abstract:

As the understanding and representation of the impacts of volcanic eruptions on climate have improved in recent decades, uncertainties in the stratospheric aerosol forcing from large eruptions are now linked not only to visible optical depth estimates on a global scale but also to details of the size, latitude and altitude distributions of the stratospheric aerosols. Based on our understanding of these uncertainties, we propose a new model-based approach to generating a volcanic forcing for general circulation model (GCM) and chemistry–climate model (CCM) simulations. This new volcanic forcing, covering the period from 1600 to the present, uses an aerosol microphysical model to provide a realistic, physically consistent treatment of the stratospheric sulfate aerosols. Twenty-six eruptions were modeled individually using the latest available ice-core aerosol mass estimates and historical data on the latitude and date of each eruption. The evolution of the aerosol spatial and size distributions after the sulfur dioxide discharge is thus characterized for each volcanic eruption. Large variations are seen in hemispheric partitioning and size distributions in relation to the location and date of the eruptions and the injected SO2 masses. Results for recent eruptions show reasonable agreement with observations. By providing these new estimates of the spatial distributions of shortwave and longwave radiative perturbations, this volcanic forcing may help to better constrain climate model responses to volcanic eruptions over the 1600–present period. The final data set consists of 3-D values (with constant longitude) of spectrally resolved extinction coefficients, single scattering albedos and asymmetry factors, calculated for different wavelength bands upon request. Surface area densities for heterogeneous chemistry are also provided.
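
A minimal sketch (with hypothetical array names and sizes, not the data set's actual format) of how spectrally resolved extinction coefficients of this kind can be integrated over altitude to obtain an aerosol optical depth per time step, latitude band and wavelength band:

```python
import numpy as np

# Hypothetical dimensions: time, latitude, altitude, wavelength band.
# ext has units of 1/km; the layer thickness dz is in km.
n_time, n_lat, n_alt, n_band = 12, 36, 40, 8
rng = np.random.default_rng(0)
ext = rng.uniform(0.0, 1e-3, size=(n_time, n_lat, n_alt, n_band))  # extinction coefficient
dz = np.full(n_alt, 0.5)                                           # layer thickness in km

# Aerosol optical depth: vertical integral of extinction over altitude.
aod = np.einsum("tlab,a->tlb", ext, dz)

# Latitude- and band-averaged optical depth as a simple summary diagnostic.
print(aod.mean(axis=(1, 2)))
```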

Relevance: 100.00%

Abstract:

Statistical appearance models have recently been introduced in bone mechanics to investigate bone geometry and mechanical properties in population studies. The establishment of accurate anatomical correspondences is a critical aspect of the construction of reliable models. Depending on the representation of a bone as an image or a mesh, correspondences are detected using image registration or mesh morphing. The objective of this study was to compare image-based and mesh-based statistical appearance models of the femur for finite element (FE) simulations. To this end, (i) we compared correspondence detection methods on the bone surface and in the bone volume; (ii) we created an image-based and a mesh-based statistical appearance model from 130 images, which we validated using compactness, representation and generalization, and we analyzed the FE results on 50 recreated bones vs. the original bones; (iii) we created 1000 new instances and compared the quality of the FE meshes. Results showed that the image-based approach was more accurate in volume correspondence detection and in the quality of the FE meshes, whereas the mesh-based approach was more accurate in surface correspondence detection and model compactness. Based on our results, we recommend the use of image-based statistical appearance models for FE simulations of the femur.
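
A minimal sketch (toy data standing in for registered femur samples; not the study's pipeline) of how a statistical appearance model can be built by principal component analysis over corresponding features, and how its compactness, one of the validation measures used above, can be computed as cumulative explained variance:

```python
import numpy as np

def build_model(samples: np.ndarray):
    """samples: (n_samples, n_features) matrix of corresponding points/intensities."""
    mean = samples.mean(axis=0)
    centered = samples - mean
    # SVD of the centered data matrix gives the principal modes of variation.
    _, s, vt = np.linalg.svd(centered, full_matrices=False)
    variances = s**2 / (samples.shape[0] - 1)
    return mean, vt, variances

def compactness(variances: np.ndarray, n_modes: int) -> float:
    """Fraction of total variance captured by the first n_modes modes."""
    return float(variances[:n_modes].sum() / variances.sum())

# Toy example with random data in place of 130 registered bone samples.
rng = np.random.default_rng(1)
samples = rng.normal(size=(130, 300))
mean, modes, variances = build_model(samples)
print(f"Compactness with 10 modes: {compactness(variances, 10):.3f}")
```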

Relevance: 100.00%

Abstract:

Research on human values within the family focuses on value congruence between family members (Knafo & Schwartz, 2004), based on the assumption that the transmission of values is part of a child's socialization process. Within the family, values are not only implicitly transmitted through this process but also explicitly conveyed through the educational goals of parents (Grusec et al., 2000; Knafo & Schwartz, 2003, 2004, 2009). However, there is a lack of empirical evidence on the role of family characteristics in the value transmission process, especially for families with young children. The study presented here therefore had multiple aims. Firstly, it analyzed the congruence between mothers' and fathers' values and their value-based educational goals. Secondly, it examined the influence of mothers' and fathers' socio-demographic characteristics on their educational goals. Thirdly, it analyzed the differences in parental educational goals between families with daughters and families with sons. Finally, it examined the congruence between children's values and the value-based educational goals of their parents. The value transmission process in families with young children was analyzed using data from complete families (child, mother and father) in Switzerland (N = 265). The child sample consisted of 139 boys and 126 girls aged between 7 and 9 years. Parents' values and parental educational goals were assessed using the Portrait Value Questionnaire (PVQ-21) (Schwartz, 2005). Children's values were assessed using the Picture-Based Value Survey for Children (PBVS-C) (Döring et al., 2010). Regarding the role of the family context in shaping children's values, the results show that, on average, parents are similar not only with respect to their value profiles but also with regard to which values they would like to transmit to their children. Our findings also suggest that children's values at an early age are shaped more strongly by mothers' values than by fathers' values. Moreover, our results show differences in value transmission with respect to the child's gender; in particular, they suggest that value transmission within the family has a greater influence on female than on male offspring.
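
A minimal sketch (hypothetical ratings, not the study's data or scoring procedure) of one common way to quantify value congruence between two family members: correlating their profiles across the value dimensions measured by instruments such as the PVQ-21 or PBVS-C:

```python
import numpy as np

# Hypothetical importance ratings for ten value dimensions
# (e.g., the ten basic values), one vector per person.
mother = np.array([5.1, 4.2, 3.8, 2.9, 4.7, 3.3, 2.5, 4.0, 3.6, 4.4])
child = np.array([4.8, 4.0, 3.5, 3.2, 4.5, 3.0, 2.8, 3.7, 3.9, 4.1])

def congruence(a: np.ndarray, b: np.ndarray) -> float:
    """Profile congruence as the Pearson correlation of two value profiles."""
    return float(np.corrcoef(a, b)[0, 1])

print(f"Mother-child value congruence: {congruence(mother, child):.2f}")
```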

Relevance: 100.00%

Abstract:

Lake water temperature (LWT) is an important driver of lake ecosystems and has been identified as an indicator of climate change. Consequently, the Global Climate Observing System (GCOS) lists LWT as an essential climate variable. Although long in situ time series of LWT exist for some European lakes, many lakes are not observed at all, or only irregularly, making the available observations insufficient for climate monitoring. Satellite data can provide the information needed. However, only a few satellite sensors offer the possibility of analysing time series that cover 25 years or more. The Advanced Very High Resolution Radiometer (AVHRR) is among these and has been flown as a heritage instrument for almost 35 years; it will continue to be flown for at least ten more years, offering a unique opportunity for satellite-based climate studies. Here we present a satellite-based lake surface water temperature (LSWT) data set for European water bodies in or near the Alps, based on the extensive AVHRR 1 km data record (1989–2013) of the Remote Sensing Research Group at the University of Bern. It has been compiled from AVHRR/2 (NOAA-07, -09, -11, -14) and AVHRR/3 (NOAA-16, -17, -18, -19 and MetOp-A) data. The high accuracy needed for climate-related studies requires careful pre-processing and consideration of the atmospheric state. The LSWT retrieval is based on a simulation-based scheme that uses the Radiative Transfer for TOVS (RTTOV) Version 10 model together with ERA-Interim reanalysis data from the European Centre for Medium-Range Weather Forecasts. The resulting LSWTs were extensively compared with in situ measurements from lakes of various sizes between 14 and 580 km², and the resulting biases and RMSEs were found to be within the ranges of −0.5 to 0.6 K and 1.0 to 1.6 K, respectively. The upper limits of the reported errors can be attributed more to uncertainties in the comparison between in situ and satellite observations than to inaccuracies of the satellite retrieval. An inter-comparison with the standard Moderate Resolution Imaging Spectroradiometer (MODIS) land surface temperature product exhibits RMSEs and biases in the ranges of 0.6 to 0.9 K and −0.5 to 0.2 K, respectively. The cross-platform consistency of the retrieval was found to be within ~0.3 K. For one lake, the satellite-derived trend was compared with the trend of the in situ measurements and both were found to be similar; thus, orbital drift does not introduce artificial temperature trends into the data set. A comparison with LSWT derived through global sea surface temperature (SST) algorithms shows lower RMSEs and biases for the simulation-based approach. An ongoing project will apply the developed method to all of Europe to derive the climate signal of the last 30 years. The data are available at doi:10.1594/PANGAEA.831007.
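
A minimal sketch (toy match-up values, not the study's validation data) of the bias and RMSE statistics used above to compare satellite LSWT retrievals with in situ measurements:

```python
import numpy as np

def bias(sat: np.ndarray, insitu: np.ndarray) -> float:
    """Mean difference, satellite minus in situ (K)."""
    return float(np.mean(sat - insitu))

def rmse(sat: np.ndarray, insitu: np.ndarray) -> float:
    """Root-mean-square error between match-ups (K)."""
    return float(np.sqrt(np.mean((sat - insitu) ** 2)))

# Toy match-up pairs standing in for real satellite/in situ LSWT values (K).
rng = np.random.default_rng(2)
insitu = 285.0 + 5.0 * rng.random(100)
sat = insitu + rng.normal(loc=-0.2, scale=1.2, size=100)

print(f"bias = {bias(sat, insitu):+.2f} K, RMSE = {rmse(sat, insitu):.2f} K")
```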

Relevance: 100.00%

Abstract:

In this thesis, we develop an adaptive framework for Monte Carlo rendering, and more specifically for Monte Carlo Path Tracing (MCPT) and its derivatives. MCPT is attractive because it can handle a wide variety of light transport effects, such as depth of field, motion blur, indirect illumination, participating media and others, in an elegant and unified framework. However, MCPT is a sampling-based approach and is only guaranteed to converge in the limit, as the sampling rate grows to infinity. At finite sampling rates, MCPT renderings are often plagued by noise artifacts that can be visually distracting. The adaptive framework developed in this thesis leverages two core strategies to address noise artifacts in renderings: adaptive sampling and adaptive reconstruction. Adaptive sampling consists of increasing the sampling rate on a per-pixel basis to ensure that each pixel value is below a predefined error threshold. Adaptive reconstruction leverages the available samples on a per-pixel basis, aiming for an optimal trade-off between minimizing the residual noise artifacts and preserving the edges in the image. In our framework, we greedily minimize the relative Mean Squared Error (rMSE) of the rendering by iterating over sampling and reconstruction steps. Given an initial set of samples, the reconstruction step aims at producing the rendering with the lowest per-pixel rMSE, and the next sampling step then further reduces the rMSE by distributing additional samples according to the magnitude of the residual rMSE of the reconstruction. This iterative approach tightly couples the adaptive sampling and adaptive reconstruction strategies by ensuring that we densely sample only those regions of the image where adaptive reconstruction cannot properly resolve the noise. In a first implementation of our framework, we demonstrate the usefulness of our greedy error minimization using a simple reconstruction scheme leveraging a filterbank of isotropic Gaussian filters. In a second implementation, we integrate a powerful edge-aware filter that can adapt to the anisotropy of the image. Finally, in a third implementation, we leverage auxiliary feature buffers that encode scene information (such as surface normals, position, or texture) to improve the robustness of the reconstruction in the presence of strong noise.
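
A toy one-dimensional sketch of the greedy loop described above (illustrative only, not the thesis implementation: the renderer, the per-pixel error estimate and the Gaussian filterbank below are simplistic placeholders):

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def render(spp: np.ndarray, truth: np.ndarray, rng) -> np.ndarray:
    """Toy stand-in for a renderer: noisy estimate whose variance drops as 1/spp."""
    return truth + rng.normal(size=truth.shape) / np.sqrt(spp)

truth = np.sin(np.linspace(0, 6 * np.pi, 256)) ** 2
rng = np.random.default_rng(3)
spp = np.full(truth.shape, 4.0)        # initial samples per pixel
budget_per_iteration = 256.0           # extra samples to distribute each round

for _ in range(4):
    img = render(spp, truth, rng)
    pixel_var = 1.0 / spp              # noise variance of the per-pixel estimate
    candidates, mse_estimates = [], []
    for s in (0.5, 1.0, 2.0, 4.0):     # small filterbank of isotropic Gaussians
        filtered = gaussian_filter(img, sigma=s)
        # Variance left after filtering ~ pixel variance * sum of squared kernel
        # weights; squared bias ~ squared difference to the noisy input minus noise.
        var_term = pixel_var / (2.0 * np.sqrt(np.pi) * s)
        bias_sq = np.maximum((filtered - img) ** 2 - pixel_var, 0.0)
        candidates.append(filtered)
        mse_estimates.append(var_term + bias_sq)
    # Reconstruction: per pixel, keep the filter with the lowest estimated MSE.
    best = np.argmin(np.stack(mse_estimates), axis=0)
    recon = np.choose(best, candidates)
    residual = np.take_along_axis(np.stack(mse_estimates), best[None, :], axis=0)[0]
    # Sampling: distribute the next batch proportionally to the residual relative error.
    rel_error = residual / (recon ** 2 + 1e-2)
    spp += budget_per_iteration * rel_error / rel_error.sum()

print(f"mean samples/pixel: {spp.mean():.1f}, "
      f"relative MSE vs truth: {np.mean((recon - truth) ** 2 / (truth ** 2 + 1e-2)):.4f}")
```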

Relevance: 100.00%

Abstract:

We developed a model to calculate a quantitative risk score for individual aquaculture sites. The score indicates the risk of a site being infected with a specific fish pathogen (viral haemorrhagic septicaemia virus (VHSV), infectious haematopoietic necrosis virus or koi herpesvirus) and is intended to be used for risk-ranking sites to support surveillance for the demonstration of zone or member state freedom from these pathogens. The inputs to the model include a range of quantitative and qualitative estimates of risk factors organised into five risk themes: (1) live fish and egg movements; (2) exposure via water; (3) on-site processing; (4) short-distance mechanical transmission; and (5) distance-independent mechanical transmission. The calculated risk score for an individual aquaculture site is a value between zero and one and is intended to indicate the risk of a site relative to the risk of other sites (thereby allowing ranking). The model was applied to 76 rainbow trout farms in three countries (42 from England, 32 from Italy and 2 from Switzerland) with the aim of establishing their risk of being infected with VHSV. Risk scores for farms in England and Italy showed great variation, clearly enabling ranking. Scores ranged from 0.002 to 0.254 (mean 0.080) in England and from 0.011 to 0.778 (mean 0.130) in Italy, reflecting the diversity of the infection status of farms in these countries. Requirements for broader application of the model are discussed. Cost-efficient farm data collection is important to realise the benefits of a risk-based approach.
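
A minimal sketch (hypothetical weights and theme scores; the published model's scoring rules are not reproduced here) of how per-theme estimates can be combined into a single relative risk score in [0, 1] for ranking sites:

```python
# Hypothetical weighted aggregation of the five risk themes into a 0-1 score.
THEMES = [
    "live_fish_and_egg_movements",
    "exposure_via_water",
    "on_site_processing",
    "short_distance_mechanical",
    "distance_independent_mechanical",
]

def site_risk_score(theme_scores: dict[str, float],
                    weights: dict[str, float]) -> float:
    """Combine per-theme scores (each in [0, 1]) into a single relative score."""
    total_weight = sum(weights[t] for t in THEMES)
    return sum(weights[t] * theme_scores[t] for t in THEMES) / total_weight

farm = dict(zip(THEMES, (0.3, 0.1, 0.0, 0.2, 0.05)))       # illustrative farm
weights = dict(zip(THEMES, (0.4, 0.25, 0.1, 0.15, 0.1)))   # illustrative weights
print(f"relative risk score: {site_risk_score(farm, weights):.3f}")
```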

Relevance: 100.00%

Abstract:

Ocean biogeochemical and ecosystem processes are linked by net primary production (NPP) in the ocean's surface layer, where inorganic carbon is fixed by photosynthetic processes. Determinations of NPP are necessarily a function of phytoplankton biomass and its physiological status, but the estimation of these two terms from space has remained an elusive target. Here we present new satellite ocean color observations of phytoplankton carbon (C) and chlorophyll (Chl) biomass and show that derived Chl:C ratios closely follow anticipated physiological dependencies on light, nutrients, and temperature. With this new information, global estimates of phytoplankton growth rates (μ) and carbon-based NPP are made for the first time. Compared to an earlier chlorophyll-based approach, our carbon-based values are considerably higher in tropical oceans, show greater seasonality at middle and high latitudes, and illustrate important differences in the formation and demise of regional algal blooms. This fusion of emerging concepts from the phycological and remote sensing disciplines has the potential to fundamentally change how we model and observe carbon cycling in the global oceans.
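
A schematic sketch of a carbon-based NPP calculation of the general form described above: the growth rate is inferred from the Chl:C ratio (a proxy for physiological state) and NPP is the product of carbon biomass and growth rate. The functional form and constants here are illustrative placeholders, not the published algorithm.

```python
def growth_rate(chl_to_c: float, mu_max: float = 2.0) -> float:
    """Growth rate (1/day) scaled by how close Chl:C is to a nominal maximum."""
    chl_to_c_max = 0.03  # illustrative upper bound on Chl:C (g Chl / g C)
    return mu_max * min(chl_to_c / chl_to_c_max, 1.0)

def carbon_based_npp(carbon_biomass: float, chl: float) -> float:
    """NPP (mg C m^-3 d^-1) = phytoplankton carbon biomass * growth rate."""
    return carbon_biomass * growth_rate(chl / carbon_biomass)

print(f"NPP = {carbon_based_npp(carbon_biomass=25.0, chl=0.4):.1f} mg C m^-3 d^-1")
```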

Relevance: 100.00%

Abstract:

In the complex landscape of public education, participants at all levels are searching for policy and practice levers that can raise overall performance and close achievement gaps. The collection of articles in this edition of the Journal of Applied Research on Children takes a big step toward providing the tools and tactics needed for an evidence-based approach to educational policy and practice.

Relevance: 100.00%

Abstract:

The Obama administration's recurring policy emphasis on high-performing charter schools raises the obvious question: how do you identify a high-performing charter school? That is a crucially important policy question, because any evaluation strategy that incorrectly identifies charter school performance could have negative effects on the economically and/or academically disadvantaged students who frequently attend charter schools. If low-performing schools are mislabeled and allowed to persist or encouraged to expand, then students may be harmed directly. If high-performing schools are driven from the market by misinformation, then students will lose access to programs and services that can make a difference in their lives. Most of the scholarly analysis to date has focused on comparing the performance of students in charter schools to that of similar students in traditional public schools (TPS). By design, that research measures charter school performance only in relative terms: charter schools that outperform similarly situated but low-performing TPSs have positive effects, even if the charter schools are mediocre in an absolute sense. This analysis describes strategies for identifying high-performing charter schools by comparing charter schools with one another. We begin by describing salient characteristics of Texas charter schools. We follow that discussion with a look at how other researchers across the country have compared charter school effectiveness with TPS effectiveness. We then present several metrics that can be used to identify high-performing charter schools. Those metrics are not mutually exclusive (one could easily justify using multiple measures to evaluate school effectiveness), but they are also not equally informative. If the goal is to measure the contributions that schools are making to student knowledge and skills, then a value-added approach like the ones highlighted in this report is clearly superior to a levels-based approach like that taken under the current accountability system.
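
A toy contrast (fabricated numbers, purely to illustrate the distinction, not the metrics in the report) between a levels-based view, which looks at where students end up, and a crude value-added view, which looks at growth during the school year:

```python
# school: (mean prior-year score, mean current-year score) -- fabricated values
schools = {
    "Charter A": (35.0, 48.0),   # low-scoring intake, large gains
    "Charter B": (78.0, 80.0),   # high-scoring intake, small gains
}

for name, (prior, current) in schools.items():
    levels = current                 # levels-based: where students end up
    value_added = current - prior    # crude value-added proxy: growth over the year
    print(f"{name}: levels = {levels:.0f}, growth (value-added proxy) = {value_added:+.0f}")
```

On a levels-based measure Charter B looks stronger, while the growth measure suggests Charter A contributed more to its students, which is the distinction drawn above.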

Relevance: 100.00%

Abstract:

This exploratory, qualitative study examined practitioners' perceptions of family preservation practice. The findings reveal a wide range of identified strengths as well as limitations of such a model. Interestingly, the most frequently identified strengths were value-based rather than practice-based in perspective, whereas the limitations were practice-based. Keeping families together was the most commonly perceived strength, but concern about children's safety when keeping the family intact was a frequently reported limitation. Further, a lack of support and a lack of theoretical clarity were identified as considerable limitations. The implications suggest that these practitioners (mostly child welfare/mental health workers) believe in the approach for the sake of keeping families together but are concerned with endangering the child in the process and recognize the need for theoretical guidance.

Relevance: 100.00%

Abstract:

This paper examines the impact of China's recent rise on the development of local firms in latecomer developing countries. Based on a detailed analysis of Vietnam's motorcycle industry, the paper argues that China's impact may go beyond what a trade analysis suggests. Indeed, China's rise induced a dynamic transformation in the structure of value chains within Vietnam's motorcycle industry, with far-reaching consequences for the development and upgrading trajectories of local firms. The implications of the case study for the wider "global value chain" approach are also discussed.

Relevance: 100.00%

Abstract:

Proof-carrying code is a general methodology for certifying that the execution of an untrusted mobile code is safe, according to a predefined safety policy. The basic idea is that the code supplier attaches a certificate (or proof) to the mobile code, which the consumer then checks in order to ensure that the code is indeed safe. The potential benefit is that the consumer's task is reduced from the level of proving to the level of checking, a much simpler task. Recently, the abstract interpretation techniques developed in logic programming have been proposed as a basis for proof-carrying code [1]. To this end, the certificate is generated from an abstract interpretation-based proof of safety. Intuitively, the verification condition is extracted from a set of assertions guaranteeing safety and the answer table generated during the analysis. Given this information, it is relatively simple and fast to verify that the code does meet this proof and that its execution is therefore safe. This extended abstract reports on experiments that illustrate several issues involved in abstract interpretation-based code certification. First, we describe the implementation of our system in the context of CiaoPP, the preprocessor of the Ciao multi-paradigm (constraint) logic programming system. Then, by means of some experiments, we show how code certification is supported in the implementation of the framework. Finally, we discuss the application of our method in the area of pervasive systems, which may lack the necessary computing resources to verify safety on their own. We illustrate the relevance of the information inferred by existing cost analyses for controlling resource usage in this context. Moreover, since the (rather complex) analysis phase is replaced by a simpler, efficient checking process at the code consumer side, we believe that our abstract interpretation-based approach to proof-carrying code becomes practically applicable to this kind of system.
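
A minimal sketch of the producer/consumer asymmetry (illustrative only: CiaoPP certifies logic programs over abstract domains, whereas the toy below uses a hand-rolled interval analysis of a counter loop). The supplier runs a fixpoint analysis to produce the certificate, while the consumer only performs a cheap one-pass check that the certificate is consistent and implies the safety assertion:

```python
# Toy program: x starts at 0 and the loop body adds 1 while x < 10.
# Safety policy: x never exceeds 10.

def analyze() -> tuple[int, int]:
    """Producer side: iterate an interval abstraction to a fixpoint (the costly part)."""
    lo, hi = 0, 0
    while True:
        new_hi = min(hi, 9) + 1            # abstract loop body under the guard x < 10
        if max(hi, new_hi) == hi:
            return lo, hi                  # fixpoint reached: this is the certificate
        hi = max(hi, new_hi)

def check(certificate: tuple[int, int]) -> bool:
    """Consumer side: a single, cheap consistency check of the shipped certificate."""
    lo, hi = certificate
    contains_init = lo <= 0 <= hi          # the initial state x = 0 is covered
    inductive = lo <= min(hi, 9) + 1 <= hi # the interval is closed under one loop step
    safe = hi <= 10                        # the interval implies the safety assertion
    return contains_init and inductive and safe

cert = analyze()            # shipped alongside the code
print(cert, check(cert))    # (0, 10) True
```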

Relevance: 100.00%

Abstract:

Sensor networks are increasingly becoming one of the main sources of Big Data on the Web. However, the observations that they produce are made available with heterogeneous schemas, vocabularies and data formats, making it difficult to share and reuse these data for purposes other than those for which they were originally set up. In this thesis we address these challenges, considering how we can transform streaming raw data into rich ontology-based information that is accessible through continuous queries for streaming data. Our main contribution is an ontology-based approach for providing data access and query capabilities to streaming data sources, allowing users to express their needs at a conceptual level, independent of implementation and language-specific details. We introduce novel query rewriting and data translation techniques that rely on mapping definitions relating streaming data models to ontological concepts. Specific contributions include:

• The syntax and semantics of the SPARQLStream query language for ontology-based data access, and a query rewriting approach for transforming SPARQLStream queries into streaming algebra expressions.

• The design of an ontology-based streaming data access engine that can internally reuse an existing data stream engine, complex event processor or sensor middleware, using R2RML mappings to define relationships between streaming data models and ontology concepts (see the sketch below).

Concerning the sensor metadata of such streaming data sources, we have investigated how we can use raw measurements to characterize streaming data, producing enriched data descriptions in terms of ontological models. Our specific contributions are:

• A representation of sensor data time series that captures gradient information that is useful for characterizing types of sensor data.

• A method for classifying sensor data time series and determining the type of data using data mining techniques, and a method for extracting semantic sensor metadata features from the time series.
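
A minimal sketch (hypothetical mapping format, not R2RML syntax or the SPARQLStream engine) of the core idea behind mapping-driven data translation: a declarative mapping relates the fields of a raw streaming tuple to ontology terms, and a generic translator turns each incoming tuple into triples so that queries can be posed at the conceptual level:

```python
from typing import Iterator

# Hypothetical mapping: tuple fields -> ontology terms (SOSA vocabulary used as an example).
MAPPING = {
    "subject_template": "http://example.org/observation/{obs_id}",
    "class": "http://www.w3.org/ns/sosa/Observation",
    "fields": {
        "temp_c": "http://example.org/ontology/hasTemperature",
        "sensor": "http://www.w3.org/ns/sosa/madeBySensor",
    },
}

def translate(stream: Iterator[dict]) -> Iterator[tuple[str, str, object]]:
    """Apply the mapping to each raw tuple, yielding (subject, predicate, object) triples."""
    rdf_type = "http://www.w3.org/1999/02/22-rdf-syntax-ns#type"
    for row in stream:
        subject = MAPPING["subject_template"].format(**row)
        yield subject, rdf_type, MAPPING["class"]
        for field, predicate in MAPPING["fields"].items():
            yield subject, predicate, row[field]

raw_stream = iter([{"obs_id": 1, "temp_c": 21.5, "sensor": "ws-42"}])
for triple in translate(raw_stream):
    print(triple)
```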

Relevance: 100.00%

Abstract:

The conceptual design phase is only partially supported by product lifecycle management/computer-aided design (PLM/CAD) systems, causing a discontinuity in the design information flow: customer needs → functional requirements → key characteristics → design parameters (DPs) → geometric DPs. Aiming to address this issue, a knowledge-based approach is proposed to integrate quality function deployment, failure mode and effects analysis, and axiomatic design into a commercial PLM/CAD system. A case study, the main subject of this article, was carried out to validate the proposed process, to evaluate through a pilot development how the commercial PLM/CAD modules and application programming interface could support the information flow, and, based on the pilot results, to propose a full development framework.
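
A minimal sketch (hypothetical classes and attributes, not the article's PLM/CAD data model) of the design information flow that the proposed approach aims to keep connected, from customer needs down to geometric design parameters linked to CAD features:

```python
from dataclasses import dataclass, field

@dataclass
class GeometricDP:
    name: str
    cad_feature: str          # link into the CAD model

@dataclass
class DesignParameter:
    name: str
    geometric_dps: list[GeometricDP] = field(default_factory=list)

@dataclass
class KeyCharacteristic:
    name: str
    severity: int             # e.g. an FMEA-style severity rating
    design_parameters: list[DesignParameter] = field(default_factory=list)

@dataclass
class FunctionalRequirement:
    statement: str
    key_characteristics: list[KeyCharacteristic] = field(default_factory=list)

@dataclass
class CustomerNeed:
    statement: str
    importance: int           # e.g. a QFD-style importance weight
    functional_requirements: list[FunctionalRequirement] = field(default_factory=list)

# Build one illustrative chain end-to-end.
gdp = GeometricDP("arm_length", cad_feature="Sketch1/L1")
dp = DesignParameter("Lever arm length", geometric_dps=[gdp])
kc = KeyCharacteristic("Lever ratio", severity=7, design_parameters=[dp])
fr = FunctionalRequirement("Operating force below 15 N", key_characteristics=[kc])
need = CustomerNeed("Easy one-hand operation", importance=5, functional_requirements=[fr])
print(need.functional_requirements[0].key_characteristics[0].design_parameters[0].geometric_dps[0])
```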

Relevance: 100.00%

Abstract:

By 2050 it is estimated that the number of Alzheimer's disease (AD) patients worldwide will quadruple from the current 36 million. To date, no single test prior to postmortem examination can confirm that a person suffers from AD. Therefore, there is a strong need for accurate and sensitive tools for the early diagnosis of AD. The complex etiology and multifactorial pathogenesis of AD call for a system-level understanding of the currently available biomarkers and the study of new biomarkers via network-based modeling of heterogeneous data types. In this review, we summarize recent research on the study of AD as a connectivity syndrome. We argue that a network-based approach to biomarker discovery will provide key insights for fully understanding the network degeneration hypothesis (the disease starts in specific network areas and progressively spreads to areas connected to the initial loci), with a potential impact on early diagnosis and disease-modifying treatments. We introduce a new framework for the quantitative study of biomarkers that can help shorten the transition between academic research and clinical diagnosis in AD.
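
A toy sketch of the network degeneration hypothesis (illustrative only, not a published AD model): pathology seeded at one node of a small connectivity graph spreads along its edges through a simple diffusion process:

```python
import numpy as np

adjacency = np.array([            # hypothetical 4-region connectivity matrix
    [0, 1, 1, 0],
    [1, 0, 1, 0],
    [1, 1, 0, 1],
    [0, 0, 1, 0],
], dtype=float)

# The graph Laplacian governs diffusion on the network.
laplacian = np.diag(adjacency.sum(axis=1)) - adjacency

burden = np.array([1.0, 0.0, 0.0, 0.0])         # pathology seeded in region 0
dt = 0.05
for _ in range(100):
    burden = burden - dt * laplacian @ burden   # explicit Euler diffusion step

print(np.round(burden, 3))   # pathology has spread toward connected regions
```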