862 results for Integrated user model


Relevance:

30.00%

Publisher:

Abstract:

This case study introduces our continuing work to enhance the virtual classroom in order to provide faculty and students with an environment open to their needs, compliant with learning standards (and therefore compatible with other e-learning environments), and based on open source software. The result is a modular, sustainable and interoperable learning environment that can be adapted to different teaching and learning situations by incorporating the LMS's integrated tools as well as wikis, blogs, forums and Moodle activities, among others.

Relevance:

30.00%

Publisher:

Abstract:

This article presents preliminary findings from a research study conducted by the Institute for the Study of Knowledge Management in Education on the role of open educational resources (OER) in transforming pedagogy. Based on a study of art and humanities teachers participating in an OER training network, the study reveals how exposure to OER and related tools supports collaboration among teachers, as well as new conversations about teaching practices. These findings have implications for engaging teachers in adopting new OER practices, and for how OER can be integrated as a model for innovation in teaching and in resource development.

Relevance:

30.00%

Publisher:

Abstract:

BACKGROUND. Bioinformatics is commonly presented as a well-assorted list of available web resources. Although diversity of services is positive in general, the proliferation of tools, together with their dispersion and heterogeneity, complicates the integrated exploitation of this data-processing capacity. RESULTS. To facilitate the construction of software clients and the integrated use of this variety of tools, we present a modular programmatic application interface (MAPI) that provides the functionality needed for a uniform representation of Web Service metadata descriptors, including the management and invocation protocols of the services they represent. This document describes the main functionality of the framework and how it can be used to facilitate the deployment of new software under a unified structure of bioinformatics Web Services. A notable feature of MAPI is the organization of its functionality into modules associated with specific tasks: only the modules needed by a client have to be installed, and module functionality can be extended without rewriting the software client. CONCLUSIONS. The utility and versatility of the library have been demonstrated by implementing several currently available clients that cover different aspects of integrated data processing, ranging from service discovery to service invocation, with advanced features such as workflow composition and asynchronous calls to multiple types of Web Services, including those registered in repositories (e.g. GRID-based, SOAP, BioMOBY, R-bioconductor and others).
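The modular organization described above can be illustrated with a small sketch. MAPI itself is a Java framework; the Python below is only an illustration of the design idea, and every name in it (`ServiceDescriptor`, `ModuleRegistry`, the fake SOAP invoker) is hypothetical: a uniform descriptor for heterogeneous services, plus a registry where a client installs only the protocol modules it needs and new modules extend it without rewriting the client.

```python
# Illustrative sketch (not the actual MAPI API): uniform service metadata
# plus pluggable per-protocol invocation modules.
from dataclasses import dataclass
from typing import Callable, Dict

@dataclass
class ServiceDescriptor:
    """Uniform metadata for a Web Service, regardless of its protocol."""
    name: str
    protocol: str          # e.g. "SOAP", "BioMOBY", "REST"
    endpoint: str

class ModuleRegistry:
    """Holds task-specific modules; clients register only what they need."""
    def __init__(self):
        self._invokers: Dict[str, Callable[[ServiceDescriptor, dict], dict]] = {}

    def register(self, protocol: str, invoker) -> None:
        # Adding a module extends functionality without touching client code.
        self._invokers[protocol] = invoker

    def invoke(self, service: ServiceDescriptor, params: dict) -> dict:
        try:
            invoker = self._invokers[service.protocol]
        except KeyError:
            raise RuntimeError(f"no module installed for {service.protocol}")
        return invoker(service, params)

# A toy "SOAP" module: a real one would build and send a SOAP envelope.
def fake_soap_invoker(service, params):
    return {"service": service.name, "echo": params}

registry = ModuleRegistry()
registry.register("SOAP", fake_soap_invoker)

blast = ServiceDescriptor("runBlast", "SOAP", "http://example.org/blast")
result = registry.invoke(blast, {"sequence": "ACGT"})
```

The point of the sketch is the decoupling: the client only ever talks to the registry through the uniform descriptor, so swapping or adding protocols is a registration call, not a rewrite.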

Relevance:

30.00%

Publisher:

Abstract:

Uncertainty quantification of petroleum reservoir models is a present challenge, usually approached with a wide range of geostatistical tools linked with statistical optimisation and/or inference algorithms. Recent advances in machine learning offer a novel alternative to geostatistics for modelling the spatial distribution of petrophysical properties in complex reservoirs. The approach is based on semi-supervised learning, which handles both "labelled" observed data and "unlabelled" data that have no measured value but describe prior knowledge and other relevant information, in the form of manifolds in the input space over which the modelled property is continuous. The proposed semi-supervised Support Vector Regression (SVR) model has demonstrated its capability to represent realistic geological features and to describe the stochastic variability and non-uniqueness of spatial properties. It is also able to capture and preserve key spatial dependencies, such as the connectivity of high-permeability geo-bodies, which is often difficult in contemporary petroleum reservoir studies. As a data-driven algorithm, semi-supervised SVR is designed to integrate various kinds of conditioning information and learn dependencies from them, and it can balance signal and noise levels and control the prior belief in the available data. In this work, the stochastic semi-supervised SVR geomodel is integrated into a Bayesian framework to quantify the uncertainty of reservoir production with multiple models fitted to past dynamic observations (production history). Multiple history-matched models are obtained using stochastic sampling and/or MCMC-based inference algorithms, which evaluate the posterior probability distribution. Model uncertainty is described by the posterior probability of the parameters that represent key geological properties: spatial correlation size, continuity strength, and the smoothness/variability of the spatial property distribution.
The developed approach is illustrated with a fluvial reservoir case. The resulting probabilistic production forecasts are described by uncertainty envelopes. The paper compares the performance of the models with different combinations of unknown parameters and discusses sensitivity issues.
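The MCMC-based history-matching step mentioned above can be sketched in miniature. The snippet below is a toy Metropolis sampler for the posterior over a single geological parameter given noisy "production history"; the exponential forward model, the prior range, the noise level and all numbers are illustrative assumptions standing in for the actual reservoir simulator and SVR geomodel.

```python
# Toy Metropolis sampler: posterior over one parameter ("theta", playing the
# role of a spatial correlation size) fitted to synthetic production history.
import numpy as np

rng = np.random.default_rng(0)

def forward(theta, t):
    """Hypothetical stand-in for the reservoir simulator: a production
    decline whose rate depends on the geological parameter theta."""
    return 100.0 * np.exp(-t / theta)

t_obs = np.linspace(1.0, 10.0, 20)
true_theta = 5.0
sigma = 2.0                                   # observation noise level
y_obs = forward(true_theta, t_obs) + rng.normal(0.0, sigma, t_obs.size)

def log_post(theta):
    if not (0.5 < theta < 50.0):              # uniform prior support
        return -np.inf
    r = y_obs - forward(theta, t_obs)
    return -0.5 * np.sum((r / sigma) ** 2)    # Gaussian likelihood

samples = []
theta, lp = 2.0, log_post(2.0)
for _ in range(5000):
    prop = theta + rng.normal(0.0, 0.5)       # random-walk proposal
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:  # Metropolis accept/reject
        theta, lp = prop, lp_prop
    samples.append(theta)

posterior = np.array(samples[1000:])          # discard burn-in
```

The spread of `posterior` is the analogue of the paper's uncertainty envelope: each retained sample is one history-matched parameter value, and forecasts run from those samples would fan out into a probabilistic production forecast.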

Relevance:

30.00%

Publisher:

Abstract:

This paper relaxes the standard I(0) and I(1) assumptions typically made in the monetary VAR literature by considering a richer framework that encompasses these two processes as well as other fractionally integrated possibilities. First, a time-varying multivariate spectrum is estimated for post-WWII US data. Then, a structural fractionally integrated VAR (VARFIMA) is fitted to each of the resulting time-dependent spectra. In this way, both the coefficients of the VAR and the innovation variances are allowed to evolve freely. The model is employed to analyze inflation persistence and to evaluate the stance of US monetary policy. Our findings indicate a strong decline in the innovation variances during the great disinflation, consistent with the view that the good performance of the economy during the 1980s and 1990s is in part a tale of good luck. However, we also find evidence of a decline in inflation persistence, together with a stronger monetary response to inflation during the same period. This last result suggests that the Fed may still play a role in accounting for the observed differences in US inflation history. Finally, we conclude that previous evidence against drifting coefficients could be an artifact of restricting the parameters towards the stationary region. Keywords: monetary policy, inflation persistence, fractional integration, time-varying coefficients, VARFIMA. JEL Classification: E52, C32.
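The fractional integration that distinguishes a VARFIMA from a standard VAR rests on the operator (1 - L)^d: for d between 0 and 1 a series sits between the I(0) and I(1) poles. As a small grounding sketch (not the paper's estimator), the binomial-expansion weights of that operator, pi_0 = 1 and pi_k = pi_{k-1} (k - 1 - d) / k, can be computed and applied directly:

```python
# Fractional differencing (1 - L)^d via its truncated binomial expansion.
import numpy as np

def frac_diff_weights(d, n):
    """First n weights of the expansion of (1 - L)^d."""
    w = np.empty(n)
    w[0] = 1.0
    for k in range(1, n):
        w[k] = w[k - 1] * (k - 1 - d) / k
    return w

def frac_diff(x, d):
    """Apply (1 - L)^d to a series (expansion truncated at the sample start)."""
    w = frac_diff_weights(d, len(x))
    # output[k] = sum_j w[j] * x[k - j]
    return np.array([w[:k + 1][::-1] @ x[:k + 1] for k in range(len(x))])

x = np.arange(10, dtype=float) ** 2
d_half = frac_diff(x, 0.5)   # fractionally integrated middle ground
```

Sanity checks recover the two classical poles: d = 0 leaves the series unchanged (I(0) treatment) and d = 1 reproduces the first difference (I(1) treatment), so intermediate d values interpolate the persistence behaviour the paper exploits.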

Relevance:

30.00%

Publisher:

Abstract:

As a result of globalization and free trade agreements, international trade has grown enormously over the last few decades, inevitably putting more pressure on the environment. This has drawn the attention of both environmentalists and economists in response to the ever-growing concerns about climate change and the urgent need for international action on its mitigation. In this work we analyze the CO2 implications of international trade between Spain and its main partners using a multi-regional input-output (MRIO) model. A fully integrated 13-region MRIO model is constructed to examine Spain's pollution responsibility from both production and consumption perspectives. The empirical results show that Spain is a net importer of CO2 emissions, equivalent to 29% of its production-based emissions. Even though the leading partners by import value are countries such as Germany, France, Italy and Great Britain, the CO2 embodied in trade with China takes the largest share. This is mainly due to the importation of energy-intensive products from China, coupled with China's energy mix, which is dominated by coal-fired power plants. The largest portion (67%) of the total imported CO2 emissions is due to the intermediate demand requirements of production sectors. Products such as motor vehicles, chemicals, a variety of machinery and equipment, textile and leather products, and construction materials are the key imports that drive emissions in the respective exporting countries. The construction sector, at its peak in 2005, is the activity most responsible for both domestic and imported emissions.
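The consumption-based accounting behind an MRIO model reduces to standard Leontief algebra: with a multi-regional technical-coefficient matrix A, emission intensities f and a final-demand vector y, the emissions driven by y are f (I - A)^{-1} y. A minimal two-region, one-sector sketch (all numbers made up for illustration, not taken from the study) looks like this:

```python
# Embodied-emissions accounting with a toy 2-region MRIO table.
import numpy as np

A = np.array([[0.2, 0.1],          # inter-regional input coefficients
              [0.3, 0.2]])
f = np.array([0.5, 1.2])            # kg CO2 per unit of output in regions 1, 2
L = np.linalg.inv(np.eye(2) - A)    # Leontief inverse (I - A)^-1

y_r1 = np.array([10.0, 0.0])        # final demand located in region 1 only
x = L @ y_r1                        # total output required, at home and abroad
emissions_by_region = f * x         # where the CO2 is actually emitted
consumption_footprint = emissions_by_region.sum()
```

Even though final demand sits entirely in region 1, `emissions_by_region[1]` is positive: intermediate inputs imported from region 2 carry embodied CO2, which is exactly the mechanism that makes Spain a net importer of emissions in the study.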

Relevance:

30.00%

Publisher:

Abstract:

Dialogic learning and interactive groups have proved to be a useful methodological approach applied in educational situations for lifelong adult learners. The principles of this approach stress the importance of dialogue and equal participation, also when designing the training activities. This paper adopts these principles as the basis for a configurable template that can be integrated in runtime systems. The template is formulated as a meta-UoL which can be interpreted by IMS Learning Design players. This template serves as a guide to flexibly select and edit the activities at runtime (on the fly). The meta-UoL has been used successfully by a practitioner to create a real-life example, with positive and encouraging results.

Relevance:

30.00%

Publisher:

Abstract:

The development of susceptibility maps for debris flows is of primary importance due to population pressure in hazardous zones. However, hazard assessment by process-based modelling at a regional scale is difficult due to the complex nature of the phenomenon, the variability of local controlling factors, and the uncertainty in modelling parameters. A regional assessment must therefore rely on a simplified approach that is not highly parameter-dependent and that can provide zonation with minimum data requirements. A distributed empirical model has thus been developed for regional susceptibility assessments using essentially a digital elevation model (DEM). The model is called Flow-R, for Flow path assessment of gravitational hazards at a Regional scale (available free of charge at www.flow-r.org), and has been successfully applied to case studies in various countries with variable data quality. It provides a substantial basis for a preliminary susceptibility assessment at a regional scale. The model has also proved relevant for assessing other natural hazards such as rockfall, snow avalanches and floods. It allows for automatic source area delineation, given user criteria, and for the assessment of the propagation extent based on various spreading algorithms and simple frictional laws. We developed a new spreading algorithm, an improved version of Holmgren's direction algorithm, that is less sensitive to small variations of the DEM and avoids over-channelization, and so produces more realistic extents. The choice of datasets and algorithms is open to the user, which makes the model suitable for various applications and levels of dataset availability. Among the possible datasets, the DEM is the only one strictly needed for both source area delineation and propagation assessment, and its quality is of major importance for the accuracy of the results. We consider a 10 m DEM resolution a good compromise between processing time and quality of results; however, valuable results have still been obtained from lower-quality DEMs with 25 m resolution.
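To make the spreading idea concrete, here is a sketch of a Holmgren-style multiple-flow-direction step, the family of algorithms on which Flow-R's improved version builds (this is the classic Holmgren formulation, not Flow-R's modified one): flow leaving a cell is shared among its lower neighbours in proportion to slope^x, where the exponent x controls channelization (large x approaches single-direction D8 flow). Grid values and the exponent are illustrative.

```python
# One spreading step of a Holmgren-style multiple flow direction algorithm.
import numpy as np

def holmgren_weights(dem, i, j, x=4.0, cellsize=10.0):
    """Fraction of flow passed to each of the 8 neighbours of cell (i, j)."""
    weights = np.zeros((3, 3))
    for di in (-1, 0, 1):
        for dj in (-1, 0, 1):
            if di == dj == 0:
                continue
            dist = cellsize * np.hypot(di, dj)       # diagonal = sqrt(2) * cell
            slope = (dem[i, j] - dem[i + di, j + dj]) / dist
            if slope > 0:                            # downslope neighbours only
                weights[di + 1, dj + 1] = slope ** x
    total = weights.sum()
    return weights / total if total > 0 else weights

dem = np.array([[9.0, 8.0, 9.0],
                [9.0, 7.0, 5.0],
                [9.0, 6.0, 4.0]])
w = holmgren_weights(dem, 1, 1)
```

With x = 4 most of the flow follows the steepest descent (towards the 4.0 cell), while the other downslope neighbours still receive a share; lowering x spreads the flow more evenly, which is the knob that trades channelization against lateral spreading.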

Relevance:

30.00%

Publisher:

Abstract:

This project explores the user costs and benefits of winter road closures. Severe winter weather makes travel unsafe and dramatically increases crash rates. When conditions become unsafe due to winter weather, road closures allow users to avoid crash costs and eliminate the costs associated with rescuing stranded motorists; the benefits of road closures are therefore the avoided safety costs. The costs of road closures are the delays imposed on motorists and motor carriers who would have made the trip had the road not been closed. This project investigated these costs and benefits and found that evaluating them is not as simple as it appears. To better understand them, the project reviews the literature, conducts interviews with shippers and motor carriers, and conducts case studies of road closures to determine what actually occurred on roadways during closures. The project also estimates a statistical model that relates weather severity to crash rates. The statistical model is intended primarily to illustrate that measurable and predictable weather conditions can be quantitatively related to the safety performance of a roadway. In the future, weather conditions such as snowfall intensity, visibility, etc., can be used to make objective measures of the safety performance of a roadway rather than relying on the subjective evaluations of field staff. The review of the literature and the interviews clearly illustrate that not all delays (increased travel time) are valued the same. Expected (routine) delays are valued at the generalized cost (the value of the driver's time, fuel, insurance, wear and tear on the vehicle, etc.), but unexpected delays are valued much higher because they interrupt synchronous activities at the trip's destination.
To reduce the costs of delays resulting from road closures, public agencies should communicate as early as possible the likelihood of a road closure.
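The closure trade-off described above can be written as a back-of-the-envelope calculation: close the road when the expected avoided crash costs exceed the delay costs imposed on would-be travellers. Every number below is a hypothetical placeholder, not a value from the study, and the simple linear form ignores the expected-versus-unexpected delay valuation the interviews highlight.

```python
# Toy net-benefit calculation for a closure decision.
def closure_net_benefit(vehicles, crash_rate_per_vehicle, cost_per_crash,
                        delay_hours, value_of_time_per_hour):
    """Benefit of closing = avoided safety costs; cost = delay imposed."""
    avoided_safety_cost = vehicles * crash_rate_per_vehicle * cost_per_crash
    delay_cost = vehicles * delay_hours * value_of_time_per_hour
    return avoided_safety_cost - delay_cost

# Severe storm: elevated crash risk makes closure worthwhile.
severe = closure_net_benefit(2000, 0.002, 50_000, 3.0, 25.0)
# Light snow: low crash risk, closure would cost more than it saves.
light = closure_net_benefit(2000, 0.0001, 50_000, 3.0, 25.0)
```

Even this crude form shows why the evaluation "is not as simple as it appears": the sign of the decision flips entirely on the crash-rate estimate, which is exactly what the project's weather-to-crash-rate statistical model is meant to supply objectively.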

Relevance:

30.00%

Publisher:

Abstract:

In this article we present a hybrid approach to the automatic summarization of Spanish medical texts. There are many systems for automatic summarization based on statistics or on linguistics, but only a few that combine both techniques. Our idea is that to produce a good summary we need to use the linguistic aspects of texts, but we should also benefit from the advantages of statistical techniques. We have integrated the Cortex (vector space model) and Enertex (statistical physics) systems, coupled with the Yate term extractor, and the Disicosum system (linguistics). We compared these systems and then integrated them into a hybrid approach. Finally, we applied this hybrid system to a corpus of medical articles and evaluated its performance, obtaining good results.
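One common way to combine extractive summarizers of this kind is score fusion: each system scores every sentence, scores are normalized to a common range, averaged, and the top-ranked sentences form the summary. The sketch below illustrates that generic scheme; the two toy scorers are stand-ins, not the actual Cortex, Enertex or Disicosum algorithms.

```python
# Generic score-fusion sketch for hybrid extractive summarization.
def normalize(scores):
    """Min-max normalize so heterogeneous scorers are comparable."""
    lo, hi = min(scores), max(scores)
    return [(s - lo) / (hi - lo) if hi > lo else 0.0 for s in scores]

def hybrid_summary(sentences, scorers, n=2):
    all_scores = [normalize([f(s) for s in sentences]) for f in scorers]
    fused = [sum(col) / len(scorers) for col in zip(*all_scores)]
    ranked = sorted(range(len(sentences)), key=lambda i: fused[i], reverse=True)
    return [sentences[i] for i in sorted(ranked[:n])]   # keep document order

# Toy scorers: a "statistical" length score and a "linguistic" cue-phrase score.
statistical = lambda s: len(s.split())
linguistic = lambda s: 2.0 if "conclude" in s.lower() else 0.0

doc = ["Short intro.",
       "The method combines several scoring systems into one ranking.",
       "We conclude that the hybrid approach performs well.",
       "Minor aside."]
summary = hybrid_summary(doc, [statistical, linguistic])
```

The normalization step is what lets statistically and linguistically motivated scores, which live on very different scales, contribute evenly to the fused ranking.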

Relevance:

30.00%

Publisher:

Abstract:

This article discusses the theoretical model for assessing quality in health services proposed by Parasuraman, Zeithaml and Berry, designed to measure the degree of user satisfaction. The model is based on the analysis of the expectations and perceptions of users of health services along five dimensions: tangibility, reliability, responsiveness, assurance and empathy. From the difference between what the user expects and the service actually offered, gaps or shortcomings are derived that may be the main obstacle to users perceiving the service as being of quality. Studies of satisfaction that used the psychometric scale called SERVQUAL (Service Quality) obtained very favorable results in the institutions in which it was employed. The analysis revealed the need to improve existing evaluation models, as well as the importance of measuring user satisfaction in health institutions.
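The gap computation at the heart of SERVQUAL is simple enough to sketch directly: for each of the five dimensions the gap is the perception score minus the expectation score, and negative gaps flag where the service falls short. The 1-to-5 scores below are illustrative, not data from any study.

```python
# SERVQUAL gap scores: perception minus expectation per dimension.
DIMENSIONS = ["tangibility", "reliability", "responsiveness",
              "assurance", "empathy"]

def servqual_gaps(expectations, perceptions):
    return {d: perceptions[d] - expectations[d] for d in DIMENSIONS}

expectations = {"tangibility": 4.2, "reliability": 4.8, "responsiveness": 4.5,
                "assurance": 4.6, "empathy": 4.1}
perceptions = {"tangibility": 4.0, "reliability": 3.9, "responsiveness": 4.6,
               "assurance": 4.4, "empathy": 3.5}

gaps = servqual_gaps(expectations, perceptions)
worst = min(gaps, key=gaps.get)   # dimension with the largest shortfall
```

In this made-up example the largest shortfall is on reliability, which is where the article's argument would direct improvement effort first.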

Relevance:

30.00%

Publisher:

Abstract:

The problems arising in the logistics of commercial distribution are complex and involve several players and decision levels. One important decision is related to the design of the routes used to distribute products in an efficient and inexpensive way. This article explores three different distribution strategies: the first corresponds to the classical vehicle routing problem; the second is a master route strategy with daily adaptations; and the third takes cross-functional planning into account through a multi-objective model with two objectives. All strategies are analyzed in a multi-period scenario. A metaheuristic based on Iterated Local Search is used to solve the models related to each strategy, and a computational experiment is performed to evaluate the three strategies with respect to the two objectives. The cross-functional planning strategy leads to solutions that put the coordination between functional areas into practice and better meet business objectives.
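An Iterated Local Search alternates local optimization with a perturbation of the incumbent solution, keeping the best tour found. The skeleton below shows that loop on a toy single-vehicle tour (a TSP-like simplification of the routing models above); the 2-opt local search and random-swap perturbation are standard textbook moves, not the paper's exact operators.

```python
# Iterated Local Search skeleton on a toy single-vehicle tour.
import random

def tour_length(tour, dist):
    return sum(dist[tour[i]][tour[(i + 1) % len(tour)]]
               for i in range(len(tour)))

def two_opt(tour, dist):
    """Local search: keep reversing segments while it shortens the tour."""
    improved = True
    while improved:
        improved = False
        for i in range(1, len(tour) - 1):
            for j in range(i + 1, len(tour)):
                cand = tour[:i] + tour[i:j + 1][::-1] + tour[j + 1:]
                if tour_length(cand, dist) < tour_length(tour, dist):
                    tour, improved = cand, True
    return tour

def iterated_local_search(dist, iters=30, seed=0):
    rng = random.Random(seed)
    n = len(dist)
    best = two_opt(list(range(n)), dist)      # initial local optimum
    for _ in range(iters):
        pert = best[:]                        # perturb: swap two random stops
        i, j = rng.sample(range(n), 2)
        pert[i], pert[j] = pert[j], pert[i]
        cand = two_opt(pert, dist)            # re-optimize locally
        if tour_length(cand, dist) < tour_length(best, dist):
            best = cand                       # accept only improvements
    return best

# Four customers on a unit square; the optimal tour has length 4.
coords = [(0, 0), (0, 1), (1, 1), (1, 0)]
dist = [[((xa - xb) ** 2 + (ya - yb) ** 2) ** 0.5 for (xb, yb) in coords]
        for (xa, ya) in coords]
best = iterated_local_search(dist)
```

The perturbation is the key design choice: it must be strong enough to escape the basin of the current local optimum but weak enough that the next local search does not start from scratch.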

Relevance:

30.00%

Publisher:

Abstract:

Computed tomography (CT) is the standard imaging modality for tumor volume delineation in radiotherapy treatment planning for retinoblastoma, despite some inherent limitations. CT is very useful in providing information on physical density for dose calculation and morphological volumetric information, but it has low sensitivity in assessing tumor viability. On the other hand, 3D ultrasound (US) allows a highly accurate definition of the tumor volume thanks to its high spatial resolution, but it is currently used only for diagnosis and follow-up rather than being integrated into treatment planning. Our ultimate goal is the automatic segmentation of the gross tumor volume (GTV) in 3D US, the segmentation of the organs at risk (OAR) in CT, and the registration of both modalities. In this paper, we present some preliminary results in this direction. We present 3D active-contour-based segmentation of the eyeball and the lens in CT images; the approach incorporates prior knowledge of the anatomy by using a 3D geometrical eye model. The automated segmentation results are validated by comparison with manual segmentations. We then present two approaches for the fusion of 3D CT and US images: (i) a landmark-based transformation, and (ii) an object-based transformation that makes use of eyeball contour information in the CT and US images.
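A landmark-based transformation of the kind mentioned in (i) is typically recovered with the SVD-based Procrustes/Kabsch method: given matched landmark pairs in the two modalities, find the least-squares rigid transform (rotation R, translation t) aligning them. The sketch below demonstrates that standard method on synthetic landmarks; it is a generic illustration, not the paper's specific implementation.

```python
# Least-squares rigid registration from matched landmark pairs (Kabsch).
import numpy as np

def rigid_landmark_registration(src, dst):
    """Find R, t minimizing sum ||R @ src_i + t - dst_i||^2."""
    src_c = src - src.mean(axis=0)             # center both landmark sets
    dst_c = dst - dst.mean(axis=0)
    H = src_c.T @ dst_c                        # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))     # guard against reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = dst.mean(axis=0) - R @ src.mean(axis=0)
    return R, t

# Synthetic check: rotate + translate known landmarks, then recover the motion.
rng = np.random.default_rng(1)
src = rng.normal(size=(6, 3))                  # e.g. landmarks picked in US
angle = np.deg2rad(30)
R_true = np.array([[np.cos(angle), -np.sin(angle), 0.0],
                   [np.sin(angle),  np.cos(angle), 0.0],
                   [0.0, 0.0, 1.0]])
t_true = np.array([2.0, -1.0, 0.5])
dst = src @ R_true.T + t_true                  # same landmarks "seen" in CT
R_est, t_est = rigid_landmark_registration(src, dst)
```

With noise-free synthetic landmarks the recovery is exact up to machine precision; with real, imperfectly localized landmarks the same formula gives the best rigid fit in the least-squares sense, which is why the paper also explores the object-based alternative using eyeball contours.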

Relevance:

30.00%

Publisher:

Abstract:

SUMMARY: We present a tool designed for the visualization of large-scale genetic and genomic data, exemplified by results from genome-wide association studies. This software provides an integrated framework to facilitate the interpretation of SNP association studies in their genomic context. Gene annotations can be retrieved from Ensembl, linkage disequilibrium data downloaded from HapMap, and custom data imported in BED or WIG format. AssociationViewer integrates functionality for the aggregation or intersection of data tracks. It implements an efficient cache system and allows the display of several very large-scale genomic datasets. AVAILABILITY: The Java code for AssociationViewer is distributed under the GNU General Public Licence and has been tested on Microsoft Windows XP, Mac OS X and GNU/Linux operating systems. It is available from the SourceForge repository, which also includes Java Web Start, documentation and example data files.

Relevance:

30.00%

Publisher:

Abstract:

Aim: When planning SIRT using 90Y microspheres, the partition model is used to refine the activity calculated by the body surface area (BSA) method, to potentially improve the safety and efficacy of treatment. For partition-model dosimetry, accurate determination of the mean tumor-to-normal-liver ratio (TNR) is critical, since it directly impacts absorbed dose estimates. This work aimed at developing and assessing a reliable methodology for the calculation of 99mTc-MAA SPECT/CT-derived TNR values based on phantom studies. Materials and methods: The NEMA IQ phantom (6 hot spheres) and the Kyoto liver phantom, with different hot/background activity concentration ratios, were imaged on a SPECT/CT scanner (GE Infinia Hawkeye 4). For each reconstruction with the IQ phantom, TNR quantification was assessed in terms of relative recovery coefficients (RC), and image noise was evaluated in terms of the coefficient of variation (COV) in the filled background. RCs were compared for OSEM with Hann, Butterworth and Gaussian filters, as well as for FBP reconstruction. For OSEM, RCs were assessed by varying parameters independently, such as the number of iterations (i) and subsets (s) and the cut-off frequency of the filter (fc). The influence of attenuation and scatter corrections was also investigated. Furthermore, 2D-ROI and 3D-VOI contouring were compared; for this purpose, dedicated Matlab routines were developed in-house for automatic 2D-ROI/3D-VOI determination to reduce intra-user and intra-slice variability. The best reconstruction parameters and the RCs obtained with the IQ phantom were used to recover corrected TNR values for the Kyoto phantom for arbitrary hot-lesion sizes. In addition, we computed TNR volume histograms to better assess uptake heterogeneity. Results: The highest RCs were obtained with OSEM (i=2, s=10) coupled with the Butterworth filter (fc=0.8). We observed a global 20% RC improvement over the other OSEM settings and a 50% increase compared to the best FBP reconstruction. In all cases, both attenuation and scatter corrections must be applied, improving RC while preserving good image noise (COV<10%). 2D-ROI and 3D-VOI analyses lead to similar results; nevertheless, we recommend using 3D-VOIs, since tumor uptake regions are intrinsically three-dimensional. RC-corrected TNR values lie within 17% of the true value, substantially improving the evaluation of small-volume (<15 mL) regions. Conclusions: This study reports the multi-parameter optimization of 99mTc-MAA SPECT/CT image reconstruction for planning 90Y dosimetry in SIRT. In phantoms, accurate quantification of TNR was obtained using OSEM coupled with the Butterworth filter and RC correction.
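The recovery-coefficient correction used above addresses the partial-volume effect: small hot regions appear less intense than they are, so a measured uptake ratio is divided by a size-dependent RC taken from the phantom calibration. The sketch below shows one simplified form of that correction; the calibration curve values are illustrative, not the ones measured in this study, and dividing the whole TNR by the tumor RC is a simplification of the full procedure.

```python
# Simplified recovery-coefficient correction of a measured TNR.
import numpy as np

# Hypothetical phantom-derived calibration: RC versus sphere volume (mL).
cal_volumes = np.array([1.0, 2.5, 5.0, 10.0, 15.0, 25.0])
cal_rc = np.array([0.35, 0.50, 0.65, 0.78, 0.85, 0.92])

def corrected_tnr(measured_tnr, lesion_volume_ml):
    """Divide the measured ratio by the RC interpolated at the lesion size."""
    rc = np.interp(lesion_volume_ml, cal_volumes, cal_rc)
    return measured_tnr / rc

# A small 5 mL lesion whose measured TNR understates the true uptake ratio.
tnr = corrected_tnr(3.9, 5.0)
```

The correction matters most exactly where the study reports the largest gains: for small (<15 mL) regions the RC is far below 1, so the uncorrected TNR, and hence the partition-model dose estimate, would be substantially biased low.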