864 results for Information search – models


Relevance:

30.00%

Publisher:

Abstract:

The general uptake of M-technologies and M-services at Spanish universities is still not very high as the first decade of the 21st century comes to a close. Some universities and some of their libraries are beginning to experiment with M-technologies, but they remain far from a model of massive exploitation, further so than in some other countries. A deeper study would be needed to identify the main reasons, a study we do not undertake in this paper. This general picture does not mean that there are no significant initiatives at universities and their libraries that are starting to place their trust in M-technologies. Models based on M-technologies make more sense than ever in open universities and open libraries. That is why the UOC Library began its first experiences with M-technology and M-library developments in the late 1990s. In 1999 the available technology offered the opportunity to carry out a first pilot test with SMS, followed by the application of WAP technology. At that time we managed to link mobile phones to the OPAC through a WAP system that allowed users to search the catalogue by category, find the final location of a document, and obtain the address of the library from which it could be borrowed. Since then, UOC (and its library) has directed its efforts towards adapting its services to all sorts of M-devices used by end users. Having left WAP technology behind, the library is now experimenting with new devices such as e-books, and with new services to obtain more feedback through the OPAC and metalibrary search products. We present the case of the Open University of Catalonia on two levels: M-services applied in the library, and M-technologies applied in other university services and resources.

Relevance:

30.00%

Publisher:

Abstract:

The Internet is a fundamental part of the daily life of adolescents, who consider it a safe and confidential source of information on health matters. The aim is to describe the experience of Spanish adolescents searching for health information on the Internet. Methods: A cross-sectional study of 811 school-age adolescents in Granada was carried out. An adapted and piloted questionnaire, administered by trained personnel, was used. Sociodemographic and health variables were included, together with those concerning the conditions governing access to and use of information and communication technologies (ICT). Results: 811 adolescents were surveyed (99.38% response rate); the mean age was 17 years. Of these, 88% used the Internet; 57.5% used it on a daily or weekly basis and 38.7% used it occasionally. More than half the sample (55.7%) stated that they used the Internet to search for health-related information. The main problems reported in searching for e-health information were not knowing which web pages were good (54.8%) and a lack of confidence or search skills (23.2%). Conclusions: It seems plausible to claim that websites designed and managed by health services should occupy a predominant position among interventions specifically addressed to young people.

Relevance:

30.00%

Publisher:

Abstract:

This working paper seeks to establish a new field of research at the crossroads between migration flows and information and communication flows. Several factors make this perspective worth adopting. The central point is that contemporary international migration is embedded in the dynamics of the information society, following common models and interconnected dynamics. Consequently, information flows are beginning to be identified as key issues in migration policies. Moreover, there is a lack of empirical knowledge on the design of information networks and the use of information and communication technologies in migration contexts. This working paper also aims to serve as a source of hypotheses for further research.

Relevance:

30.00%

Publisher:

Abstract:

One of the unresolved questions of modern physics is the nature of Dark Matter. Strong experimental evidence suggests that the contribution of this elusive component to the energy budget of the Universe is quite significant, without, however, providing conclusive information about its nature. The most plausible scenario is that of weakly interacting massive particles (WIMPs), which includes a large class of non-baryonic Dark Matter candidates with a mass typically between a few tens of GeV and a few TeV, and a cross section of the order of the weak interactions. The search for Dark Matter particles using very high energy gamma-ray Cherenkov telescopes is based on the model in which WIMPs can self-annihilate, leading to the production of detectable species, such as photons. These photons are very energetic and, since they are undeflected by the Universe's magnetic fields, they can be traced straight back to the source of their creation. The downside of the approach is the great amount of background radiation, coming from conventional astrophysical objects, that usually hides clear signals of Dark Matter particle interactions. That is why a good choice of observational candidates is the crucial factor in the search for Dark Matter. With MAGIC (Major Atmospheric Gamma-ray Imaging Cherenkov Telescopes), a two-telescope ground-based system located on La Palma, Canary Islands, we choose objects such as dwarf spheroidal satellite galaxies of the Milky Way and galaxy clusters for our search. Our idea is to increase the chances of WIMP detection by pointing at objects that are relatively close, contain a great amount of Dark Matter, and suffer as little pollution from stars as possible. At the moment, several observation projects are ongoing and analyses are being performed.

Relevance:

30.00%

Publisher:

Abstract:

Despite the wealth of information generated by trans-disciplinary research in Chagas disease, knowledge about its multifaceted pathogenesis is still fragmented. Here we review the body of experimental studies in animal models supporting the concept that persistent infection by Trypanosoma cruzi is crucial for the development of chronic myocarditis. Complementing this review, we make an effort to reconcile seemingly contradictory results concerning the immune profiles of chronic patients from Argentina and Brazil. Finally, we review the results of molecular studies suggesting that parasite-induced inflammation and tissue damage are, at least in part, mediated by the activities of trans-sialidase, mucin-linked lipid anchors (TLR2 ligands) and cruzipain (a kinin-releasing cysteine protease). One hundred years after the discovery of Chagas disease, it is reassuring that basic and clinical research tend to converge, raising new perspectives for the treatment of chronic Chagas disease.

Relevance:

30.00%

Publisher:

Abstract:

In this paper we present a novel structure-from-motion (SfM) approach able to infer 3D deformable models from uncalibrated stereo images. Using a stereo setup dramatically improves the 3D model estimation when the observed 3D shape is mostly deforming without undergoing strong rigid motion. Our approach first calibrates the stereo system automatically and then computes a single metric rigid structure for each frame. Afterwards, these 3D shapes are aligned to a reference view using a RANSAC method in order to compute the mean shape of the object and to select the subset of points on the object that have remained rigid throughout the sequence without deforming. The selected rigid points are then used to compute frame-wise shape registration and to extract the motion parameters robustly from frame to frame. Finally, all this information is used in a global optimization stage with bundle adjustment, which allows us to refine the frame-wise initial solution and also to recover the non-rigid 3D model. We show results on synthetic and real data that prove the performance of the proposed method even when there is no rigid motion in the original sequence.
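The rigid-point selection step described in this abstract can be sketched as follows. This is an illustrative toy example, not the authors' implementation: it runs RANSAC over minimal 3-point samples, fitting each with the Kabsch algorithm, to find which points moved rigidly between two shapes; all data and parameter values are invented.

```python
import numpy as np

def procrustes(A, B):
    """Least-squares rotation R and translation t aligning points A to B
    (both N x 3), via SVD of the cross-covariance (Kabsch algorithm)."""
    ca, cb = A.mean(axis=0), B.mean(axis=0)
    U, _, Vt = np.linalg.svd((A - ca).T @ (B - cb))
    d = np.sign(np.linalg.det(Vt.T @ U.T))   # guard against reflections
    R = Vt.T @ np.diag([1, 1, d]) @ U.T
    return R, cb - R @ ca

def ransac_rigid(A, B, iters=200, tol=0.05, rng=None):
    """RANSAC over minimal 3-point samples: keep the rigid transform with
    the most inliers, i.e. the points that moved rigidly from A to B."""
    if rng is None:
        rng = np.random.default_rng(0)
    best = np.zeros(len(A), dtype=bool)
    for _ in range(iters):
        idx = rng.choice(len(A), 3, replace=False)
        R, t = procrustes(A[idx], B[idx])
        inliers = np.linalg.norm((A @ R.T + t) - B, axis=1) < tol
        if inliers.sum() > best.sum():
            best = inliers
    return best

# Toy shape: 40 points; the first 30 move rigidly, the last 10 deform.
rng = np.random.default_rng(3)
A = rng.normal(size=(40, 3))
angle = 0.3
R_true = np.array([[np.cos(angle), -np.sin(angle), 0],
                   [np.sin(angle),  np.cos(angle), 0],
                   [0, 0, 1]])
B = A @ R_true.T + np.array([0.5, -0.2, 0.1])
B[30:] += rng.normal(scale=0.5, size=(10, 3))   # non-rigid points

rigid = ransac_rigid(A, B)   # boolean mask of the rigid subset
```

In the paper's setting the inlier set would then feed the frame-wise registration and the bundle-adjustment refinement.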

Relevance:

30.00%

Publisher:

Abstract:

The Andalusian Public Health System (Sistema Sanitario Público de Andalucía, SSPA) Repository is the open environment where all the scientific output generated by SSPA professionals, resulting from their medical care, research and administrative activities, is comprehensively collected and managed. This repository possesses special features which determined its development: the organization of the SSPA and its purpose as a health institution, the specific sets of documents that it generates, and the stakeholders involved in it. The repository runs on DSpace 1.6.2, to which several changes were made in order to achieve the SSPA's initial goals and requirements. The main changes were: the addition of specific qualifiers to the Dublin Core metadata scheme, the modification of the submission form, the integration of the MeSH thesaurus as a controlled vocabulary, and the optimization of the advanced search tool. Another key point during the setting up of the repository was the initial batch ingest of the documents.

Relevance:

30.00%

Publisher:

Abstract:

In this work, a new method is proposed to estimate the quality of the final product in batch processes in real time. This method makes it possible to reduce the time needed to obtain the quality results from laboratory analyses. A principal component analysis (PCA) model built with historical data under normal operating conditions is used to discern whether a finished batch is normal or not. A fault signature is computed for the abnormal batches and passed through a classification model for its estimation. The study proposes a method for using the information in contribution plots based on fault signatures, where the indicators represent the behaviour of the variables throughout the different stages of the process. A dataset composed of the fault signatures of historical abnormal batches is built in order to search for patterns and train the classification models that estimate the results of future batches. The proposed methodology has been applied to a sequencing batch reactor (SBR). Several classification algorithms are tested to demonstrate the possibilities of the proposed methodology.
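The PCA-based detection step described above can be sketched as follows. This is an illustrative toy example under assumed conventions (Q-statistic detection, squared residuals as per-variable contributions), not the authors' code; the data, component count and injected fault are invented.

```python
import numpy as np

def fit_pca(X, n_components):
    """Fit a PCA model on normal-operation data (rows = batches)."""
    mu = X.mean(axis=0)
    # SVD of the centered data gives the principal directions.
    _, _, Vt = np.linalg.svd(X - mu, full_matrices=False)
    P = Vt[:n_components].T          # loadings (variables x components)
    return mu, P

def spe_and_contributions(x, mu, P):
    """Squared prediction error (Q-statistic) of a new batch and the
    per-variable contributions to it (a simple 'fault signature')."""
    xc = x - mu
    residual = xc - P @ (P.T @ xc)   # part not explained by the model
    contributions = residual ** 2    # one contribution per variable
    return contributions.sum(), contributions

# Toy data: 50 normal batches with 4 correlated process variables.
rng = np.random.default_rng(0)
base = rng.normal(size=(50, 1))
X = np.hstack([base, 2 * base, -base, 0.5 * base]) \
    + 0.01 * rng.normal(size=(50, 4))
mu, P = fit_pca(X, n_components=1)

normal_q, _ = spe_and_contributions(X[0], mu, P)
faulty = X[0].copy()
faulty[2] += 5.0                     # inject a fault in variable 2
faulty_q, contrib = spe_and_contributions(faulty, mu, P)
```

Here `faulty_q` exceeds `normal_q`, flagging the abnormal batch, and the largest entry of `contrib` points at the faulty variable; a classifier trained on such signatures would then estimate the quality outcome.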

Relevance:

30.00%

Publisher:

Abstract:

Since 2008, the intelligence units of six states in the western part of Switzerland have shared a common database for the analysis of high-volume crime. On a daily basis, events reported to the police are analysed, filtered and classified to detect crime repetitions and interpret the crime environment. Several forensic outcomes are integrated into the system, such as matches of traces with persons, and links between scenes detected by the comparison of forensic case data. Systematic procedures have been established to integrate links inferred mainly through DNA profiles, shoemark patterns and images. A statistical review of a retrospective dataset of series from 2009 to 2011 in the database shows, for instance, the number of repetitions detected or confirmed and augmented by forensic case data. The time needed to obtain forensic intelligence, with regard to the type of marks treated, is seen as a critical issue. Furthermore, the underlying process of integrating forensic intelligence into the crime intelligence database raised several difficulties regarding the acquisition of data and the models used in the forensic databases. The solutions found and the operational procedures adopted are described and discussed. This process forms the basis for many other research efforts aimed at developing forensic intelligence models.

Relevance:

30.00%

Publisher:

Abstract:

Gene-on-gene regulations are key components of every living organism. Dynamical abstract models of genetic regulatory networks help explain the genome's evolvability and robustness. These properties can be attributed to the structural topology of the graph formed by genes, as vertices, and regulatory interactions, as edges. Moreover, the actual interactions of each gene are believed to play a key role in the stability of the structure. With advances in biology, some effort has been made to develop update functions in Boolean models that incorporate recent knowledge. We combine real-life gene interaction networks with novel update functions in a Boolean model. We use two sub-networks of biological organisms, the yeast cell cycle and the mouse embryonic stem cell, as topological support for our system. On these structures, we substitute the original random update functions with a novel threshold-based dynamic function in which the promoting and repressing effect of each interaction is considered. We use a third real-life regulatory network, along with its inferred Boolean update functions, to validate the proposed update function. The results of this validation hint at the increased biological plausibility of the threshold-based function. To investigate the dynamical behavior of this new model, we visualized the phase transition between order and chaos into the critical regime using Derrida plots. We complement the qualitative nature of Derrida plots with an alternative measure, the criticality distance, which also allows regimes to be discriminated in a quantitative way. Simulations on both real-life genetic regulatory networks show that there exists a set of parameters that allows the systems to operate in the critical region. This new model includes experimentally derived biological information and recent discoveries, which makes it potentially useful for guiding experimental research. The update function confers additional realism to the model, while reducing the complexity and solution space, thus making it easier to investigate.
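A threshold-based Boolean update of the kind described can be sketched as below. The 3-gene wiring and the exact tie-breaking rule (a gene keeps its state when its summed input is zero) are illustrative assumptions standing in for the paper's actual yeast and mouse networks.

```python
import numpy as np

def threshold_update(state, W):
    """Synchronous threshold update: gene i turns on if the summed input
    from its active regulators (+1 promoting, -1 repressing) is positive,
    turns off if negative, and keeps its value if the input is zero."""
    inputs = W @ state
    nxt = state.copy()
    nxt[inputs > 0] = 1
    nxt[inputs < 0] = 0
    return nxt

# Hypothetical 3-gene network: g0 promotes g1, g1 promotes g2, g2 represses g0.
# W[i, j] is the signed effect of regulator j on target i.
W = np.array([[ 0, 0, -1],
              [ 1, 0,  0],
              [ 0, 1,  0]])

state = np.array([1, 0, 0])
for _ in range(6):               # iterate the synchronous dynamics
    state = threshold_update(state, W)
# The trajectory settles on the fixed point [0, 1, 1].
```

Derrida plots and the criticality distance would then be computed by perturbing such states and measuring how the Hamming distance between trajectories evolves.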

Relevance:

30.00%

Publisher:

Abstract:

Quantitative or algorithmic trading is the automation of investment decisions obeying a fixed or dynamic set of rules to determine trading orders. It has increasingly made its way up to 70% of the trading volume of some of the biggest financial markets, such as the New York Stock Exchange (NYSE). However, there is not a significant amount of academic literature devoted to it, owing to the private nature of investment banks and hedge funds. This project aims to review the literature and discuss the available models in a subject where publications are scarce and infrequent. We review the basic and fundamental mathematical concepts needed for modelling financial markets, such as stochastic processes, stochastic integration, and basic models for price and spread dynamics necessary for building quantitative strategies. We also contrast these models with real market data sampled at minute frequency from the Dow Jones Industrial Average (DJIA). Quantitative strategies try to exploit two types of behaviour: trend following or mean reversion. The former is grouped in the so-called technical models and the latter in so-called pairs trading. Technical models have been discarded by financial theoreticians, but we show that they can be properly cast as a well-defined scientific predictor if the signal they generate passes the test of being a Markov time. That is, we can tell whether the signal has occurred or not by examining the information up to the current time; or, more technically, if the event is F_t-measurable. On the other hand, the concept of pairs trading, or a market-neutral strategy, is fairly simple. However, it can be cast in a variety of mathematical models, ranging from a method based on a simple Euclidean distance, through a co-integration framework, to models involving stochastic differential equations such as the well-known mean-reverting Ornstein-Uhlenbeck equation and its variations.
A model for forecasting any economic or financial magnitude can be properly defined with scientific rigour but still lack any economic value and be considered useless from a practical point of view. This is why this project could not be complete without a backtest of the strategies mentioned. Conducting a useful and realistic backtest is by no means a trivial exercise, since the "laws" that govern financial markets are constantly evolving in time. This is why we place emphasis on the calibration process of the strategies' parameters to adapt them to the given market conditions. We find that the parameters of technical models are more volatile than their counterparts from market-neutral strategies, and that calibration must be done at a high sampling frequency to constantly track the current market situation. As a whole, the goal of this project is to provide an overview of a quantitative approach to investment, reviewing basic strategies and illustrating them by means of a backtest with real financial market data. The sources of the data used in this project are Bloomberg for intraday time series and Yahoo! for daily prices. All numerical computations and graphics used and shown in this project were implemented from scratch in MATLAB as part of this thesis; no other mathematical or statistical software was used.
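The mean-reverting Ornstein-Uhlenbeck dynamics mentioned above can be illustrated with a short simulation. The thesis itself worked in MATLAB, so this Python sketch, with invented parameters and a deliberately naive band rule, is only a stand-in for a pairs-trading spread model.

```python
import numpy as np

# Hypothetical parameters: mean-reversion speed, long-run mean, volatility.
kappa, theta, sigma = 2.0, 0.0, 0.3
dt, n = 1.0 / 252, 5000

rng = np.random.default_rng(42)
x = np.empty(n)
x[0] = 1.0                            # start away from the long-run mean
a = np.exp(-kappa * dt)               # exact AR(1) discretisation of the OU SDE
s = sigma * np.sqrt((1 - a**2) / (2 * kappa))
for t in range(1, n):
    x[t] = theta + a * (x[t - 1] - theta) + s * rng.standard_normal()

# Naive band rule on the spread: short above +z, long below -z, flat otherwise.
z = 2 * x.std()
position = np.where(x > z, -1, np.where(x < -z, 1, 0))
```

The exact discretisation avoids the step-size bias of a plain Euler scheme; a realistic backtest would add transaction costs and the rolling recalibration of `kappa`, `theta` and `sigma` that the abstract emphasises.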

Relevance:

30.00%

Publisher:

Abstract:

In this paper we propose a parsimonious regime-switching approach to model the correlations between assets, the threshold conditional correlation (TCC) model. This method allows the dynamics of the correlations to change from one state (or regime) to another as a function of observable transition variables. Our model is similar in spirit to Silvennoinen and Teräsvirta (2009) and Pelletier (2006), but with the appealing feature that it does not suffer from the curse of dimensionality. In particular, estimation of the parameters of the TCC involves a simple grid search procedure. In addition, it is easy to guarantee a positive definite correlation matrix, because the TCC estimator is given by the sample correlation matrix, which is positive definite by construction. The methodology is illustrated by evaluating the behaviour of international equities, government bonds and major exchange rates, first separately and then jointly. We also test and allow for different parts of the correlation matrix to be governed by different transition variables. For this, we estimate a multi-threshold TCC specification. Further, we evaluate the economic performance of the TCC model against a constant conditional correlation (CCC) estimator using a Diebold-Mariano type test. We conclude that threshold correlation modelling gives rise to a significant reduction in portfolio variance.
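A toy version of the grid-search estimation described for the TCC model might look as follows. The Gaussian log-likelihood criterion, the data-generating process and the grid are illustrative assumptions, not the authors' specification; the key features it shares with the abstract are the split by an observable transition variable and the use of regime sample correlation matrices (positive definite by construction).

```python
import numpy as np

def tcc_fit(z, s, grid):
    """Grid search over candidate thresholds c. z: standardised returns
    (T x N); s: observable transition variable (length T). Each c splits
    the sample into two regimes, each estimated by its sample correlation
    matrix; c is chosen by Gaussian log-likelihood."""
    best_c, best_ll = None, -np.inf
    for c in grid:
        ll = 0.0
        for mask in (s <= c, s > c):
            if mask.sum() < z.shape[1] + 1:   # need enough points per regime
                break
            R = np.corrcoef(z[mask].T)
            Rinv = np.linalg.inv(R)
            _, logdet = np.linalg.slogdet(R)
            ll += -0.5 * (mask.sum() * logdet
                          + np.einsum('ti,ij,tj->', z[mask], Rinv, z[mask]))
        else:
            if ll > best_ll:
                best_c, best_ll = c, ll
    return best_c

# Toy data: the correlation flips sign when the transition variable crosses 0.
rng = np.random.default_rng(1)
s = rng.normal(size=4000)
eps = rng.normal(size=(4000, 2))
rho = np.where(s > 0, 0.8, -0.8)
z = np.stack([eps[:, 0],
              rho * eps[:, 0] + np.sqrt(1 - rho**2) * eps[:, 1]], axis=1)

c_hat = tcc_fit(z, s, grid=np.linspace(-1, 1, 21))   # recovers c near 0
```

Because the regime estimators are closed-form sample correlations, the whole search is a single loop over the grid, which is what keeps the procedure free of the curse of dimensionality.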

Relevance:

30.00%

Publisher:

Abstract:

Food intake increases to a varying extent during pregnancy to provide extra energy for the growing fetus. Measuring the respiratory quotient (RQ) during the course of pregnancy (by quantifying O2 consumption and CO2 production with indirect calorimetry) could potentially be useful, since it gives an insight into the evolution of the proportion of carbohydrate vs. fat oxidized during pregnancy and thus allows recommendations on macronutrients for achieving a balanced (or slightly positive) substrate intake. A systematic search of the literature for papers reporting RQ changes during normal pregnancy identified 10 papers reporting original research. The existing evidence supports an increased RQ of varying magnitude in the third trimester of pregnancy, while the discrepant results reported for the first and second trimesters (i.e. no increase in RQ), explained by limited statistical power (small sample sizes) or fragmentary data, preclude safe conclusions about the evolution of RQ during early pregnancy. From a clinical point of view, measuring RQ during pregnancy requires sophisticated and costly indirect calorimeters and appears of limited value outside pure research projects, because of several confounding variables: (1) spontaneous changes in food intake and food composition during the course of pregnancy (which influence RQ); (2) inter-individual differences in weight gain and in the composition of tissue growth; (3) technical factors, notwithstanding the relatively small contribution of fetal metabolism per se (RQ close to 1.0) to the overall metabolism of the pregnant mother.
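The quantity reviewed above is simple to state: RQ is the ratio of CO2 produced to O2 consumed, as measured by indirect calorimetry. The gas-exchange values in this sketch are purely illustrative.

```python
def respiratory_quotient(vco2, vo2):
    """RQ = CO2 produced / O2 consumed (same units, e.g. L/min).
    RQ near 0.7 indicates predominantly fat oxidation; RQ near 1.0
    indicates predominantly carbohydrate oxidation."""
    return vco2 / vo2

# Illustrative measurement: a mixed carbohydrate/fat oxidation profile.
rq = respiratory_quotient(vco2=0.24, vo2=0.30)   # 0.8, between 0.7 and 1.0
```

Tracking how this ratio drifts across trimesters is what the reviewed papers attempt, and why the confounders listed above (diet changes, tissue composition, fetal RQ close to 1.0) complicate interpretation.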

Relevance:

30.00%

Publisher:

Abstract:

Choice deferral due to information overload is an undesirable result of competitive environments. Neoclassical maximization models predict that choice avoidance will not increase as more information is offered to consumers. The theories developed in the consumer behavior field predict that some properties of the environment may lead to behavioral effects and an increase in choice avoidance due to information overload. Based on stimuli generated experimentally and tested among 1,000 consumers, this empirical research provides evidence for the presence of behavioral effects due to information overload and reveals the different effects of increasing the number of options versus the number of attributes. This study also finds that the need for cognition moderates these behavioral effects, and it proposes psychological processes that may trigger the effects observed.

Relevance:

30.00%

Publisher:

Abstract:

BACKGROUND: The Internet is increasingly used as a source of information on mental health issues. The burden of obsessive-compulsive disorder (OCD) may lead persons with diagnosed or undiagnosed OCD, and their relatives, to search for good-quality information on the Web. This study aimed to evaluate the quality of Web-based information on English-language sites dealing with OCD and to compare the quality of websites found through a general and a medically specialized search engine. METHODS: Keywords related to OCD were entered into Google and OmniMedicalSearch. Websites were assessed on the basis of accountability, interactivity, readability, and content quality. The "Health on the Net" (HON) quality label and the Brief DISCERN scale score were used as possible content quality indicators. Of the 235 links identified, 53 websites were analyzed. RESULTS: The content quality of the OCD websites examined was relatively good. The use of a specialized search engine did not offer an advantage in finding websites with better content quality. A score ≥16 on the Brief DISCERN scale is associated with better content quality. CONCLUSION: This study shows that the content quality of OCD websites is acceptable. There is no advantage in searching for information with a specialized search engine rather than a general one. PRACTICAL IMPLICATIONS: The Internet offers a number of high-quality OCD websites. It remains critical, however, to have a provider-patient talk about the information found on the Web.