904 results for Context data


Relevance:

30.00%

Publisher:

Abstract:

CONTEXT AND OBJECTIVE: Epidemiology may help educators face the challenge of establishing content guidelines for medical school curricula. The aim was to develop learning objectives for a medical curriculum from an epidemiology database. DESIGN AND SETTING: Descriptive study assessing morbidity and mortality data, conducted in a private university in São Paulo. METHODS: An epidemiology database was used, with mortality and morbidity recorded as summaries of deaths and as the World Health Organization's Disability-Adjusted Life Years (DALYs). The scoring took into consideration the probabilities of mortality and morbidity. RESULTS: The scoring yielded a classification of health conditions to be used by a curriculum design committee, based on its highest and lowest quartiles, which corresponded respectively to the highest and lowest impact on morbidity and mortality. Data from three countries were used for international comparison and showed distinct results. The resulting scores indicated topics to be developed through educational taxonomy. CONCLUSION: The frequencies of the health conditions and their statistical treatment made it possible to identify topics that should be fully developed within medical education. The classification also suggested a boundary between topics that should be developed in depth, encompassing knowledge and the development of skills and attitudes, and topics that can be presented concisely at the level of knowledge.
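
As a rough illustration of the quartile-based scoring the abstract describes, the sketch below ranks a handful of health conditions by a naive combined mortality/morbidity score and splits them at the quartiles; the condition names, figures, and weighting are invented, not the study's.

```python
# Hypothetical sketch: classify health conditions into curriculum-depth tiers
# by quartiles of a combined mortality/morbidity score. All figures invented.
import numpy as np

conditions = {
    # condition: (deaths per 100k, DALYs per 100k) -- made-up figures
    "ischaemic heart disease": (85.0, 1400.0),
    "road injuries": (20.0, 900.0),
    "asthma": (2.0, 300.0),
    "tetanus": (0.1, 5.0),
}

names = list(conditions)
scores = np.array([m + d for m, d in conditions.values()])  # naive combined score

q1, q3 = np.percentile(scores, [25, 75])
for name, s in zip(names, scores):
    if s >= q3:
        depth = "develop in depth (knowledge, skills and attitudes)"
    elif s <= q1:
        depth = "present concisely (knowledge level only)"
    else:
        depth = "intermediate coverage"
    print(f"{name}: score={s:.1f} -> {depth}")
```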

Relevance:

30.00%

Publisher:

Abstract:

With the increasing production of information from e-government initiatives, there is also the need to transform a large volume of unstructured data into useful information for society. All this information should be easily accessible and made available in a meaningful and effective way in order to achieve semantic interoperability in electronic government services, a challenge pursued by governments around the world. Our aim is to discuss the context of e-Government Big Data and to present a framework that promotes semantic interoperability through the automatic generation of ontologies from unstructured information found on the Internet. We propose the use of fuzzy mechanisms to deal with natural-language terms and review related work in this area. The results of this study consist of the architectural definition, together with the major components and requirements that compose the proposed framework. With this, it is possible to take advantage of the large volume of information generated by e-Government initiatives and use it to benefit society.
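
A minimal sketch of the kind of fuzzy term matching such a framework would rely on when mapping free-text terms to ontology concepts; it uses only the Python standard library, and the concept list and threshold are illustrative assumptions, not part of the proposed framework.

```python
# Toy fuzzy matching of a natural-language term to ontology concepts.
# Concepts and threshold are invented for illustration.
from difflib import SequenceMatcher

ontology_concepts = ["tax payment", "vehicle registration", "building permit"]

def best_concept(term, concepts, threshold=0.6):
    """Return the concept most similar to `term`, or None below threshold."""
    scored = [(SequenceMatcher(None, term.lower(), c).ratio(), c) for c in concepts]
    score, concept = max(scored)
    return concept if score >= threshold else None

print(best_concept("taxes payments", ontology_concepts))  # -> 'tax payment'
```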

Relevance:

30.00%

Publisher:

Abstract:

This thesis deals with context-aware services, smart environments, context management, and solutions for device and service interoperability. Multi-vendor devices offer an increasing number of services and end-user applications that base their value on the ability to exploit information originating from the surrounding environment by means of a growing number of embedded sensors, e.g. GPS, compass, RFID readers, cameras, and so on. However, such devices are usually not able to exchange information because they lack a shared data storage and common information-exchange methods. A large number of standards and domain-specific building blocks are available and heavily used in today's products, but the use of these ready-to-use modules is not without problems: the integration and cooperation of different kinds of modules can be daunting because of growing complexity and dependency. In such scenarios it is interesting to have an infrastructure that makes the coexistence of multi-vendor devices easy, while enabling low-cost development and smooth access to services. This sort of technology glue should reduce both software and hardware integration costs by removing the trouble of interoperability, leading to faster and simpler design, development, and deployment of cross-domain applications. This thesis is mainly focused on software architectures supporting context-aware service providers, especially on the following subjects:
- service adaptation based on user preferences
- context management
- content management
- information interoperability
- multi-vendor device interoperability
- communication and connectivity interoperability
Experimental activities were carried out in several domains, including cultural heritage and indoor and personal smart spaces, all of which are considered significant test-beds in context-aware computing. The work evolved within European and national projects: on the European side, I carried out my research activity within EPOCH, the FP6 Network of Excellence on "Processing Open Cultural Heritage", and within SOFIA, a project of the ARTEMIS JU on embedded systems, working in cooperation with several international establishments, including the University of Kent, VTT (the Technical Research Centre of Finland), and Eurotech. On the national side, I contributed to a one-to-one research contract between ARCES and Telecom Italia. The first part of the thesis covers the problem statement and related work, addressing interoperability issues and the related architecture components. The second part focuses on specific architectures and frameworks:
- MobiComp: a context management framework that I used in cultural heritage applications
- CAB: a context-, preference- and profile-based application broker, which I designed within the EPOCH Network of Excellence
- M3: a Semantic Web based information-sharing infrastructure for smart spaces, designed by Nokia within the European project SOFIA
- NoTA: a service- and transport-independent connectivity framework
- OSGi: the well-known Java-based service support framework
The final section is dedicated to the middleware, tools, and software agents developed during my doctorate to support context-aware services in smart environments.
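
To make the shared-context idea concrete, here is a toy Python sketch of a publish/subscribe context store of the kind frameworks such as MobiComp provide; the API below is invented for illustration and is not MobiComp's own.

```python
# Toy shared context store: producers publish context items under a key,
# consumers subscribe to changes. Invented API, for illustration only.
from collections import defaultdict

class ContextStore:
    def __init__(self):
        self._items = {}
        self._subscribers = defaultdict(list)

    def publish(self, key, value):
        self._items[key] = value
        for callback in self._subscribers[key]:
            callback(key, value)

    def subscribe(self, key, callback):
        self._subscribers[key].append(callback)

store = ContextStore()
store.subscribe("user/location", lambda k, v: print(f"{k} -> {v}"))
store.publish("user/location", (44.49, 11.34))  # e.g. a GPS fix from one device
```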

Relevance:

30.00%

Publisher:

Abstract:

The Gaia space mission is a major project for the European astronomical community. As challenging as it is, the processing and analysis of the huge data flow coming from Gaia is the subject of thorough study and preparatory work by the DPAC (Data Processing and Analysis Consortium), in charge of all aspects of the Gaia data reduction. This PhD thesis was carried out in the framework of the DPAC, within the team based in Bologna. The task of the Bologna team is to define the calibration model and to build a grid of spectrophotometric standard stars (SPSS) suitable for the absolute flux calibration of the Gaia G-band photometry and the BP/RP spectrophotometry. Such a flux calibration can be performed by repeatedly observing each SPSS during the lifetime of the Gaia mission and comparing the observed Gaia spectra with the spectra obtained by our ground-based observations. Because of the different observing sites involved and the huge number of frames expected (≃100000), it is essential to maintain maximum homogeneity in data quality, acquisition, and treatment; particular care must be taken to test the capabilities of each telescope/instrument combination (through the "instrument familiarization plan") and to devise methods to keep under control, and where necessary correct for, the typical instrumental effects that can affect the high precision required for the Gaia SPSS grid (a few % with respect to Vega). I contributed to the ground-based survey of Gaia SPSS in many respects: the observations, the instrument familiarization plan, the data reduction and analysis activities (both photometry and spectroscopy), and the maintenance of the data archives. The field I was personally responsible for, however, was photometry, and in particular relative photometry for the production of short-term light curves. In this context I defined and tested a semi-automated pipeline for the pre-reduction of SPSS imaging data and the production of aperture-photometry catalogues ready for further analysis. A series of semi-automated quality-control criteria are included in the pipeline at various levels, from pre-reduction to aperture photometry to light-curve production and analysis.
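
A minimal sketch of the relative-photometry step behind short-term light curves: the target's instrumental flux is divided by the summed flux of comparison stars in the same frame, cancelling frame-to-frame transparency and airmass variations. All values are illustrative; the actual pipeline is far more elaborate.

```python
# Toy differential photometry: target flux / sum of comparison-star fluxes,
# converted to a zero-centred magnitude light curve. Values invented.
import numpy as np

target_flux = np.array([10500., 10420., 10610., 10380., 10550.])  # ADU per frame
comp_flux = np.array([[52000., 30900.],   # comparison stars, one row per frame
                      [51600., 30700.],
                      [52500., 31200.],
                      [51400., 30500.],
                      [52200., 31000.]])

ratio = target_flux / comp_flux.sum(axis=1)   # relative flux
mag = -2.5 * np.log10(ratio)                  # differential magnitude
mag -= mag.mean()                             # zero-centred light curve
print(np.round(mag, 4))
```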

Relevance:

30.00%

Publisher:

Abstract:

The Ecosystem Approach to Fisheries represents the most recent research line in the international context, with interest both in the community as a whole and in the identification and protection of all the "critical habitats" in which marine resources complete their life cycles. Using data from trawl surveys performed in the northern and central Adriatic from 1996 to 2010, this study provides the first attempt to appraise the status of the whole demersal community. It took into account not only fishery target species but also by-catch and discard species, through a suite of biological indicators at both the population and the multi-species level, allowing a global picture of the status of the demersal system. The study underlined the recent decline of species of great importance for the Adriatic fishery; an adverse impact on catches is expected for these species in the coming years, since minimum values of recruitment were also recorded recently. Both excessive exploitation and environmental factors affected the availability of resources. Moreover, both the distribution and the nursery areas of the most important resources were pinpointed by means of geostatistical methods. The geospatial analysis confirmed the presence of relevant recruitment areas in the northern and central Adriatic for several commercial species, as reported in the literature. The morphological and oceanographic features, the substantial river inflow, and the mosaic pattern of biocoenoses with different food availability affected the location of the observed nursery areas.
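
As a simple stand-in for the geostatistical mapping, the sketch below interpolates recruit density from a few trawl hauls onto a query point with inverse distance weighting; the study itself used proper geostatistical methods, and the coordinates and densities here are invented.

```python
# Toy inverse-distance-weighting interpolation of recruit density from
# trawl-haul positions onto an arbitrary grid point. Data invented.
import numpy as np

haul_xy = np.array([[12.5, 44.2], [13.1, 44.6], [12.9, 45.0]])  # lon, lat
recruits = np.array([850.0, 120.0, 430.0])  # recruits per km^2 at each haul

def idw(point, xy, values, power=2.0):
    d = np.linalg.norm(xy - point, axis=1)
    if np.any(d == 0):                 # query coincides with a haul
        return float(values[d == 0][0])
    w = 1.0 / d**power
    return float(np.sum(w * values) / np.sum(w))

print(idw(np.array([12.8, 44.5]), haul_xy, recruits))
```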

Relevance:

30.00%

Publisher:

Abstract:

The aim of this thesis is to apply multilevel regression models in the context of household surveys, where the hierarchical structure is characterized by many small groups. In recent years, comparative and multilevel analyses of perceived health have grown considerably in number. The purpose of this thesis is to develop a multilevel analysis with three levels of hierarchy for the Physical Component Summary outcome, in order to: evaluate the magnitude of the within- and between-group variance at each level (individual, household, and municipality); explore which covariates affect perceived physical health at each level; compare model-based and design-based approaches, in order to establish the informativeness of the sampling design; and estimate a quantile regression for hierarchical data. The target population is Italian residents aged 18 years and older. Our study shows a high degree of homogeneity among level-1 units belonging to the same group, with an intraclass correlation of 27% in a level-2 null model. Almost all of this variance is explained by level-1 covariates; in our model, the explanatory variables with the greatest impact on the outcome are disability, inability to work, age, and chronic diseases (18 pathologies). An additional analysis was performed using a novel estimation procedure, the Linear Quantile Mixed Model, here termed Multilevel Linear Quantile Regression. This makes it possible to describe the conditional distribution of the response more generally, through the estimation of its quantiles, while accounting for the dependence among observations, which represents a great advantage of these models over classic multilevel regression. Median regression with random effects proves more efficient than mean regression in representing the central tendency of the outcome, and a more detailed analysis of the conditional distribution of the response at other quantiles highlighted a differential effect of some covariates along the distribution.
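
A hedged sketch of the modelling setup, reduced to two levels (the thesis fits three: individual, household, and municipality), using statsmodels' MixedLM on synthetic data; the variable names and figures are invented, and the ICC is computed from the null model as described above.

```python
# Two-level mixed model on synthetic data standing in for the survey:
# null model for the ICC, then a model with individual-level covariates.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
df = pd.DataFrame({
    "pcs": rng.normal(50, 10, 300),          # Physical Component Summary
    "age": rng.integers(18, 90, 300),
    "disability": rng.integers(0, 2, 300),
    "household": rng.integers(0, 60, 300),   # grouping factor
})

null = smf.mixedlm("pcs ~ 1", df, groups=df["household"]).fit()
icc = null.cov_re.iloc[0, 0] / (null.cov_re.iloc[0, 0] + null.scale)
print(f"ICC (household level): {icc:.2f}")

full = smf.mixedlm("pcs ~ age + disability", df, groups=df["household"]).fit()
print(full.params)
```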

Relevance:

30.00%

Publisher:

Abstract:

This thesis investigates context-aware wireless networks, capable of adapting their behavior to the context and the application thanks to the ability to combine communication, sensing, and localization. Problems of signal demodulation, parameter estimation, and localization are addressed through analytical methods, simulations, and experimentation, for the derivation of fundamental limits, the performance characterization of the proposed schemes, and experimental validation. Ultrawide-bandwidth (UWB) signals are considered in certain cases, and non-coherent receivers, which allow the exploitation of multipath channel diversity without adopting complex architectures, are investigated. Closed-form expressions for the achievable bit error probability of the novel proposed architectures are derived. The problem of time-delay estimation (TDE), which enables network localization through ranging measurements, is addressed from a theoretical point of view. New fundamental bounds on TDE are derived for the case in which the received signal is partially known or unknown at the receiver, as often occurs due to propagation or to the adoption of low-complexity estimators. Practical estimators, such as energy-based estimators, are revised and their performance compared with the new bounds. The localization issue is addressed experimentally for the characterization of cooperative networks, and practical algorithms able to improve accuracy in non-line-of-sight (NLOS) channel conditions are evaluated on measured data. With the purpose of enhancing localization coverage in NLOS conditions, non-regenerative relaying techniques for localization are introduced and ad hoc position estimators are devised. An example of a context-aware network is given with the study of a UWB-RFID system for detecting and locating semi-passive tags; in particular, a deep investigation of low-complexity receivers able to deal with multi-tag interference, synchronization mismatches, and clock drift is presented. Finally, theoretical bounds on the localization accuracy of this and other passive localization networks (e.g., radar) are derived, also accounting for different configurations such as monostatic and multistatic networks.
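
A toy version of the energy-based time-delay estimators mentioned above: the received signal is split into short integration windows, and the first window whose energy crosses a noise-based threshold gives the time-of-arrival estimate. The signal, window length, and threshold rule are illustrative assumptions.

```python
# Toy energy-detector TOA estimation on a synthetic noisy pulse.
import numpy as np

fs = 1e9                          # sample rate (1 GHz), assumed
rng = np.random.default_rng(1)
signal = rng.normal(0, 0.1, 2000) # noise-only baseline
signal[800:820] += 1.0            # crude "pulse" arriving at sample 800

win = 50                                     # samples per energy window
energy = (signal**2).reshape(-1, win).sum(axis=1)
threshold = energy[:4].mean() * 3.0          # threshold from noise-only windows
first = np.argmax(energy > threshold)        # first window crossing it
toa_est = first * win / fs
print(f"estimated TOA: {toa_est*1e9:.0f} ns (true: {800/fs*1e9:.0f} ns)")
```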

Relevance:

30.00%

Publisher:

Abstract:

In recent years, the use of reverse engineering systems has attracted considerable interest for a wide range of applications, and many research activities focus on the accuracy and precision of the acquired data and on improvements to the post-processing phase. In this context, this PhD thesis defines two novel methods for data post-processing and for data fusion between physical and geometrical information. In particular, a technique has been defined for characterizing the error in the 3D point coordinates acquired by an optical triangulation laser scanner, with the aim of identifying adequate correction arrays to apply under different acquisition parameters and operating conditions. The systematic error in the acquired data is thus compensated, increasing accuracy. Moreover, the definition of a 3D thermogram is examined: the geometrical information of an object and its thermal properties, coming from a thermographic inspection, are combined so as to associate a temperature value with each recognizable point. The data acquired by the optical triangulation laser scanner are also used to normalize the temperature values and make the thermal data independent of the thermal camera's point of view.
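
A minimal sketch of the data-fusion step for a 3D thermogram: each scanner point is projected into the thermal image with a pinhole camera model, and the temperature at that pixel is attached to the point. The camera intrinsics, thermogram, and points below are invented.

```python
# Toy 3D thermogram: project scanner points into a fake thermal image
# and attach the sampled temperature to each point. Calibration invented.
import numpy as np

K = np.array([[400., 0., 160.],   # assumed thermal-camera intrinsics
              [0., 400., 120.],
              [0., 0., 1.]])
thermal = np.full((240, 320), 20.0)       # fake thermogram, 20 degC ambient
thermal[100:140, 140:180] = 65.0          # a hot region

points = np.array([[0.05, 0.02, 1.0],     # 3D points in camera frame (m)
                   [-0.10, 0.08, 1.2]])

uvw = (K @ points.T).T                       # pinhole projection
uv = (uvw[:, :2] / uvw[:, 2:3]).astype(int)  # pixel coordinates (u, v)
temps = thermal[uv[:, 1], uv[:, 0]]          # temperature per point
print(np.c_[points, temps])                  # a minimal "3D thermogram"
```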

Relevance:

30.00%

Publisher:

Abstract:

Modern embedded systems embrace many-core shared-memory designs. Due to constrained power and area budgets, most of them feature software-managed scratchpad memories instead of data caches to increase data locality. It is therefore the programmer's responsibility to explicitly manage the memory transfers, which makes programming these platforms cumbersome. Moreover, complex modern applications must be adequately parallelized before they can turn the parallel potential of the platform into actual performance. To support this, programming languages were proposed that work at a high level of abstraction and rely on a runtime whose cost hinders performance, especially in embedded systems, where resources and power budgets are constrained. This dissertation explores the applicability of the shared-memory paradigm to modern many-core systems, focusing on ease of programming. It concentrates on OpenMP, the de facto standard for shared-memory programming. In the first part, the costs of algorithms for synchronization and data partitioning are analyzed, and the algorithms are adapted to modern embedded many-cores; the original design of an OpenMP runtime library is then presented, which supports complex forms of parallelism such as multi-level and irregular parallelism. The second part of the thesis focuses on heterogeneous systems, where hardware accelerators are coupled to (many-)cores to implement key functional kernels with orders-of-magnitude speedups and energy-efficiency gains compared to the "pure software" version. However, three main issues arise, namely i) platform design complexity, ii) architectural scalability, and iii) programmability. To tackle them, a template for a generic hardware processing unit (HWPU) is proposed, which shares the memory banks with the cores, and a template for a scalable architecture is shown, which integrates the HWPUs through the shared-memory system. A full software stack and toolchain are then developed to support platform design and to let programmers exploit the accelerators of the platform; the OpenMP front-end is extended to interact with them.
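
As a language-agnostic illustration of static data partitioning (what OpenMP calls schedule(static)), the sketch below splits an iteration space into near-equal contiguous chunks, one per worker, with no runtime locking; it is a sketch of the concept, not the dissertation's runtime.

```python
# Static partitioning of n_iters loop iterations across n_workers,
# as in OpenMP's schedule(static): contiguous, near-equal chunks.
def static_chunks(n_iters, n_workers):
    """Yield (start, end) iteration ranges, one per worker."""
    base, extra = divmod(n_iters, n_workers)
    start = 0
    for w in range(n_workers):
        end = start + base + (1 if w < extra else 0)
        yield start, end
        start = end

print(list(static_chunks(10, 4)))  # -> [(0, 3), (3, 6), (6, 8), (8, 10)]
```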

Relevance:

30.00%

Publisher:

Abstract:

In the era of the Internet of Everything, a user with a handheld or wearable device equipped with sensing capabilities has become a producer as well as a consumer of information and services. The more powerful these devices get, the more likely it is that they will generate and share content locally, leading to the presence of distributed information sources and a diminishing role for centralized servers. In current practice, we rely on infrastructure acting as an intermediary that provides access to the data. However, infrastructure-based connectivity might not always be available, or might not be the best alternative. Moreover, it is often the case that the data, and the processes acting upon them, are of local scope: queries about a nearby object, an information source, a process, an experience, an ability, and so on could be answered locally, without reliance on infrastructure-based platforms. The data might have limited temporal validity and be bound to a geographical area and/or to the social context in which the user is immersed. In this envisioned scenario, users could interact locally without the need for a central authority; hence the claim of an infrastructure-less, provider-less platform. The data are owned by the users and consulted locally, as opposed to the current approach of making them available globally and keeping them forever. From a technical viewpoint, this network resembles a Delay/Disruption Tolerant Network, where consumers and producers might be spatially and temporally decoupled, exchanging information with each other in an ad hoc fashion. To this end, we propose novel data-gathering and dissemination strategies for use in urban-wide environments that do not rely on strict infrastructure mediation. While preserving the general aspects of our study and without loss of generality, we focus our attention on practical application scenarios that help us capture the characteristics of opportunistic communication networks.
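
A toy sketch of contact-based (epidemic) dissemination, one of the simplest strategies in this family: whenever two nodes meet, a carrier hands the data item over, so content spreads without any infrastructure. The contact trace is invented, and the strategies proposed in the thesis are more selective than this.

```python
# Epidemic spread of a locally produced item over an ordered contact trace,
# the basic mechanism of a Delay/Disruption Tolerant Network. Trace invented.
contacts = [("a", "b"), ("b", "c"), ("d", "e"), ("c", "d")]  # meetings in order

have_item = {"a"}                  # node "a" produces the item
for u, v in contacts:
    if u in have_item or v in have_item:
        have_item |= {u, v}        # opportunistic exchange on contact

print(sorted(have_item))           # -> ['a', 'b', 'c', 'd']; 'e' never reached
```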

Relevance:

30.00%

Publisher:

Abstract:

This work deals with the synthesis and characterization of polymers with redox-functional phenothiazine side chains. Phenothiazine and its derivatives are small redox units whose reversible redox behavior is associated with electrochromic properties. A particular feature of phenothiazines is the formation of stable radical cations in the oxidized state. Phenothiazines can therefore act as bistable molecules, switching between two stable redox states, and this switching process is accompanied by a color change.

This thesis describes the synthesis of novel phenothiazine polymers by radical polymerization. Phenothiazine derivatives were covalently attached to aliphatic and aromatic polymer chains via two different synthetic routes. The first route uses vinyl monomers with phenothiazine functionality for direct polymerization; the second uses amine-modified phenothiazine derivatives to functionalize polymers bearing active-ester side chains in a polymer-analogous reaction.

Owing to their electron-donor properties, polymers with redox-functional phenothiazine side chains are suitable candidates for use as cathode materials. To verify their suitability, phenothiazine polymers were employed as electrode materials in lithium battery cells. The polymers showed good capacities of about 50-90 Ah/kg and fast charging in the battery cell; in particular, the charging rates are 5 to 10 times higher than those of conventional lithium batteries. With regard to the number of charge and discharge cycles, the polymers achieved good results in long-term stability tests, surviving 500 charge cycles with only small changes in the initial charging times and capacities. The long-term stability is directly related to the stability of the radical. Stabilization of the radical cations was achieved by lengthening the side chain between the nitrogen atom of the phenothiazine and the polymer backbone. Such alkyl substitution increases the radical stability through stronger interaction with the aromatic ring and thus improves battery performance in terms of stability over charge and discharge cycles.

Furthermore, the practical application of bistable phenothiazine polymers as a storage medium for high data densities was investigated. Thin films of the polymer on conductive substrates were oxidized electrochemically by means of atomic force microscopy combined with conductive tips. With this technique it was possible to oxidize the polymer surface on the nanoscale and thereby change the local conductivity. Patterns of different sizes could thus be written lithographically and detected through the change in their conductivity. The writing process changed only the local conductivity, without affecting the topography of the polymer film. Moreover, the patterns proved particularly stable, both mechanically and over time.

Finally, new synthetic strategies were developed to produce surfaces that are both mechanically stable and redox-functional. Surface-initiated atom transfer radical polymerization was used to prepare grafted polymer brushes with redox-functional phenothiazine side chains, which were analyzed by X-ray methods and atomic force microscopy. One of the synthetic strategies starts from grafted active-ester brushes, which can subsequently be modified with redox-functional groups. This approach is particularly promising, as it allows different functional groups to be anchored to the active-ester brushes; by using cross-linking groups, the mechanical stability of such polymer films can be optimized alongside their redox properties.

Relevance:

30.00%

Publisher:

Abstract:

In the last 10 years the number of mobile devices has grown rapidly. Each person usually carries at least two personal devices, and researchers say that in the near future this number could rise to as many as ten devices per person. Moreover, these devices are becoming more integrated into our lives than in the past, so the amount of data exchanged grows along with the improvement in people's lifestyles. This is what researchers call the Internet of Things. In the future there will be more than 60 billion nodes, and the current infrastructure is not ready to keep track of all the data exchanges between them. Infrastructure improvements have therefore been proposed in recent years, such as MobileIP and HIP, to facilitate the exchange of packets in mobility, but none of them has been optimized for this purpose. In recent years, researchers at Mid Sweden University created the MediaSense framework. Initially this framework was based on the Chord protocol to route packets in a large network, but the most important change has been the introduction of P-Grids to create the overlay and provide persistence. Thanks to this technology, a lookup in the trie takes at most 0.5*log(N) hops, where N is the total number of nodes in the network. This result could be improved by further optimizations in the management of the nodes, for example the dynamic creation of groups of nodes. Moreover, since the nodes move, underlying support for connectivity management is needed; SCTP has been selected as one of the most promising upcoming standards for managing multiple simultaneous connections.
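
A minimal sketch of a P-Grid-style binary-trie lookup, illustrating where the logarithmic cost quoted above comes from: a query is routed one key bit at a time until a peer responsible for that prefix is found. The peers and keys are invented.

```python
# Toy prefix routing in a binary trie: each peer owns a key prefix,
# and lookups extend the prefix bit by bit. Peers/keys invented.
peers = {"00": "peer-A", "01": "peer-B", "10": "peer-C", "11": "peer-D"}

def route(key_bits, peers):
    """Follow the key prefix bit by bit until a responsible peer is found."""
    prefix = ""
    hops = 0
    while prefix not in peers:
        prefix += key_bits[len(prefix)]
        hops += 1
    return peers[prefix], hops

print(route("101101", peers))  # -> ('peer-C', 2)
```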

Relevance:

30.00%

Publisher:

Abstract:

This thesis was part of a multidisciplinary research project funded by the German Research Foundation ("Bevölkerungsgeschichte des Karpatenbeckens in der Jungsteinzeit und ihr Einfluss auf die Besiedlung Mitteleuropas", grant no. Al 287/10-1) aimed at elucidating the population history of the Carpathian Basin during the Neolithic. The Carpathian Basin was an important waypoint on the spread of the Neolithic from southeastern to central Europe. On the Great Hungarian Plain (Alföld), the first farming communities appeared around 6000 cal BC. They belonged to the Körös culture, which derived from the Starčevo-Körös-Criş complex in the northern Balkans. Around 5600 cal BC the Alföld-Linearbandkeramik (ALBK), so called due to its stylistic similarities with the Transdanubian and central European LBK, emerged in the northwestern Alföld. Following a short "classical phase", the ALBK split into several regional subgroups during its later stages, but did not expand beyond the Great Hungarian Plain. Marking the beginning of the late Neolithic period, the Tisza culture first appeared in the southern Alföld around 5000 cal BC and subsequently spread into the central and northern Alföld. Together with the Herpály and Csőszhalom groups it was an integral part of the late Neolithic cultural landscape of the Alföld. Up until now, the Neolithic cultural succession on the Alföld has been studied almost exclusively from an archaeological point of view, while very little is known about the population genetic processes during this time period. The aim of this thesis was to perform ancient DNA (aDNA) analyses on human samples from the Alföld Neolithic and analyse the resulting mitochondrial population data to address the following questions: is there population continuity between the central European Mesolithic hunter-gatherer metapopulation and the first farming communities on the Alföld? Is there genetic continuity from the early to the late Neolithic? Are there genetic as well as cultural differences between the regional groups of the ALBK? Additionally, the relationships between the Alföld and the neighbouring Transdanubian Neolithic, as well as other European early farming communities, were evaluated to gain insights into the genetic affinities of the Alföld Neolithic in a larger geographic context. 320 individuals were analysed for this study; reproducible mitochondrial haplogroup information (HVS-I and/or SNP data) could be obtained for 242 Neolithic individuals. According to the analyses, population continuity between hunter-gatherers and the Neolithic cultures of the Alföld can be excluded at any stage of the Neolithic. In contrast, there is strong evidence for population continuity from the early to the late Neolithic. All cultural groups on the Alföld were heavily shaped by the genetic substrate introduced into the Carpathian Basin during the early Neolithic by the Körös and Starčevo cultures. Accordingly, genetic differentiation between the regional groups of the ALBK is not very pronounced. The Alföld cultures are furthermore genetically highly similar to the Transdanubian Neolithic cultures, probably due to common ancestry. In the wider European context, the Alföld Neolithic cultures are also highly similar to the central European LBK, while they differ markedly from contemporaneous populations of the Iberian Peninsula and the Ukraine. Thus, the Körös culture, the ALBK, and the Tisza culture can be regarded as part of a "genetic continuum" that links the Neolithic Carpathian Basin to central Europe and likely has its roots in the Starčevo-Körös-Criş complex of the northern Balkans.

Relevance:

30.00%

Publisher:

Abstract:

Obesity is a multifactorial trait and an independent risk factor for cardiovascular disease (CVD). The aim of the current work is to study the complex etiology underlying obesity and to identify genetic variations and/or nutrition-related factors that contribute to its variability. To this end, a set of more than 2300 white subjects who participated in a nutrigenetics study was used. For each subject, a total of 63 factors were recorded, describing genetic variants related to CVD (24 in total), gender, and nutrition (38 in total), e.g. average daily intake of calories and cholesterol. Each subject was categorized according to body mass index (BMI) as normal (BMI ≤ 25) or overweight (BMI > 25). Two artificial neural network (ANN) based methods were designed and used for the analysis of the available data: i) a multi-layer feed-forward ANN combined with a parameter-decreasing method (PDM-ANN), and ii) a multi-layer feed-forward ANN trained by a hybrid method (GA-ANN) that combines genetic algorithms with the popular back-propagation training algorithm.
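
A hedged sketch of the basic classification task, using scikit-learn's off-the-shelf MLPClassifier on synthetic stand-ins for the 63 factors; the study's own PDM-ANN and GA-ANN training schemes are custom and are not reproduced here.

```python
# Feed-forward ANN classifying normal vs. overweight from 63 synthetic
# factors; data and labels are randomly generated stand-ins.
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
X = rng.normal(size=(2300, 63))            # 63 factors per subject
y = (X[:, 0] + rng.normal(size=2300) > 0)  # overweight yes/no, synthetic

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = MLPClassifier(hidden_layer_sizes=(20,), max_iter=500, random_state=0)
clf.fit(X_tr, y_tr)
print(f"held-out accuracy: {clf.score(X_te, y_te):.2f}")
```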

Relevance:

30.00%

Publisher:

Abstract:

The occupant impact velocity (OIV) and the acceleration severity index (ASI) are competing measures of crash severity used to assess occupant injury risk in full-scale crash tests involving roadside safety hardware, e.g. guardrail. Delta-V, the maximum change in vehicle velocity, is the traditional metric of crash severity for real-world crashes. This study compares the ability of the OIV, the ASI, and delta-V to discriminate between serious and non-serious occupant injury in real-world frontal collisions. Vehicle kinematics data from event data recorders (EDRs) were matched with detailed occupant injury information for 180 real-world crashes. Cumulative probability-of-injury risk curves were generated using binary logistic regression for the belted and unbelted data subsets. A comparison of the available fit statistics and a separate ROC curve analysis showed that the more computationally intensive OIV and ASI offer no significant predictive advantage over the simpler delta-V.
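
A minimal sketch of the risk-curve construction described above: binary logistic regression of serious injury on delta-V, then prediction of the cumulative probability of injury along the delta-V range. The crash data below are invented.

```python
# Logistic injury-risk curve: P(serious injury | delta-V). Data invented.
import numpy as np
import statsmodels.api as sm

delta_v = np.array([12., 18., 25., 31., 38., 45., 52., 60.])  # km/h
serious = np.array([0, 0, 0, 1, 0, 1, 1, 1])                  # serious injury?

X = sm.add_constant(delta_v)
fit = sm.Logit(serious, X).fit(disp=0)

grid = sm.add_constant(np.linspace(10, 65, 5))
print(np.round(fit.predict(grid), 2))   # injury risk along the delta-V range
```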