885 results for Context data
Abstract:
In the era of the Internet of Everything, a user with a handheld or wearable device equipped with sensing capability has become a producer as well as a consumer of information and services. The more powerful these devices get, the more likely it is that they will generate and share content locally, leading to the presence of distributed information sources and a diminishing role for centralized servers. Current practice relies on infrastructure acting as an intermediary that provides access to the data. However, infrastructure-based connectivity might not always be available, or might not be the best alternative. Moreover, it is often the case that the data, and the processes acting upon them, are of local scope. Queries about a nearby object, an information source, a process, an experience, an ability, etc. could be answered locally without reliance on infrastructure-based platforms. The data might have limited temporal validity and be bound to a geographical area and/or to the social context in which the user is immersed. In this envisioned scenario users could interact locally without the need for a central authority; hence the claim of an infrastructure-less, provider-less platform. The data are owned by the users and consulted locally, as opposed to the current approach of making them available globally and keeping them online forever. From a technical viewpoint, this network resembles a Delay/Disruption Tolerant Network in which consumers and producers may be spatially and temporally decoupled, exchanging information with each other in an ad hoc fashion. To this end, we propose novel data gathering and dissemination strategies for use in urban-wide environments that do not rely on strict infrastructure mediation. While preserving the general aspects of our study and without loss of generality, we focus our attention on practical application scenarios that help us capture the characteristics of opportunistic communication networks.
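As a minimal sketch of the store-carry-forward exchange that underlies such opportunistic networks (the epidemic copy policy, buffer sizes, and message lifetimes below are illustrative assumptions, not the dissemination strategies actually proposed in this work):

```python
import itertools

class Node:
    """A mobile node that stores messages and forwards them on contact
    (epidemic store-carry-forward; an illustrative policy, not the
    strategies proposed in the thesis)."""
    _ids = itertools.count()

    def __init__(self, buffer_size=100):
        self.id = next(self._ids)
        self.buffer = {}            # message id -> (payload, expiry_time)
        self.buffer_size = buffer_size

    def create(self, msg_id, payload, expiry_time):
        self.buffer[msg_id] = (payload, expiry_time)

    def expire(self, now):
        # Data has local/temporal validity: drop messages past their lifetime.
        self.buffer = {m: v for m, v in self.buffer.items() if v[1] > now}

    def contact(self, other, now):
        """On an opportunistic encounter, exchange summary vectors and
        copy over the messages the peer is missing."""
        self.expire(now); other.expire(now)
        for m in set(self.buffer) - set(other.buffer):
            if len(other.buffer) < other.buffer_size:
                other.buffer[m] = self.buffer[m]
        for m in set(other.buffer) - set(self.buffer):
            if len(self.buffer) < self.buffer_size:
                self.buffer[m] = other.buffer[m]

# Two nodes meet; a locally generated message with bounded temporal
# validity spreads without any infrastructure mediation.
a, b = Node(), Node()
a.create("poi-42", "queue at the kiosk is short", expiry_time=30)
a.contact(b, now=10)
assert "poi-42" in b.buffer
```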
Abstract:
This thesis deals with the synthesis and characterization of polymers with redox-functional phenothiazine side chains. Phenothiazine and its derivatives are small redox units whose reversible redox behavior is associated with electrochromic properties. A special feature of phenothiazines is the formation of stable radical cations in the oxidized state. Phenothiazines can therefore act as bistable molecules, switching between two stable redox states; this switching process is accompanied by a color change.

This work describes the synthesis of novel phenothiazine polymers by radical polymerization. Phenothiazine derivatives were covalently attached to aliphatic and aromatic polymer chains via two different synthetic routes. The first route uses vinyl monomers with phenothiazine functionality for direct polymerization. The second route uses amine-modified phenothiazine derivatives to functionalize polymers bearing active-ester side chains in a polymer-analogous reaction.

Owing to their electron-donor properties, polymers with redox-functional phenothiazine side chains are suitable candidates for cathode materials. To assess their suitability, phenothiazine polymers were employed as electrode materials in lithium battery cells. The polymers showed good capacities of about 50-90 Ah/kg as well as fast charging in the battery cell; in particular, charging was 5-10 times faster than in conventional lithium batteries. With regard to the number of charge and discharge cycles, the polymers achieved good results in long-term stability tests, withstanding 500 charging cycles with only small changes in their initial charging times and capacities. The long-term stability is directly related to the stability of the radical. Stabilization of the radical cations was achieved by lengthening the side chain between the nitrogen atom of the phenothiazine and the polymer backbone. Such alkyl substitution increases radical stability through enhanced interaction with the aromatic ring and thus improves battery performance with respect to charge/discharge cycling stability.

Furthermore, the practical application of bistable phenothiazine polymers as a storage medium for high data densities was investigated. For this purpose, thin films of the polymer on conductive substrates were oxidized electrochemically. The electrochemical oxidation was carried out by atomic force microscopy in combination with conductive tips. With this technique it was possible to oxidize the polymer surface on the nanoscale and thereby change the local conductivity. Patterns of different sizes could be written lithographically and detected via the change in their conductivity. The writing process changed only the local conductivity, without affecting the topography of the polymer film. Moreover, the patterns proved to be particularly stable, both mechanically and over time.

Finally, new synthetic strategies were developed to produce surfaces that are both mechanically stable and redox-functional. Using surface-initiated atom transfer radical polymerization, grafted polymer brushes with redox-functional phenothiazine side chains were prepared and analyzed by X-ray methods and atomic force microscopy. One of the synthetic strategies starts from grafted active-ester brushes, which can subsequently be modified with redox-functional groups. This approach is particularly promising, as it allows different functional groups to be anchored to the active-ester brushes. In this way, by using cross-linking groups, the mechanical stability of such polymer films can be optimized alongside their redox properties.
Abstract:
In the last 10 years the number of mobile devices has grown rapidly. Each person typically carries at least two personal devices, and researchers say that in the near future this number could rise to as many as ten devices per person. Moreover, these devices are becoming more integrated into our lives than in the past, so the amount of data exchanged grows along with improvements in people's lifestyles. This is what researchers call the Internet of Things. In the future there will be more than 60 billion nodes, and the current infrastructure is not ready to keep track of all the data exchanged between them. Infrastructure improvements have therefore been proposed in recent years, such as Mobile IP and HIP, to facilitate the exchange of packets in mobility; however, none of them has been optimized for the purpose. In recent years, researchers at Mid Sweden University created the MediaSense framework. Initially, this framework was based on the Chord protocol to route packets in a large network, but the most important change has been the introduction of P-Grid to build the overlay and provide persistence. Thanks to this technology, a lookup in the trie takes on the order of 0.5*log(N) messages, where N is the total number of nodes in the network. This result could be improved by further optimizations in the management of the nodes, for example the dynamic creation of groups of nodes. Moreover, since the nodes move, underlying support for connectivity management is needed. SCTP has been selected as one of the most promising upcoming standards for managing multiple simultaneous connections.
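The 0.5*log(N) figure can be illustrated with a toy prefix-routing overlay in the spirit of P-Grid; the construction below is a simplified assumption for illustration, not MediaSense's actual implementation:

```python
import random

def build_nodes(depth):
    """Toy P-Grid-style overlay: each node is responsible for one binary
    path of length `depth` and, at every level, keeps one reference into
    the complementary subtree (illustrative construction)."""
    paths = [format(i, f"0{depth}b") for i in range(2 ** depth)]
    refs = {}
    for p in paths:
        # At level l, point to some node whose prefix flips bit l.
        refs[p] = [p[:l] + ("1" if p[l] == "0" else "0") +
                   "".join(random.choice("01") for _ in range(depth - l - 1))
                   for l in range(depth)]
    return refs

def lookup(refs, start, key):
    """Forward the query to the referenced node at the first mismatching
    bit until the responsible node is reached; returns the hop count."""
    node, hops = start, 0
    while node != key:
        l = next(i for i in range(len(key)) if node[i] != key[i])
        node, hops = refs[node][l], hops + 1
    return hops

random.seed(1)
depth = 10                       # N = 2**10 = 1024 nodes
refs = build_nodes(depth)
paths = list(refs)
trials = [lookup(refs, random.choice(paths), random.choice(paths))
          for _ in range(5000)]
print(sum(trials) / len(trials))  # empirically close to 0.5 * log2(N) = 5
```

Each hop fixes at least one more prefix bit, and the randomly chosen reference already agrees with the key on about half of the remaining bits, which is where the factor 0.5 comes from.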
A river runs through it - ancient DNA data on the Neolithic populations of the Great Hungarian Plain
Abstract:
This thesis was part of a multidisciplinary research project funded by the German Research Foundation (“Bevölkerungsgeschichte des Karpatenbeckens in der Jungsteinzeit und ihr Einfluss auf die Besiedlung Mitteleuropas”, grant no. Al 287/10-1) aimed at elucidating the population history of the Carpathian Basin during the Neolithic. The Carpathian Basin was an important waypoint on the spread of the Neolithic from southeastern to central Europe. On the Great Hungarian Plain (Alföld), the first farming communities appeared around 6000 cal BC. They belonged to the Körös culture, which derived from the Starčevo-Körös-Criş complex in the northern Balkans. Around 5600 cal BC the Alföld-Linearbandkeramik (ALBK), so called due to its stylistic similarities with the Transdanubian and central European LBK, emerged in the northwestern Alföld. Following a short “classical phase”, the ALBK split into several regional subgroups during its later stages, but did not expand beyond the Great Hungarian Plain. Marking the beginning of the late Neolithic period, the Tisza culture first appeared in the southern Alföld around 5000 cal BC and subsequently spread into the central and northern Alföld. Together with the Herpály and Csőszhalom groups it was an integral part of the late Neolithic cultural landscape of the Alföld. Up until now, the Neolithic cultural succession on the Alföld has been almost exclusively studied from an archaeological point of view, while very little is known about the population genetic processes during this time period. The aim of this thesis was to perform ancient DNA (aDNA) analyses on human samples from the Alföld Neolithic and analyse the resulting mitochondrial population data to address the following questions: is there population continuity between the Central European Mesolithic hunter-gatherer metapopulation and the first farming communities on the Alföld? Is there genetic continuity from the early to the late Neolithic? Are there genetic as well as cultural differences between the regional groups of the ALBK? Additionally, the relationships between the Alföld and the neighbouring Transdanubian Neolithic as well as other European early farming communities were evaluated to gain insights into the genetic affinities of the Alföld Neolithic in a larger geographic context. A total of 320 individuals were analysed for this study; reproducible mitochondrial haplogroup information (HVS-I and/or SNP data) could be obtained from 242 Neolithic individuals. According to the analyses, population continuity between hunter-gatherers and the Neolithic cultures of the Alföld can be excluded at any stage of the Neolithic. In contrast, there is strong evidence for population continuity from the early to the late Neolithic. All cultural groups on the Alföld were heavily shaped by the genetic substrate introduced into the Carpathian Basin during the early Neolithic by the Körös and Starčevo cultures. Accordingly, genetic differentiation between regional groups of the ALBK is not very pronounced. The Alföld cultures are furthermore genetically highly similar to the Transdanubian Neolithic cultures, probably due to common ancestry. In the wider European context, the Alföld Neolithic cultures are also highly similar to the central European LBK, while they differ markedly from contemporaneous populations of the Iberian Peninsula and Ukraine.
Thus, the Körös culture, the ALBK and the Tisza culture can be regarded as part of a “genetic continuum” that links the Neolithic Carpathian Basin to central Europe and likely has its roots in the Starčevo-Körös-Criş complex of the northern Balkans.
Abstract:
Obesity is a multifactorial trait and an independent risk factor for cardiovascular disease (CVD). The aim of the current work is to study the complex etiology underlying obesity and to identify genetic variations and/or nutrition-related factors that contribute to its variability. To this end, a set of more than 2300 white subjects who participated in a nutrigenetics study was used. For each subject, a total of 63 factors were recorded, describing genetic variants related to CVD (24 in total), gender, and nutrition (38 in total), e.g. average daily intake of calories and cholesterol. Each subject was categorized according to body mass index (BMI) as normal (BMI ≤ 25) or overweight (BMI > 25). Two artificial neural network (ANN) based methods were designed and used for the analysis of the available data: i) a multi-layer feed-forward ANN combined with a parameter-decreasing method (PDM-ANN), and ii) a multi-layer feed-forward ANN trained by a hybrid method (GA-ANN) combining genetic algorithms and the popular back-propagation training algorithm.
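As a hedged sketch of the back-propagation component shared by both methods (the single hidden layer, the synthetic data, and the hyperparameters are assumptions; the parameter-decreasing and genetic-algorithm parts are not shown):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for the study data: 63 inputs (genetic + nutrition factors),
# binary target (BMI > 25). Only the dimensions mirror the abstract; the
# data themselves are synthetic.
X = rng.normal(size=(200, 63))
y = (X[:, :5].sum(axis=1) > 0).astype(float)     # synthetic label

n_hidden = 16
W1 = rng.normal(scale=0.1, size=(63, n_hidden)); b1 = np.zeros(n_hidden)
W2 = rng.normal(scale=0.1, size=(n_hidden, 1));  b2 = np.zeros(1)

sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

for epoch in range(500):                     # plain back-propagation
    h = np.tanh(X @ W1 + b1)                 # hidden layer
    p = sigmoid(h @ W2 + b2).ravel()         # P(overweight)
    grad_out = (p - y)[:, None] / len(y)     # cross-entropy gradient
    grad_h = grad_out @ W2.T * (1 - h ** 2)  # back-propagate through tanh
    W2 -= 0.5 * h.T @ grad_out; b2 -= 0.5 * grad_out.sum(0)
    W1 -= 0.5 * X.T @ grad_h;   b1 -= 0.5 * grad_h.sum(0)

print("training accuracy:", ((p > 0.5) == y).mean())
```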
Abstract:
The occupant impact velocity (OIV) and acceleration severity index (ASI) are competing measures of crash severity used to assess occupant injury risk in full-scale crash tests involving roadside safety hardware, e.g. guardrail. Delta-V, or the maximum change in vehicle velocity, is the traditional metric of crash severity for real world crashes. This study compares the ability of the OIV, ASI, and delta-V to discriminate between serious and non-serious occupant injury in real world frontal collisions. Vehicle kinematics data from event data recorders (EDRs) were matched with detailed occupant injury information for 180 real world crashes. Cumulative probability of injury risk curves were generated using binary logistic regression for belted and unbelted data subsets. By comparing the available fit statistics and performing a separate ROC curve analysis, the more computationally intensive OIV and ASI were found to offer no significant predictive advantage over the simpler delta-V.
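For readers unfamiliar with the metrics: ASI is conventionally defined (e.g., in EN 1317) as the peak of the normalized 50 ms moving-average accelerations, and delta-V is read off the EDR velocity trace. The sketch below follows those standard definitions; the limit values and the synthetic pulse are assumptions, not this study's data or code:

```python
import numpy as np

def asi(ax, ay, az, dt, window=0.05, limits=(12.0, 9.0, 10.0)):
    """Acceleration Severity Index: peak over time of the normalized
    50 ms moving-average accelerations (in g). Limit values 12/9/10 g
    follow the usual EN 1317 definition; this is a sketch, not the
    study's exact implementation."""
    n = max(1, int(round(window / dt)))
    kernel = np.ones(n) / n
    avg = [np.convolve(a, kernel, mode="valid") for a in (ax, ay, az)]
    ratios = sum((a / lim) ** 2 for a, lim in zip(avg, limits))
    return float(np.sqrt(ratios).max())

def delta_v(vx, vy):
    """Delta-V from an EDR velocity trace: maximum change in velocity
    relative to impact onset."""
    dv = np.hypot(vx - vx[0], vy - vy[0])
    return float(dv.max())

# Synthetic 100 Hz frontal crash pulse, for illustration only.
t = np.arange(0, 0.30, 0.01)
ax = -20.0 * np.exp(-((t - 0.08) / 0.04) ** 2)   # longitudinal accel, in g
ay = np.zeros_like(t); az = np.zeros_like(t)
vx = 15.0 + np.cumsum(ax * 9.81 * 0.01)          # integrate pulse, m/s
print("ASI:", asi(ax, ay, az, dt=0.01))
print("delta-V:", delta_v(vx, np.zeros_like(vx)), "m/s")
```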
Abstract:
Jahnke and Asher explore workflows and methodologies at a variety of academic data curation sites, and Keralis delves into the academic milieu of library and information schools that offer instruction in data curation. Their conclusions point to the urgent need for a reliable and increasingly sophisticated professional cohort to support data-intensive research in our colleges, universities, and research centers.
Abstract:
Objectives: Previous research conducted in the late 1980s suggested that vehicle impacts following an initial barrier collision increase severe occupant injury risk. Now over 25 years old, those data are no longer representative of currently installed barriers or the present US vehicle fleet. The purpose of this study is to provide a present-day assessment of secondary collisions and to determine whether current full-scale barrier crash testing criteria provide an indication of secondary collision risk for real-world barrier crashes. Methods: To characterize secondary collisions, 1,363 (596,331 weighted) real-world barrier midsection impacts selected from 13 years (1997-2009) of in-depth crash data available through the National Automotive Sampling System (NASS) / Crashworthiness Data System (CDS) were analyzed. Scene diagrams and available scene photographs were used to determine roadside- and barrier-specific variables unavailable in NASS/CDS. Binary logistic regression models were developed for second event occurrence and resulting driver injury. To investigate current secondary collision crash test criteria, 24 full-scale crash test reports were obtained for common non-proprietary US barriers, and the risk of secondary collisions was determined using recommended evaluation criteria from National Cooperative Highway Research Program (NCHRP) Report 350. Results: Secondary collisions were found to occur in approximately two thirds of crashes where a barrier is the first object struck. Barrier lateral stiffness, post-impact vehicle trajectory, vehicle type, and pre-impact tracking conditions were found to be statistically significant contributors to secondary event occurrence. The presence of a second event was found to increase the likelihood of a serious driver injury by a factor of 7 compared to cases with no second event. The NCHRP Report 350 exit angle criterion was found to underestimate the risk of secondary collisions in real-world barrier crashes. Conclusions: Consistent with previous research, collisions following a barrier impact are not infrequent and substantially increase driver injury risk. The results suggest that using exit-angle-based crash test criteria alone is not sufficient to predict second collision occurrence for real-world barrier crashes.
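As a hedged illustration of the modeling step (synthetic data with invented effect sizes; the study's actual predictors and coefficients are in the paper):

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(42)

# Synthetic stand-in for the NASS/CDS sample: the predictors follow two of
# the factors the study reports as significant, but the data and effect
# sizes here are invented for illustration.
n = 1000
stiff = rng.integers(0, 2, n)       # 1 = rigid barrier, 0 = flexible
tracking = rng.integers(0, 2, n)    # 1 = vehicle tracking pre-impact
lin = -0.5 + 1.0 * stiff - 0.8 * tracking
second_event = (rng.random(n) < 1 / (1 + np.exp(-lin))).astype(int)

X = sm.add_constant(np.column_stack([stiff, tracking]))
model = sm.Logit(second_event, X).fit(disp=False)

# exp(beta) gives the odds ratio per predictor -- the same kind of quantity
# behind the reported factor-of-7 increase in serious-injury odds.
print(np.exp(model.params))
```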
Abstract:
Background: The release of quality data from acute care hospitals to the general public aims to inform the public, to provide transparency, and to foster quality-based competition among providers. Due to the expected mechanisms of action, and possibly adverse consequences, of public quality comparison, it is a controversial topic. The perspective of physicians and nurses is of particular importance in this context: they are mainly responsible for the collection of quality-control data and are directly confronted with the results of public comparison. The research focus of this qualitative study was to discover the views and opinions of Swiss physicians and nurses regarding these issues. It was investigated how the two professional groups appraised the opportunities as well as the risks of the release of quality data in Switzerland. Methods: A qualitative approach was chosen to answer the research question. For data collection, four focus groups were conducted with physicians and nurses employed in Swiss acute care hospitals. Qualitative content analysis was applied to the data. Results: The results revealed that both occupational groups had a very critical and negative attitude regarding the recent developments. The perceived risks dominated their view. In summary, their main concerns were: the reduction of complexity, the one-sided focus on measurable quality variables, risk selection, the threat of data manipulation, and the abuse of published information by the media. An additional concern was that the impression is given that the complex construct of quality can be reduced to a few key figures, conveying a false message that then influences society and politics. This critical attitude is associated with the value system and professional self-concept of both physicians and nurses, which differ from the underlying principles of a market-based economy and the economic orientation of the health care business. Conclusions: The critical and negative attitude of Swiss physicians and nurses must under all circumstances be heeded and investigated regarding its impact on work motivation and identification with the profession. At the same time, the two professional groups are obliged to reflect upon their critical attitude and take a proactive role in the development of appropriate quality indicators for the publication of quality data in Switzerland.
Abstract:
We investigated how well structural features such as note density or the relative number of changes in the melodic contour could predict success in implicit and explicit memory for unfamiliar melodies. We also analyzed which features are more likely to elicit increasingly confident judgments of "old" in a recognition memory task. An automated analysis program computed structural aspects of melodies, both independent of any context and with reference to the other melodies in the test set and the parent corpus of pop music. A few features predicted success in both memory tasks, which points to a shared memory component. However, motivic complexity compared to a large corpus of pop music had different effects on explicit and implicit memory. We also found that just a few features are associated with different rates of "old" judgments, whether the items were old or new. Rarer motives relative to the test set predicted hits, and rarer motives relative to the corpus predicted false alarms. This data-driven analysis provides further support for both shared and separable mechanisms in implicit and explicit memory retrieval, as well as for the role of distinctiveness in true and false judgments of familiarity.
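As a rough sketch of what such context-independent structural features might look like (the definitions below are plausible reconstructions, not the study's exact feature set):

```python
# Hypothetical reconstructions of two structural features named in the
# abstract; the study's automated analysis may define them differently.
def note_density(onsets_sec: list[float]) -> float:
    """Notes per second over the melody's duration."""
    span = max(onsets_sec) - min(onsets_sec)
    return len(onsets_sec) / span if span > 0 else float(len(onsets_sec))

def contour_change_rate(pitches: list[int]) -> float:
    """Relative number of changes in melodic direction (up/down/flat)."""
    directions = [(b > a) - (b < a) for a, b in zip(pitches, pitches[1:])]
    changes = sum(1 for d1, d2 in zip(directions, directions[1:]) if d1 != d2)
    return changes / max(1, len(directions) - 1)

melody = [60, 62, 64, 62, 60, 65, 64]            # MIDI pitches
onsets = [0.0, 0.5, 1.0, 1.5, 2.0, 2.5, 3.0]     # seconds
print(note_density(onsets), contour_change_rate(melody))
```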
Abstract:
The new knowledge environments of the digital age are often described as places where we are all closely read, with our buying habits, location, and identities available to advertisers, online merchants, the government, and others through our use of the Internet. This is represented as a loss of privacy in which these entities learn about our activities and desires using means that were unavailable in the pre-digital era. This article argues that the reciprocal nature of digital networks means 1) that the privacy issues we face online are not radically different from those of the pre-Internet era, and 2) that we need to reconceive of close reading as an activity of which both humans and computer algorithms are capable.
Abstract:
Successful software systems cope with complexity by organizing classes into packages. However, a particular organization may be neither straightforward nor obvious for a given developer. As a consequence, classes can be misplaced, leading to duplicated code and ripple effects in which minor changes affect multiple packages. We claim that contextual information is the key to re-architecting a system. Exploiting contextual information, we propose a technique to detect misplaced classes by analyzing how client packages access the classes of a given provider package. We define locality as a measure of the degree to which classes reused by common clients appear in the same package. We then use locality to guide a simulated annealing algorithm to obtain optimal placements of classes in packages. The result is the identification of classes that are candidates for relocation. We apply the technique to three applications and validate the usefulness of our approach via developer interviews.
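A minimal sketch of the idea, assuming a simplified locality measure and a standard annealing schedule (the paper's exact definitions may differ):

```python
import math, random

# Toy reconstruction: "locality" rewards placements in which classes used
# by the same client packages share a package.
def locality(placement, client_usage):
    """Fraction of client->class accesses that land in the package
    holding the majority of that client's accessed classes."""
    score, total = 0, 0
    for used in client_usage.values():           # classes one client uses
        pkgs = [placement[c] for c in used]
        score += max(pkgs.count(p) for p in set(pkgs))
        total += len(pkgs)
    return score / total

def anneal(placement, client_usage, packages, steps=20000, t0=1.0):
    cur = dict(placement)
    cur_score = locality(cur, client_usage)
    for step in range(steps):
        t = t0 * (1 - step / steps)              # linear cooling
        cand = dict(cur)
        cls = random.choice(list(cand))
        cand[cls] = random.choice(packages)      # move one class
        s = locality(cand, client_usage)
        # Accept improvements, and worsenings with temperature-scaled odds.
        if s > cur_score or random.random() < math.exp((s - cur_score) / max(t, 1e-9)):
            cur, cur_score = cand, s
    return cur, cur_score

placement = {"A": "p1", "B": "p2", "C": "p1"}    # class -> package
clients = {"client1": ["A", "B"], "client2": ["A", "B", "C"]}
random.seed(0)
best, score = anneal(placement, clients, packages=["p1", "p2"])
print(best, score)
```

Classes whose package changed between the initial and the annealed placement are the relocation candidates.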
Abstract:
Community research fatigue has been understudied within the context of community-university relationships and knowledge production. Community-based research (CBR), often occurring within a limited geography and population, increases the possibility that community members feel exhausted or overwhelmed by university research, particularly when they do not see tangible results from research activities. Prompted by informal stories of research fatigue from community members, a small graduate student team sought to understand the extent to which community members experienced research fatigue and what factors contributed to or relieved feelings of research fatigue. To explore these dimensions of research fatigue, semi-structured, face-to-face interviews were conducted with 21 participants, including community members (n = 9), staff and faculty (n = 10), and students (n = 2). The objective of the research was to identify university practices that contribute to research fatigue and ways to address the issue at the university level. Qualitative data analysis revealed several important actionable findings concerning the structure and conduct of community-based research, structured reciprocity and impact, and the role of trust in research. The study's findings are used to assess the quality of Clark University's research relationship with its adjacent community. Recommendations are offered, such as improving partnerships, increasing the impact of CBR, and developing clear principles of practice.
Abstract:
Motivation: Array CGH technologies enable the simultaneous measurement of DNA copy number for thousands of sites on a genome. We developed the circular binary segmentation (CBS) algorithm to divide the genome into regions of equal copy number (Olshen {\it et~al}, 2004). The algorithm tests for change-points using a maximal $t$-statistic with a permutation reference distribution to obtain the corresponding $p$-value. The number of computations required for the maximal test statistic is $O(N^2)$, where $N$ is the number of markers. This makes the full permutation approach computationally prohibitive for the newer arrays that contain tens of thousands of markers and highlights the need for a faster algorithm. Results: We present a hybrid approach to obtain the $p$-value of the test statistic in linear time. We also introduce a rule for stopping early when there is strong evidence for the presence of a change. We show through simulations that the hybrid approach provides a substantial gain in speed with only a negligible loss in accuracy and that the stopping rule further increases speed. We also present the analysis of array CGH data from a breast cancer cell line to show the impact of the new approaches on the analysis of real data. Availability: An R (R Development Core Team, 2006) version of the CBS algorithm has been implemented in the ``DNAcopy'' package of the Bioconductor project (Gentleman {\it et~al}, 2004). The proposed hybrid method for the $p$-value is available in version 1.2.1 or higher, and the stopping rule for declaring a change early is available in version 1.5.1 or higher.
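As a rough illustration of the permutation step whose cost the hybrid method avoids (a simplified single-split statistic with assumed unit variance, rather than the full circular two-split scan):

```python
import numpy as np

def max_t_stat(x):
    """Maximal standardized mean-difference statistic over all binary
    splits of the sequence; the circular version scans split *pairs*,
    which is where the O(N^2) cost comes from."""
    n = len(x)
    total = x.sum()
    csum = np.cumsum(x)
    best = 0.0
    for i in range(1, n):
        m1, m2 = csum[i - 1] / i, (total - csum[i - 1]) / (n - i)
        t = abs(m1 - m2) * np.sqrt(i * (n - i) / n)
        best = max(best, t)
    return best

def permutation_pvalue(x, n_perm=1000, seed=0):
    """Permutation reference distribution for the maximal statistic; the
    hybrid method in the paper replaces this full scan with a tail
    approximation to reach linear time."""
    rng = np.random.default_rng(seed)
    observed = max_t_stat(x)
    hits = sum(max_t_stat(rng.permutation(x)) >= observed
               for _ in range(n_perm))
    return (hits + 1) / (n_perm + 1)

# Synthetic sequence with one copy-number shift at the midpoint.
x = np.concatenate([np.zeros(50), np.ones(50) * 0.8]) + \
    np.random.default_rng(1).normal(size=100)
print(permutation_pvalue(x, n_perm=200))
```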