921 results for Processing wikipedia data
Abstract:
In recent years, advanced metering infrastructure (AMI) has been a main research focus because the traditional power grid has become too restricted to meet development requirements. There has been an ongoing effort to increase the number of AMI devices that provide real-time data readings to improve system observability. AMI deployed across distribution secondary networks provides load and consumption information for individual households, which can improve grid management. The significant upgrade costs associated with retrofitting existing meters with network-capable sensing can be reduced by using image processing methods to extract usage information from images of the existing meters. This thesis presents a new solution that exchanges power consumption information online with a cloud server without modifying the existing electromechanical analog meters. In this framework, a systematic approach to extracting energy data from images replaces the manual reading process. A case study compares the digital imaging approach to the averages determined by visual readings over a one-month period.
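As a rough illustration of the image-based reading step (not the thesis' actual pipeline), the following sketch estimates the digit shown by a single dial of an electromechanical meter. It assumes OpenCV and NumPy, a pre-located dial center and radius, and a standard clockwise 0-9 dial face; all of these are assumptions made only for the example.

```python
# Hypothetical sketch of reading one analog meter dial from a photo.
import cv2
import numpy as np

def read_dial(image_path, dial_center, dial_radius):
    """Estimate the digit (0-9) indicated by a single analog meter dial."""
    gray = cv2.cvtColor(cv2.imread(image_path), cv2.COLOR_BGR2GRAY)
    x, y, r = dial_center[0], dial_center[1], dial_radius
    # Crop the dial and binarize so the dark pointer stands out from the face.
    roi = gray[y - r:y + r, x - r:x + r]
    _, mask = cv2.threshold(roi, 0, 255, cv2.THRESH_BINARY_INV + cv2.THRESH_OTSU)
    ys, xs = np.nonzero(mask)
    if len(xs) == 0:
        return None
    # The pointer tip is taken as the dark pixel farthest from the dial center.
    dist = (xs - r) ** 2 + (ys - r) ** 2
    tip_x, tip_y = xs[dist.argmax()], ys[dist.argmax()]
    # Angle measured clockwise from 12 o'clock; each digit spans 36 degrees.
    angle = (np.degrees(np.arctan2(tip_x - r, r - tip_y)) + 360) % 360
    return int(angle // 36) % 10
```

A complete framework would repeat this per dial, assemble the digits into a register reading, and upload the result to the cloud server.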
Abstract:
The ability of cryogenic photonic crystals to carry out high-performance microwave signal processing operations has been developed into systems that can: rapidly record broadband microwave spectra with fine resolution and high dynamic range; search for patterns in 40-gigabit-per-second data streams; and communicate via spread-spectrum signals that are well below the noise floor. The basic concepts of the technology and its many applications, along with an overview of university-industry partnerships and the growing photonics industry in Bozeman, will be presented.
Abstract:
Methods for optical motion capture often require time-consuming manual processing before the data can be used for subsequent tasks such as retargeting or character animation. These processing steps restrict the applicability of motion capture, especially for dynamic VR environments with real-time requirements. To solve these problems, we present two additional, fast and automatic processing stages based on our motion capture pipeline presented in [HSK05]. A normalization step aligns the recorded coordinate systems with the skeleton structure to yield a common and intuitive data basis across different recording sessions. A second step computes a parameterization based on automatically extracted main movement axes to generate a compact motion description. Our method restricts neither the placement of marker bodies nor the recording setup, and only requires a short calibration phase.
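The abstract does not detail how the main movement axes are extracted; the sketch below shows one plausible interpretation using plain NumPy: a PCA over the stacked marker trajectories yields the dominant movement axes and a compact per-frame parameterization. The array shapes and function names are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def main_movement_axes(trajectories, n_axes=3):
    """trajectories: (frames, markers * 3) array of recorded marker positions."""
    centered = trajectories - trajectories.mean(axis=0)
    # The main movement axes are the principal components of the motion data.
    _, s, vt = np.linalg.svd(centered, full_matrices=False)
    axes = vt[:n_axes]                 # (n_axes, markers * 3) movement axes
    params = centered @ axes.T         # compact per-frame motion description
    explained = (s[:n_axes] ** 2) / (s ** 2).sum()
    return axes, params, explained

# Example: reduce a 200-frame recording of 20 markers to 3 parameters per frame.
motion = np.random.rand(200, 20 * 3)
axes, params, explained = main_movement_axes(motion)
```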
Abstract:
Applying location-focused data protection law within the context of a location-agnostic cloud computing framework is fraught with difficulties. While the Proposed EU Data Protection Regulation has introduced many changes to the current data protection framework, the complexities of data processing in the cloud involve multiple layers of actors and intermediaries that have not been properly addressed. This leaves gaps in the regulation when it is analyzed in cloud scenarios. This paper gives a brief overview of the relevant provisions of the regulation that will have an impact on cloud transactions and addresses the missing links. It is hoped that these loopholes will be reconsidered before the final version of the law is passed, in order to avoid unintended consequences.
Abstract:
In this paper, we investigate content-centric data transmission in the context of short opportunistic contacts and base our work on an existing content-centric networking architecture. In the case of short interconnection times, file transfers may not be completed and the received information is discarded. Caches in content-centric networks are used for short-term storage and do not guarantee persistence. We implemented a mechanism that extends caching to persistent storage, enabling the completion of disrupted content transfers. The mechanisms have been implemented in the CCNx framework and evaluated on wireless mesh nodes. Our evaluations using multicast and unicast communication show that the implementation can support content transfers in opportunistic environments without significant processing and storage overhead.
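The described mechanism is implemented inside the CCNx framework; the stand-alone Python sketch below only models the underlying idea of parking received chunks on persistent storage so a transfer cut short by a brief contact can be resumed and completed later. The class, file layout, and chunk-naming scheme are invented for illustration.

```python
import os

class PersistentChunkCache:
    """Keeps received chunks of a named content object on disk so that a
    transfer interrupted by a short contact can be completed later."""

    def __init__(self, directory):
        self.directory = directory
        os.makedirs(directory, exist_ok=True)

    def _chunk_path(self, name, index):
        # One file per chunk; the content name is flattened into the file name.
        return os.path.join(self.directory, name.replace("/", "_") + f".{index}")

    def store_chunk(self, name, index, data):
        with open(self._chunk_path(name, index), "wb") as f:
            f.write(data)

    def missing_chunks(self, name, total_chunks):
        """Chunk indices still to be requested at the next opportunistic contact."""
        return [i for i in range(total_chunks)
                if not os.path.exists(self._chunk_path(name, i))]

    def assemble(self, name, total_chunks):
        """Return the complete object once every chunk is on persistent storage."""
        if self.missing_chunks(name, total_chunks):
            return None
        return b"".join(open(self._chunk_path(name, i), "rb").read()
                        for i in range(total_chunks))
```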
Abstract:
The Gravity field and steady-state Ocean Circulation Explorer (GOCE) has now been in orbit for more than four years. This is longer than the originally planned lifetime of the satellite, and after three years at the same altitude the satellite has been lowered to 235 km in several steps. Within the GOCE High-level Processing Facility (HPF), the Astronomical Institute of the University of Bern (AIUB) is responsible for the determination of the official Precise Science Orbit (PSO) product. Kinematic GOCE orbits are part of this product and are used by several institutions inside and outside the HPF for determining the low degrees of the Earth’s gravity field. AIUB GOCE GPS-only gravity field solutions using the Celestial Mechanics Approach, covering the Release 4 period as well as a more recent time interval at the lower orbit altitude, are shown and discussed. Special attention is paid to the impact of systematic deficiencies in the kinematic orbits on the resulting gravity fields, e.g., those related to the geomagnetic equator, and to possibilities for removing them.
Abstract:
This paper considers a framework where data from correlated sources are transmitted with the help of network coding in ad hoc network topologies. The correlated data are encoded independently at the sensors, and network coding is employed at the intermediate nodes in order to improve data delivery performance. In such settings, we focus on the problem of reconstructing the sources at the decoder when perfect decoding is not possible due to losses or bandwidth variations. We show that source data similarity can be exploited at the decoder to permit decoding based on a novel and simple approximate decoding scheme. We analyze the influence of the network coding parameters, and in particular the size of the finite coding field, on the decoding performance. We further determine the optimal field size that maximizes the expected decoding performance as a trade-off between the information loss incurred by limiting the resolution of the source data and the error probability in the reconstructed data. Moreover, we show that the performance of approximate decoding improves when the accuracy of the source model increases, even with simple approximate decoding techniques. We provide illustrative examples showing how the proposed algorithm can be deployed in sensor networks and distributed imaging applications.
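As a toy illustration of the field-size trade-off described above (not the paper's analysis), the sketch below combines a uniform-quantizer resolution loss with an assumed per-bit symbol error model and searches for the field size GF(q) that minimizes the expected distortion. Both the error model and the constants are assumptions made only for the example.

```python
import numpy as np

def expected_distortion(q, source_std=1.0, err_rate_per_bit=0.01):
    """Expected squared error when sources are quantized to GF(q) symbols."""
    bits = np.log2(q)
    # Resolution loss: uniform quantizer with q levels over roughly +/- 3 sigma.
    step = 6.0 * source_std / q
    quant_mse = step ** 2 / 12.0
    # Decoding-error loss: symbol error probability grows with bits per symbol
    # (illustrative model); an erroneous symbol is treated as uniformly wrong.
    p_err = 1.0 - (1.0 - err_rate_per_bit) ** bits
    err_mse = p_err * (6.0 * source_std) ** 2 / 12.0
    return quant_mse + err_mse

field_sizes = [2 ** b for b in range(1, 11)]
for q in field_sizes:
    print(f"q = {q:4d}  expected distortion = {expected_distortion(q):.4f}")
print("best field size under this model:", min(field_sizes, key=expected_distortion))
```

Under this model the distortion first drops as the field grows (finer resolution) and then rises again as decoding errors dominate, which is the qualitative trade-off the paper optimizes.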
Abstract:
Traditionally, ontologies describe knowledge representation in a denotational, formalized, and deductive way. In this paper, we propose in addition a semiotic, inductive, and approximate approach to ontology creation. We define a conceptual framework, a semantics extraction algorithm, and a first proof of concept applying the algorithm to a small set of Wikipedia documents. Intended as an extension to the prevailing top-down ontologies, we introduce an inductive fuzzy grassroots ontology, which organizes itself organically from existing natural language Web content. Using inductive and approximate reasoning to reflect the natural way in which knowledge is processed, the ontology’s bottom-up build process creates emergent semantics learned from the Web. By this means, the ontology acts as a hub for computing with words described in natural language. For Web users, the structural semantics are visualized as inductive fuzzy cognitive maps, allowing an initial form of intelligence amplification. Finally, we present an implementation of our inductive fuzzy grassroots ontology. Thus, this paper contributes an algorithm for the extraction of fuzzy grassroots ontologies from Web data by inductive fuzzy classification.
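The paper's semantics extraction algorithm is not reproduced here; as a loose illustration of an inductive, bottom-up step, the sketch below derives graded term-to-term association degrees from a few Wikipedia article texts via normalized co-occurrence. Such fuzzy degrees could serve as edge weights of an inductive fuzzy cognitive map; the tokenization and threshold are arbitrary assumptions.

```python
from collections import Counter
from itertools import combinations
import re

def fuzzy_associations(documents, min_degree=0.2):
    """Graded term-to-term association degrees in [0, 1] from raw article texts."""
    term_freq, pair_freq = Counter(), Counter()
    for text in documents:
        terms = set(re.findall(r"[a-z]{4,}", text.lower()))
        term_freq.update(terms)
        pair_freq.update(frozenset(p) for p in combinations(sorted(terms), 2))
    edges = {}
    for pair, joint in pair_freq.items():
        a, b = tuple(pair)
        # Asymmetric fuzzy degree: how strongly term a suggests term b, and vice versa.
        edges[(a, b)] = joint / term_freq[a]
        edges[(b, a)] = joint / term_freq[b]
    return {edge: degree for edge, degree in edges.items() if degree >= min_degree}

# Example on two tiny texts; real input would be Wikipedia article bodies.
docs = ["Fuzzy logic extends boolean logic with degrees of truth.",
        "Fuzzy classification assigns graded membership degrees to objects."]
print(fuzzy_associations(docs))
```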
Abstract:
Recognizing the increasing amount of information shared on Social Networking Sites (SNS), in this study we explore the information processing strategies of users on Facebook. Specifically, we investigate the impact of various factors on user attitudes towards the posts on their Newsfeed. To collect the data, we programmed a Facebook application that allows users to evaluate posts in real time. Applying Structural Equation Modeling to a sample of 857 observations, we find that it is mostly the affective attitude that shapes user behavior on the network. This attitude, in turn, is mainly determined by the communication intensity between users, outweighing the comprehensibility of the post, while post length and user posting frequency play almost no role.
Abstract:
Human pro-TNF-$\alpha$ is a 26 kDa type II transmembrane protein and the precursor of the 17 kDa mature TNF. Mature TNF is released from the extracellular domain of pro-TNF by proteolytic cleavage between residues Ala ($-$1) and Val (+1). Both forms of TNF are biologically active, and the native form of mature TNF is a bell-shaped trimer. The structure of pro-TNF was studied by chemical crosslinking both in an intact cell system and in an in vitro translation system. We found that human pro-TNF exists as a trimer in intact cells (LPS-induced THP-1 cells and TNF cDNA-transfected COS-3 cells) and that this trimeric structure is assembled intracellularly, possibly in the ER. By analyzing several deletion mutants, we observed a correlation between the expression of pro-TNF cytotoxicity in a juxtacrine fashion and detection of the trimer, suggesting that the trimeric structure is very important for its biological activity. With a series of deletion mutants in the linking domain, we found that small deletions did not block the cleavage whereas large deletions did, regardless of the presence or absence of the native cleavage site. This suggests that the length of the residues between the plasma membrane and the base of the trimer determines the rate of cleavage, possibly by limiting the accessibility of the cleavage enzyme to its action site. Our data also suggest that the native cleavage site is not sufficient for the release of mature TNF and that alternative cleavage site(s) exist.
Abstract:
Wilms tumor (WT) is an embryonal renal tumor with a heterogeneous genetic etiology that serves as a valuable model for studying tumorigenesis. Biallelic inactivation of the tumor suppressor gene WT1, a zinc-finger transcriptional regulator located at 11p13, is critical for the development of some Wilms tumors. Interestingly, WT1 genomic analysis has demonstrated mutations in less than 20% of WT cases. This suggests either that other genes play a more major role in Wilms tumorigenesis or that WT1 is functionally altered by mechanisms other than DNA mutation. Previous observations in rat and in WT xenograft cell lines have suggested that abnormal WT1 RNA processing (exon 6 RNA editing and aberrant exon 2 splicing, respectively) is a potential mechanism of altering WT1 function in the absence of a WT1 DNA mutation. However, the role of this abnormal RNA processing has not previously been assessed in primary Wilms tumors. To test the hypothesis that abnormal WT1 RNA processing is a mechanism of WT1 alteration during tumor development, WT1 RNA from 85 primary tumors was analyzed using reverse transcription and polymerase chain reaction amplification (RT-PCR). Although no evidence for WT1 RNA editing was observed, variable levels (5% to 50%) of aberrant WT1 exon 2 splicing were detected in 11 tumors in the absence of a detectable WT1 DNA mutation. In addition, alteration of normal WT1 alternative splicing, observed as RNA isoform loss, was detected in five tumors with no apparent WT1 genomic alteration, although no consistent pattern of RNA isoform loss was detected. This abnormal WT1 splicing, detected either as loss of exon 2 from some of the transcripts or as loss of RNA isoforms, is statistically correlated with relapse (p = 0.005). These studies demonstrate that abnormal WT1 RNA processing is not a common mechanism of abrogating normal WT1 function in primary tumors. However, in those cases in which abnormal WT1 splicing is present, these data indicate that it may serve as a useful prognostic marker for relapse in WT patients.
Abstract:
We report a trace element and Pb isotope analysis (LIA) database on the “Singen Copper”, a peculiar type of copper found in the North Alpine realm, from its type locality, the Early Bronze Age Singen cemetery (Germany). What distinguishes “Singen Copper” from other coeval copper types? (i) Is it a discrete metal lot with a uniform provenance (and if so, can its provenance be constrained)? (ii) Was it manufactured by a special, unique metallurgical process that can be discriminated from others? Trace element concentrations can give clues on the ore types that were mined, but they can be modified (more or less intentionally) by metallurgical operations. More robust indicators are the ratios of chemically similar elements (e.g. Co/Ni, Bi/Sb), since they should remain nearly constant during metallurgical operations and are expected to behave homogeneously in each mineral of a given mining area; their partition amongst the different mineral species is, however, known to cause strong inter-element fractionations. We tested the trace element ratio pattern predicted by geochemical arguments on the Brixlegg mining area. Brixlegg itself is not compatible with the Singen Copper objects, and we only report it because it is a rare instance of a mining area for which sufficient trace element analyses are available in the literature. We observe that As/Sb in fahlerz varies by a factor of 1.8 above/below the median, whereas As/Sb in enargite varies by a factor of 2.5 with a 10 times higher median. Most of the 102 analyzed metal objects from Singen are Sb-Ni-rich, corresponding to the “antimony-nickel copper” of the literature. Other trace element concentrations vary by more than 100 times, and ratios by factors greater than 50. The Pb isotopic compositions are all significantly different from each other. They do not form a single linear array and require more than three ore batches that certainly do not derive from one single mining area. Our data suggest a heterogeneous provenance of “Singen Copper”. Archaeological information limits the scope to Central European sources. LIA requires a diverse supply network drawing on many mining localities, possibly including Brittany. Trace element ratios show more heterogeneity than LIA; this can be explained either by deliberate selection of one particular ore mineral (from very many sources) or by processing of assorted ore minerals from a smaller number of sources, with the unintentional effect that the quality of the copper would not be constant, as the metallurgical properties of alloys vary with trace element concentrations.
Abstract:
The quick identification of potentially threatening events is a crucial cognitive capacity for survival in a changing environment. Previous functional MRI data revealed the right dorsolateral prefrontal cortex and the region of the left intraparietal sulcus (IPS) to be involved in the perception of emotionally negative stimuli. To assess chronometric aspects of emotion processing, we applied transcranial magnetic stimulation above these areas at different times after negative and neutral picture presentation. Interference with emotion processing was found with transcranial magnetic stimulation above the dorsolateral prefrontal cortex at 200-300 ms and above the left intraparietal sulcus at 240/260 ms after negative stimuli. The data suggest a parallel and conjoint involvement of prefrontal and parietal areas in the identification of emotionally negative stimuli.
Abstract:
The Lasail mining area (Sultanate of Oman) was contaminated by acid mine drainage during the exploitation and processing of local and imported copper ore and the subsequent deposition of sulphide-bearing waste material into an unsealed tailings dump. In this arid environment, the use of seawater in the initial stages of ore processing caused saline contamination of the fresh groundwater downstream of the tailings dump. After detection of the contamination in the 1980s, different source-controlled remediation activities were conducted, including a seepage water collection system and, in 2005, surface sealing of the tailings dump with an HDPE liner to prevent further infiltration of meteoric water. We have been assessing the benefits of the remediation actions undertaken so far. We present chemical and isotopic (δ18O, δ2H, 3H) groundwater data from a long-term survey (8–16 years) of the Wadi Suq aquifer along a 28 km profile from the tailings dump to the Gulf of Oman. Over this period, most metal concentrations in the Wadi Suq groundwater decreased below detection limits. In addition, in the first boreholes downstream of the tailings pond, the salinity contamination has decreased by 30 % since 2005. This decrease appears to be related to the surface coverage of the tailings pond, which reduces flushing of the tailings by the sporadic, but commonly heavy, precipitation events. Despite generally low metal concentrations and the decreased salinity, groundwater quality still does not meet the WHO drinking water guidelines in more than 90 % of the Wadi Suq aquifer area. The observations show that under arid conditions, the use of seawater for ore processing or any other industrial activity has the potential to contaminate aquifers for decades.
Abstract:
While most healthy elderly are able to manage their everyday activities, studies have shown that there are both stable and declining abilities during healthy aging. For example, there is evidence that semantic memory processes which involve controlled retrieval mechanisms decline, whereas the automatic functioning of the semantic network remains intact. In contrast, patients with Alzheimer’s disease (AD) suffer from episodic and semantic memory impairments that impair their daily functioning. In AD, severe episodic as well as semantic memory deficits are observable. While the hallmark symptom of episodic memory decline in AD is well investigated, the underlying mechanisms of semantic memory deterioration remain unclear. By disentangling the semantic memory impairments in AD, the present thesis aimed to improve early diagnosis and to find a biomarker for dementia. To this end, a study on healthy aging and a study with dementia patients were conducted, investigating automatic and controlled semantic word retrieval. Besides the AD patients, a group of participants diagnosed with semantic dementia (SD), showing isolated semantic memory loss, was assessed. Automatic and controlled semantic word retrieval was measured with standard neuropsychological tests and by means of event-related potentials (ERPs) recorded during the performance of a semantic priming (SP) paradigm. Special focus was directed to the N400, or N400-LPC (late positive component) complex, an ERP component that is sensitive to semantic word retrieval. In both studies, data-driven topographical analyses were applied. Furthermore, in the patient study, the individual baseline cerebral blood flow (CBF) was combined with the N400 topography of each participant in order to relate altered functional electrophysiology to the pathophysiology of dementia. Results of the aging study revealed that automatic semantic word retrieval remains stable during healthy aging: the N400-LPC complex showed a topography comparable to that of the young participants. Both patient groups showed automatic SP to some extent, but strikingly, their ERP topographies were altered compared to healthy controls. Most importantly, the N400 was identified as a putative marker for dementia. In particular, the degree of topographical N400 similarity was shown to separate healthy elderly from demented patients. Furthermore, the marker was significantly related to baseline CBF reduction in brain areas relevant for semantic word retrieval. Summing up, the first major finding of the present thesis was that all groups showed semantic priming, but that the N400 topography differed significantly between healthy and demented elderly. The second major contribution was the identification of the N400 similarity as a putative marker for dementia. To conclude, the present thesis adds evidence of preserved automatic processing during healthy aging. Moreover, a possible marker is presented which might contribute to an improved diagnosis, and consequently to a more effective treatment of dementia, and which has to be developed further.
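The abstract does not specify how topographical N400 similarity was computed; the sketch below shows one conventional option, the global map dissimilarity between GFP-normalized scalp maps, assuming NumPy and identical electrode montages for the individual and the template map. It is an illustration, not the thesis' analysis pipeline.

```python
import numpy as np

def global_map_dissimilarity(map_a, map_b):
    """map_a, map_b: 1-D arrays of mean N400 amplitude per electrode (same montage)."""
    def normalize(m):
        m = np.asarray(m, dtype=float)
        m = m - m.mean()                      # re-reference to the average
        return m / np.sqrt((m ** 2).mean())   # divide by global field power
    a, b = normalize(map_a), normalize(map_b)
    # 0 = identical topographies, 2 = perfectly inverted topographies.
    return np.sqrt(((a - b) ** 2).mean())

def n400_similarity(individual_map, template_map):
    """Higher values indicate a topography closer to the healthy reference template."""
    return 2.0 - global_map_dissimilarity(individual_map, template_map)
```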