933 results for heterogeneous computation


Abstract:

Various applications for event detection, localization, and monitoring can benefit from the use of wireless sensor networks (WSNs). Wireless sensor networks are generally easy to deploy, have flexible topology, and can support a diversity of tasks thanks to the large variety of sensors that can be attached to the wireless sensor nodes. To guarantee the efficient operation of such a heterogeneous wireless sensor network during its lifetime, appropriate management is necessary. Typically, there are three management tasks, namely monitoring, (re)configuration, and code updating. On the one hand, status information, such as battery state and node connectivity, of both the wireless sensor network and the sensor nodes has to be monitored. On the other hand, sensor nodes have to be (re)configured, e.g., by setting the sensing interval. Most importantly, new applications have to be deployed and bug fixes have to be applied during the network lifetime. All management tasks have to be performed in a reliable, time- and energy-efficient manner. The ability to disseminate data from one sender to multiple receivers in a reliable, time- and energy-efficient manner is critical for the execution of the management tasks, especially for code updating. Multicast communication is an efficient way to handle such a traffic pattern in wireless sensor networks. Due to the nature of code updates, a multicast protocol has to support bulky traffic and end-to-end reliability. Further, the limited resources of wireless sensor nodes demand energy-efficient operation of the multicast protocol. Current data dissemination schemes do not fulfil all of the above requirements. To close this gap, we designed the Sensor Node Overlay Multicast (SNOMC) protocol to support reliable, time-efficient and energy-efficient dissemination of data from one sender node to multiple receiver nodes. 
In contrast to other multicast transport protocols, which do not support reliability mechanisms, SNOMC provides end-to-end reliability using a NACK-based mechanism. The mechanism is simple, easy to implement, and can significantly reduce the number of transmissions. It is complemented by a data acknowledgement after successful reception of all data fragments by the receiver nodes. SNOMC integrates three different caching strategies for efficient handling of the necessary retransmissions: caching on each intermediate node, caching on branching nodes, or caching only on the sender node. Moreover, an option was included to pro-actively request missing fragments. SNOMC was evaluated both in the OMNeT++ simulator and in our in-house real-world testbed, and compared to a number of common data dissemination protocols, such as Flooding, MPR, TinyCubus, PSFQ, and both UDP and TCP. The results showed that SNOMC outperforms the selected protocols in terms of transmission time, number of transmitted packets, and energy consumption. Moreover, we showed that SNOMC performs well with different underlying MAC protocols, which support different levels of reliability and energy-efficiency. Thus, SNOMC offers a robust, high-performing solution for the efficient distribution of code updates and management information in a wireless sensor network. To address the three management tasks, in this thesis we developed the Management Architecture for Wireless Sensor Networks (MARWIS). MARWIS is specifically designed for the management of heterogeneous wireless sensor networks. A distinguishing feature of its design is the use of wireless mesh nodes as a backbone, which enables diverse communication platforms and the offloading of functionality from the sensor nodes to the mesh nodes. This hierarchical architecture allows efficient operation of the management tasks, since the sensor nodes are organised into small sub-networks, each managed by a mesh node. 
Furthermore, we developed an intuitive graphical user interface, which allows non-expert users to easily perform management tasks in the network. In contrast to other management frameworks, such as Mate, MANNA, and TinyCubus, or code dissemination protocols, such as Impala, Trickle, and Deluge, MARWIS offers an integrated solution for monitoring, configuration, and code updating of sensor nodes. Integrating SNOMC into MARWIS further increases the efficiency of the management tasks. To our knowledge, our approach is the first to combine a management architecture with an efficient overlay multicast transport protocol. This combination of SNOMC and MARWIS supports the reliable, time- and energy-efficient operation of a heterogeneous wireless sensor network.
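The core of a NACK-based mechanism is that retransmissions scale with losses, not with the payload size. A minimal single-receiver sketch of this idea, using the "caching only on the sender node" strategy (the round structure, loss model, and all names are illustrative assumptions, not the SNOMC protocol itself):

```python
import random

def disseminate(fragments, loss_rate=0.3, seed=1):
    """NACK-style end-to-end reliability over a lossy link: after each round
    the receiver reports only the fragment ids it is still missing, and the
    sender retransmits those from its cache."""
    rng = random.Random(seed)
    cache = dict(enumerate(fragments))        # sender-side fragment cache
    received = {}
    tx = 0                                    # total transmissions used
    missing = set(cache)                      # ids the receiver still lacks
    while missing:                            # each iteration answers one NACK
        for fid in sorted(missing):
            tx += 1
            if rng.random() > loss_rate:      # fragment survived the link
                received[fid] = cache[fid]
        missing = set(cache) - set(received)  # contents of the next NACK
    return [received[i] for i in range(len(fragments))], tx
```

With caching on intermediate or branching nodes, the same NACKs would be answered closer to the receiver, so retransmitted fragments travel fewer hops at the cost of more caching state in the network.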

Abstract:

The spectacular images of Comet 103P/Hartley 2 recorded by the Medium Resolution Instrument (MRI) and High Resolution Instrument (HRI) on board the Extrasolar Planet Observation and Deep Impact Extended Investigation (EPOXI) spacecraft, during the Deep Impact extended mission, revealed that its bi-lobed, very active nucleus outgasses volatiles heterogeneously. Indeed, CO2 is the primary driver of activity, dragging chunks of pure ice out of the nucleus from the sub-solar lobe; these chunks appear to be the main source of water in Hartley 2's coma, sublimating slowly as they move away from the nucleus. However, water vapor is released by direct sublimation of the nucleus at the waist, without any significant amount of either CO2 or icy grains. The coma structure for a comet with such areas of diverse chemistry differs from the usual models, in which gases are produced homogeneously from the surface. We use the fully kinetic Direct Simulation Monte Carlo model of Tenishev et al. (Tenishev, V.M., Combi, M.R., Davidsson, B. [2008]. Astrophys. J. 685, 659-677; Tenishev, V.M., Combi, M.R., Rubin, M. [2011]. Astrophys. J. 732, 104-120), applied to Comet 103P/Hartley 2 and including sublimating icy grains, to reproduce the observations made by EPOXI and ground-based measurements. A realistic bi-lobed nucleus with a succession of active areas of different chemistry was included in the model, enabling us to study the coma of Hartley 2 in detail. The different gas production rates from each area were found by fitting spectra, computed using a line-by-line non-LTE radiative transfer model, to the HRI observations. The presence of icy grains with long lifetimes, which are pushed anti-sunward by radiation pressure, explains the observed OH asymmetry with enhancement on the night side of the coma.

Abstract:

Researchers suggest that personalization on the Semantic Web will eventually add up to a Web 3.0. In this Web, personalized agents, rather than humans, process and thus generate the biggest share of information. In the sense of emergent semantics, which supplements the traditional formal semantics of the Semantic Web, this is well conceivable. An emergent Semantic Web built on a fuzzy grassroots ontology can be accomplished by inducing knowledge from users' common parlance in mutual Web 2.0 interactions [1]. These ontologies can also be matched against existing Semantic Web ontologies to create comprehensive top-level ontologies. If augmented with information in the form of restrictions and associated reliability (Z-numbers) [2], this collection of fuzzy ontologies constitutes an important basis for an implementation of Zadeh's restriction-centered theory of reasoning and computation (RRC) [3] on the Web. By accounting for the real world's fuzziness, RRC differs from traditional approaches because it can handle restrictions described in natural language. A restriction is an answer to a question about the value of a variable, such as the duration of an appointment. In addition to mathematically well-defined answers, RRC can likewise deal with unprecisiated answers such as "about one hour." Inspired by mental functions, it constitutes an important basis for leveraging present-day Web efforts into a natural Web 3.0. Based on natural language information, RRC may be accomplished with Z-number calculation to achieve personalized Web reasoning and computation. Finally, Web agents that understand natural language can react to humans more intuitively, and thus generate and process information.
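As an illustration of such a restriction, a triangular membership function can encode "about one hour" over possible durations. The triangular shape and the parameters are assumptions made for this sketch; RRC itself prescribes no particular form:

```python
def about(center, spread):
    """Membership function for a fuzzy restriction such as 'about one hour':
    full membership at `center`, falling linearly to zero at +/- `spread`."""
    def mu(x):
        return max(0.0, 1.0 - abs(x - center) / spread)
    return mu

about_one_hour = about(60, 15)   # minutes
about_one_hour(60)               # full membership
about_one_hour(70)               # partial membership
about_one_hour(90)               # outside the restriction
```

A Z-number would pair such a restriction with a second fuzzy value expressing its reliability (e.g. "usually"), which is the piece of information the text above proposes to attach to the ontologies.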

Abstract:

AIMS In colorectal cancer (CRC), tumour buds represent an aggressive cell type at the invasive front with apparently low proliferation. The aim of this study was to determine proliferation and apoptotic rates of buds in comparison to tumour centre, front and mucosa. METHODS AND RESULTS Whole tissue sections from 188 CRC patients underwent immunohistochemistry for Ki67. Ten high-power fields (HPFs) were evaluated in mucosa, tumour centre, tumour front and tumour buds (total = 40 HPFs/case). Caspase-3 and M30 immunohistochemistry were performed on a multipunch tissue microarray from the same cohort. Ki67, caspase-3 and M30 immunoreactivity were correlated with outcome. The average percentage of cells showing Ki67 positivity was 5.2% in mucosa, and was not significantly different between the centre and front of the tumour (38.2% and 34.9%; P < 0.0001); 0.3% of buds showed Ki67 positivity (P < 0.0001). Caspase-3 expression was similar in mucosa, tumour centre and tumour front, but lower in tumour buds (<0.1%; P < 0.0001). M30 staining in buds was decreased (0.01%; P < 0.0001) in comparison to other areas. Ki67 positivity in buds was detrimental to survival in univariate (P = 0.0352) and multivariate (P = 0.0355) analysis. Caspase-3-positive tumours showed better outcome than negative tumours (P = 0.0262); but tumours with caspase-3-positive buds showed a worse outcome than those with caspase-3-negative buds (P = 0.0235). CONCLUSIONS Ki67, caspase-3 and M30 staining is absent in most tumour buds, suggesting decreased proliferation and apoptosis. However, the fact that Ki67 and caspase-3 immunoreactivity was associated with unfavourable prognosis points to a heterogeneous population of tumour buds.

Abstract:

The current state of health and biomedicine includes an enormous number of heterogeneous data "silos", collected for different purposes and represented differently, that are presently impossible to share or analyze in toto. The greatest challenge for large-scale and meaningful analyses of health-related data is to achieve a uniform data representation for data extracted from heterogeneous source representations. Based upon an analysis and categorization of heterogeneities, a process for achieving comparable data content by using a uniform terminological representation is developed. This process addresses the types of representational heterogeneities that commonly arise in healthcare data integration problems. Specifically, it uses a reference terminology and associated "maps" to transform heterogeneous data into a standard representation for comparability and secondary use. Capturing the quality and precision of the "maps" between local terms and reference terminology concepts enhances the meaning of the aggregated data, empowering end users with better-informed queries for subsequent analyses. A data integration case study in the domain of pediatric asthma illustrates the development and use of a reference terminology for creating comparable data from heterogeneous source representations. The contribution of this research is a generalized process for the integration of data from heterogeneous source representations, which can be applied and extended to other problems where heterogeneous data need to be merged.
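The term-to-reference mapping with a captured quality attribute can be sketched as a lookup table plus a quality filter. All source names, local terms, concept identifiers, and quality labels below are hypothetical, invented for illustration:

```python
# Hypothetical local terms mapped to reference-terminology concepts;
# each map records its quality so queries can filter on precision.
MAPS = {
    ("site_a", "asthma, unspecified"):     ("REF:ASTHMA", "exact"),
    ("site_b", "reactive airway disease"): ("REF:ASTHMA", "approximate"),
    ("site_a", "wheeze"):                  ("REF:WHEEZE", "exact"),
}
QUALITY_RANK = {"approximate": 0, "exact": 1}

def to_reference(source, term, min_quality="approximate"):
    """Translate a local term to its reference concept, or return None when
    no map of sufficient quality exists for this source."""
    entry = MAPS.get((source, term.lower()))
    if entry is None:
        return None
    concept, quality = entry
    if QUALITY_RANK[quality] < QUALITY_RANK[min_quality]:
        return None
    return concept
```

Raising `min_quality` in a query trades coverage for precision, which is exactly the better-informed choice the captured map quality is meant to enable.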

Abstract:

Water flow and solute transport through soils are strongly influenced by the spatial arrangement of soil materials with different hydraulic and chemical properties. Knowing the specific or statistical arrangement of these materials is considered a key to improved predictions of solute transport. Our aim was to obtain two-dimensional material maps from photographs of exposed profiles. We developed a segmentation and classification procedure and applied it to images of a very heterogeneous sand tank, which was used for a series of flow and transport experiments. The segmentation was based on thresholds of soil color, estimated from local median gray values, and of soil texture, estimated from local coefficients of variation of gray values. Important steps were the correction of inhomogeneous illumination and reflection, and the incorporation of prior knowledge into the filters used to extract the image features and to smooth the results morphologically. We could check and confirm the success of our mapping by comparing the estimated sand distribution with the designed distribution in the tank. The resulting material map was later used as input to model flow and transport through the sand tank. Similar segmentation procedures may be applied to any high-density raster data, including photographs or spectral scans of field profiles.
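The two-threshold rule can be sketched block-wise: the local median gray value stands in for soil colour and the local coefficient of variation for texture. The block size and threshold values below are arbitrary placeholders, not the calibrated values used for the tank, and the sketch omits the illumination correction and morphological smoothing steps:

```python
import numpy as np

def classify_blocks(img, block=8, gray_thresh=0.5, cv_thresh=0.2):
    """Assign each block one of four material classes from its median gray
    value (colour proxy) and coefficient of variation (texture proxy)."""
    rows, cols = img.shape[0] // block, img.shape[1] // block
    labels = np.zeros((rows, cols), dtype=int)
    for i in range(rows):
        for j in range(cols):
            tile = img[i*block:(i+1)*block, j*block:(j+1)*block]
            bright = np.median(tile) > gray_thresh
            m = tile.mean()
            textured = (tile.std() / m > cv_thresh) if m > 0 else False
            labels[i, j] = 2 * int(bright) + int(textured)
    return labels  # 0 dark/smooth, 1 dark/textured, 2 bright/smooth, 3 bright/textured
```

A sliding-window version of the same statistics, followed by morphological opening/closing of the label map, moves this sketch closer to the procedure described above.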

Abstract:

The responses of carbon dioxide (CO2) and other climate variables to an emission pulse of CO2 into the atmosphere are often used to compute the Global Warming Potential (GWP) and Global Temperature change Potential (GTP), to characterize the response timescales of Earth System models, and to build reduced-form models. In this carbon cycle-climate model intercomparison project, which spans the full model hierarchy, we quantify responses to emission pulses of different magnitudes injected under different conditions. The CO2 response shows the known rapid decline in the first few decades, followed by a millennium-scale tail. For a 100 Gt-C emission pulse added to a constant CO2 concentration of 389 ppm, 25 ± 9% is still found in the atmosphere after 1000 yr; the ocean has absorbed 59 ± 12% and the land the remainder (16 ± 14%). The response in global mean surface air temperature is an increase by 0.20 ± 0.12 °C within the first twenty years; thereafter and until year 1000, temperature decreases only slightly, whereas ocean heat content and sea level continue to rise. Our best estimate for the Absolute Global Warming Potential, given by the time-integrated response in CO2 at year 100 multiplied by its radiative efficiency, is 92.5 × 10⁻¹⁵ yr W m⁻² per kg-CO2. This value very likely (5 to 95% confidence) lies within the range of (68 to 117) × 10⁻¹⁵ yr W m⁻² per kg-CO2. Estimates for the time-integrated response in CO2 published in the IPCC First, Second, and Fourth Assessment Reports and our multi-model best estimate all agree within 15% during the first 100 yr. The integrated CO2 response, normalized by the pulse size, is lower for pre-industrial conditions than for present day, and lower for smaller pulses than for larger pulses. In contrast, the response in temperature, sea level and ocean heat content is less sensitive to these choices. 
Although choices in pulse size, background concentration, and model lead to uncertainties, the most important and subjective choice in determining the AGWP of CO2, and hence the GWP, is the time horizon.
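The AGWP arithmetic can be sketched with a sum-of-exponentials impulse response function: a fraction of the pulse stays airborne indefinitely while the rest decays on a few timescales, and the AGWP is the radiative efficiency times the time-integrated airborne fraction. The coefficients below follow a common parameterization of the multi-model mean CO2 response, and the per-kg radiative efficiency is an assumed constant; treat both as illustrative rather than the paper's exact fit:

```python
import math

# Illustrative impulse-response fit: fraction A0 stays airborne indefinitely,
# the remaining fractions decay with timescales tau_i (years).
A0 = 0.2173
TERMS = [(0.2240, 394.4), (0.2824, 36.54), (0.2763, 4.304)]
RAD_EFF = 1.77e-15   # W m^-2 per kg of CO2 in the atmosphere (assumed)

def integrated_irf(horizon):
    """Time-integrated airborne fraction (yr) out to `horizon` years."""
    total = A0 * horizon
    for a, tau in TERMS:
        total += a * tau * (1.0 - math.exp(-horizon / tau))
    return total

def agwp(horizon=100):
    """Absolute Global Warming Potential: radiative efficiency times the
    time-integrated CO2 response."""
    return RAD_EFF * integrated_irf(horizon)
```

With these illustrative numbers, agwp(100) comes out near 9.3 × 10⁻¹⁴ yr W m⁻² per kg, close to the best estimate quoted above; varying `horizon` makes the time-horizon sensitivity discussed in the text directly visible.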

Abstract:

The Radiological Physics Center (RPC) provides heterogeneous phantoms that are used to evaluate radiation treatment procedures as part of a comprehensive quality assurance program for institutions participating in clinical trials. It was hypothesized that the existing RPC heterogeneous thorax phantom could be modified to assess lung tumor proton beam therapy procedures involving patient simulation, treatment planning, and treatment delivery, and could confirm agreement between the measured and calculated dose within 5%/3mm with a reproducibility of 5%. The Hounsfield Units (HU) of lung-equivalent materials (balsa wood and cork) were measured using a CT scanner, and the relative linear stopping power (RLSP) of these materials was measured. The linear energy transfer (LET) response of Gafchromic EBT2 film was analyzed in parallel and perpendicular orientations in a water tank and compared to ion chamber readings. Both orientations displayed a quenching effect, underreading relative to the ion chamber: on average 31% for the parallel orientation and 15% for the perpendicular orientation. Two treatment plans were created that delivered the prescribed dose to the target volume while achieving low entrance doses. Both treatment plans were designed using smeared compensators and expanded apertures, as would be used for a patient in the clinic. Plan 1a contained two beams set to orthogonal angles with a zero-degree couch kick. Plan 1b used two beams set to 10 and 80 degrees with a 15-degree couch kick. EBT2 film and TLD were inserted and the phantom was irradiated three times for each plan. Both plans passed the criteria for the TLD measurements, with TLD values within 7% of the dose calculated by Eclipse. Using the 5%/3mm criteria, the three-trial average overall pass rate was 71% for Plan 1a and 76% for Plan 1b. 
The trials were then analyzed using the RPC's conventional lung treatment criteria set forth by the RTOG: 5%/5mm and an overall pass rate of 85%. Under these criteria, only Plan 1b passed for all three trials, with an average overall pass rate of 89%.
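A criterion like 5%/3mm combines a dose-difference test with a distance-to-agreement test. A simplified one-dimensional sketch of that combination (a full gamma analysis interpolates in 2D/3D and normalizes differently; everything here is an illustrative reduction):

```python
import numpy as np

def pass_rate(measured, calculated, dx_mm, dose_tol=0.05, dist_tol_mm=3.0):
    """Fraction of profile points where the calculated dose matches the
    measured dose within dose_tol (relative), either at the same point or
    at some point within dist_tol_mm along the profile."""
    reach = int(dist_tol_mm / dx_mm)        # search radius in samples
    passed = 0
    for i, m in enumerate(measured):
        lo, hi = max(0, i - reach), min(len(calculated), i + reach + 1)
        if np.min(np.abs(calculated[lo:hi] - m)) <= dose_tol * m:
            passed += 1
    return passed / len(measured)
```

Loosening the criteria to 5%/5mm, as in the RTOG analysis above, widens the search radius, which is why the same measurements can fail the tighter test and pass the looser one.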

Abstract:

In the present study the challenge of analyzing complex micro X-ray diffraction (microXRD) patterns from cement–clay interfaces has been addressed. In order to extract the maximum information concerning both the spatial distribution and the crystal structure type associated with each of the many diffracting grains in heterogeneous, polycrystalline samples, an approach has been developed in which microXRD was applied to thin sections which were rotated in the X-ray beam. The data analysis, performed on microXRD patterns collected from a filled vein of a cement–clay interface from the natural analogue in Maqarin (Jordan), and a sample from a two-year-old altered interface between cement and argillaceous rock, demonstrate the potential of this method.

Abstract:

PURPOSE: Dasatinib is a dual Src/Abl inhibitor recently approved for Bcr-Abl+ leukemias with resistance or intolerance to prior therapy. Because Src kinases contribute to multiple blood cell functions by triggering a variety of signaling pathways, we hypothesized that their molecular targeting might lead to growth inhibition in acute myeloid leukemia (AML). EXPERIMENTAL DESIGN: We studied growth factor-dependent and growth factor-independent leukemic cell lines, including three cell lines expressing mutants of receptor tyrosine kinases (Flt3 or c-Kit), as well as primary AML blasts, for responsiveness to dasatinib. RESULTS: Dasatinib inhibited Src family kinases in all cell lines and blast cells at approximately 1 × 10⁻⁹ mol/L. It also inhibited mutant Flt3 or Kit tyrosine phosphorylation at approximately 1 × 10⁻⁶ mol/L. Mo7e cells expressing the activating mutation (codon 816) of c-Kit were most sensitive to growth inhibition, with a GI50 of 5 × 10⁻⁹ mol/L. Primary AML blast cells exhibited growth inhibition at <1 × 10⁻⁶ mol/L. Cell lines that showed growth inhibition at approximately 1 × 10⁻⁶ mol/L showed a G1 cell cycle arrest that correlated with accumulation of p21 and p27 protein. The addition of rapamycin or cytotoxic agents enhanced growth inhibition. Dasatinib also caused apoptosis of Mo7e cells expressing oncogenic Kit. CONCLUSIONS: Although not all of the precise targets for dasatinib are known, this multikinase inhibitor causes either growth arrest or apoptosis in molecularly heterogeneous AML. The addition of cytotoxic or targeted agents can enhance its effects.
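A GI50 such as the 5 × 10⁻⁹ mol/L above is read off a dose-response curve as the concentration giving 50% growth inhibition. A minimal log-linear interpolation sketch (the measurement points in the test are invented for illustration, not data from this study):

```python
import math

def gi50(concs, growth):
    """Concentration giving 50% growth inhibition, by log-linear
    interpolation between the two bracketing measurements. `growth` is the
    fraction of untreated-control growth (1.0 = uninhibited); `concs` must
    be ascending."""
    pts = list(zip(concs, growth))
    for (c0, g0), (c1, g1) in zip(pts, pts[1:]):
        if g0 >= 0.5 >= g1:
            f = (g0 - 0.5) / (g0 - g1)   # position between the two points
            return 10 ** (math.log10(c0) + f * (math.log10(c1) - math.log10(c0)))
    return None  # 50% inhibition never reached in the tested range
```

Interpolating in log-concentration rather than linear concentration reflects the decade-spaced dilution series typical of such assays.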

Abstract:

There is great demand for easily accessible, user-friendly dietary self-management applications. Yet accurate, fully automatic estimation of nutritional intake using computer vision methods remains an open research problem. One key element of this problem is volume estimation, which can be computed from 3D models obtained using multi-view geometry. This paper presents a computational system for volume estimation based on the processing of two meal images. A 3D model of the served meal is reconstructed from the acquired images, and the volume is computed from the reconstructed shape. The algorithm was tested on food models (dummy foods) with known volume and on real served food. Volume accuracy was on the order of 90%, while the total execution time was below 15 seconds per image pair. The proposed system combines simple, computationally affordable methods for 3D reconstruction; it remained stable throughout the experiments, operates in near real time, and places minimum constraints on users.
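Once a two-view reconstruction yields a height map of the meal above the plate plane, the volume step reduces to integrating height over the pixel footprint. The reconstruction itself is outside this sketch, and the sampling numbers are illustrative:

```python
import numpy as np

def volume_from_heightmap(heights_cm, pixel_area_cm2):
    """Integrate a per-pixel height map (cm) over the pixel footprint (cm^2);
    negative heights (below the plate plane) are clipped out."""
    return float(np.sum(np.clip(heights_cm, 0.0, None)) * pixel_area_cm2)

# Illustrative: a 10 cm x 10 cm region sampled at 1 mm, uniformly 2 cm high.
h = np.full((100, 100), 2.0)
v = volume_from_heightmap(h, pixel_area_cm2=0.01)   # 200 cm^3
```

The accuracy of this step is bounded by the reconstruction: errors in the recovered plate plane or scale propagate linearly into the volume, which is consistent with the roughly 90% accuracy reported above.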