114 results for granularity


Relevance:

10.00%

Publisher:

Abstract:

Automatic generation of classification rules has been an increasingly popular technique in commercial applications such as Big Data analytics, rule-based expert systems and decision-making systems. However, a principal problem that arises with most methods for the generation of classification rules is overfitting of the training data. When Big Data is dealt with, this may result in the generation of a large number of complex rules. This may not only increase computational cost but also lower the accuracy in predicting further unseen instances. This has led to the necessity of developing pruning methods for the simplification of rules. In addition, classification rules are used to make predictions after their generation is complete. As far as efficiency is concerned, it is desirable to find the first rule that fires as soon as possible by searching through a rule set. Thus a suitable structure is required to represent the rule set effectively. In this chapter, the authors introduce a unified framework for the construction of rule-based classification systems consisting of three operations on Big Data: rule generation, rule simplification and rule representation. The authors also review some existing methods and techniques used for each of the three operations and highlight their limitations. They introduce some novel methods and techniques they have developed recently. These methods and techniques are also discussed in comparison to existing ones with respect to efficient processing of Big Data.
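To make the rule-representation concern concrete, the following is a minimal Python sketch, not the chapter's actual method: rules are stored as an ordered set indexed by the first attribute they test, so that prediction returns the first rule that fires without scanning rules that cannot match. The Rule/RuleSet classes and the example rules are illustrative assumptions.

# Minimal sketch of a rule set structured for fast first-fire lookup;
# illustrative only, not the authors' representation.

class Rule:
    def __init__(self, conditions, label):
        # conditions: dict mapping attribute name -> required value
        self.conditions = conditions
        self.label = label

    def fires(self, instance):
        return all(instance.get(a) == v for a, v in self.conditions.items())

class RuleSet:
    def __init__(self, rules):
        self.rules = rules
        # Index each rule by the first attribute it tests, so prediction
        # only scans rules that could possibly fire on this instance.
        self.index = {}
        for r in rules:
            key = next(iter(r.conditions))
            self.index.setdefault(key, []).append(r)

    def predict(self, instance, default="unknown"):
        for attr in instance:
            for rule in self.index.get(attr, []):
                if rule.fires(instance):
                    return rule.label   # first rule that fires wins
        return default

rules = RuleSet([Rule({"outlook": "sunny", "humidity": "high"}, "no"),
                 Rule({"outlook": "overcast"}, "yes")])
print(rules.predict({"outlook": "overcast", "humidity": "low"}))  # yes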

Relevance:

10.00%

Publisher:

Abstract:

This paper addresses the issue of activity understanding from video and its semantics-rich description. A novel approach is presented where activities are characterised and analysed at different resolutions. Semantic information is delivered according to the resolution at which the activity is observed. Furthermore, the multiresolution activity characterisation is exploited to detect abnormal activity. To achieve these system capabilities, the focus is given on context modelling by employing a soft computing-based algorithm which automatically enables the determination of the main activity zones of the observed scene by taking as input the trajectories of detected mobiles. Such areas are learnt at different resolutions (or granularities). In a second stage, learned zones are employed to extract people activities by relating mobile trajectories to the learned zones. In this way, the activity of a person can be summarised as the series of zones that the person has visited. Employing the inherent soft relation properties, the reported activities can be labelled with meaningful semantics. Depending on the granularity at which activity zones and mobile trajectories are considered, the semantic meaning of the activity shifts from broad interpretation to detailed description.Activity information at different resolutions is also employed to perform abnormal activity detection.
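The core summarisation step can be sketched in a few lines of Python, under the assumption that learned zones are represented by centroids (the paper's zones come from a soft-computing algorithm; the zone names, coordinates and trajectory below are invented). Each trajectory point is mapped to its zone, and consecutive repeats are collapsed into the series of zones visited; swapping in a coarser zone set shifts the description from detailed to broad.

# Minimal sketch of summarising a trajectory as a sequence of visited zones.
import math

def nearest_zone(point, zones):
    # zones: dict mapping zone label -> (x, y) centroid
    return min(zones, key=lambda z: math.dist(point, zones[z]))

def summarise(trajectory, zones):
    # Map each point to its zone, collapsing consecutive repeats, so the
    # activity reads as the series of zones the person has visited.
    sequence = []
    for point in trajectory:
        z = nearest_zone(point, zones)
        if not sequence or sequence[-1] != z:
            sequence.append(z)
    return sequence

fine   = {"entrance": (0, 0), "desk": (5, 1), "printer": (5, 4)}
coarse = {"lobby": (0, 0), "office": (5, 2)}   # coarser granularity
track  = [(0, 1), (2, 1), (5, 1), (5, 3), (5, 4)]
print(summarise(track, fine))    # ['entrance', 'desk', 'printer']
print(summarise(track, coarse))  # ['lobby', 'office']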

Relevance:

10.00%

Publisher:

Abstract:

This paper presents, as a set of guidelines, how to apply the conservative distributed simulation paradigm (the CMB protocol) to develop efficient applications. Using these guidelines, even a user with little experience in distributed simulation and computer architecture can obtain good performance from distributed simulations that use conservative synchronization protocols for parallel processes. The set of guidelines focuses on a specific application domain, the performance evaluation of computer systems, considering models with coarse granularity and few logical processes, running over two platforms: parallel (a high-performance communication environment) and distributed (a low-performance communication environment).
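The conservative rule underlying the CMB protocol can be illustrated with a minimal Python sketch, under stated assumptions rather than as the paper's implementation: a logical process may only advance its clock to the minimum timestamp over its input channels, and it emits null messages carrying clock plus lookahead so that its neighbours do not block. Class and channel names are hypothetical.

# Minimal sketch of the conservative (CMB) synchronization rule.
from collections import deque

class LogicalProcess:
    def __init__(self, name, lookahead):
        self.name = name
        self.lookahead = lookahead
        self.clock = 0.0
        self.inputs = {}          # channel name -> deque of (timestamp, event)

    def channel_clock(self, ch):
        q = self.inputs[ch]
        return q[0][0] if q else float("-inf")   # empty channel blocks the LP

    def safe_time(self):
        # Events with timestamps up to this value can be processed
        # without risking a causality violation.
        return min(self.channel_clock(ch) for ch in self.inputs)

    def null_message(self):
        # Promise to neighbours: no future event earlier than clock + lookahead.
        return (self.clock + self.lookahead, None)

lp = LogicalProcess("cpu", lookahead=0.5)
lp.inputs["disk"] = deque([(1.0, "io_done")])
lp.inputs["net"]  = deque([(2.0, "packet")])
print(lp.safe_time())      # 1.0 -> events with t <= 1.0 are safe
print(lp.null_message())   # (0.5, None)

The coarse granularity and few logical processes that the guidelines assume matter precisely here: with few channels and large lookahead, null-message traffic stays low even on a low-performance (distributed) communication platform.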

Relevance:

10.00%

Publisher:

Abstract:

Nowadays, the importance of using software processes is well consolidated and is considered fundamental to the success of software development projects. Large and medium software projects demand the definition and continuous improvement of software processes in order to promote the productive development of high-quality software. Customizing and evolving existing software processes to address the variety of scenarios, technologies, cultures and scales is a recurrent challenge in the software industry. It involves the adaptation of software process models to the reality of projects, and it must also promote the reuse of past experiences in the definition and development of software processes for new projects. Adequate management and execution of software processes can bring better quality and productivity to the produced software systems. This work explores the use and adaptation of consolidated software product line techniques to manage the variabilities of software process families. To achieve this aim: (i) a systematic literature review is conducted to identify and characterize variability management approaches for software processes; (ii) an annotative approach for the variability management of software process lines is proposed and developed; and finally (iii) empirical studies and a controlled experiment assess and compare the proposed annotative approach against a compositional one. The first study, a comparative qualitative study, analyzed the annotative and compositional approaches from different perspectives, such as modularity, traceability, error detection, granularity, uniformity, adoption, and systematic variability management. The second study, a comparative quantitative study, considered internal attributes of the specification of software process lines, such as modularity, size and complexity. Finally, a controlled experiment evaluated the usage effort and understandability of the investigated approaches when modeling and evolving specifications of software process lines. The studies bring evidence of several benefits of the annotative approach, and of its potential for integration with the compositional approach, to assist the variability management of software process lines.
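The annotative idea can be made concrete with a minimal Python sketch (not the thesis' actual notation): every activity of the process line carries a presence condition over features, and a concrete process is derived by keeping the activities whose condition holds for the selected feature set. The activity names and features below are invented for illustration.

# Minimal sketch of annotative variability management for a process line.

process_line = [
    ("elicit requirements", lambda f: True),            # mandatory activity
    ("write test plan",     lambda f: "tdd" in f),      # variant-specific
    ("formal review",       lambda f: "regulated" in f),
    ("deploy continuously", lambda f: "ci" in f and "regulated" not in f),
]

def derive(features):
    # Keep the activities whose presence condition is satisfied.
    return [name for name, condition in process_line if condition(features)]

print(derive({"tdd", "ci"}))     # ['elicit requirements', 'write test plan', 'deploy continuously']
print(derive({"regulated"}))     # ['elicit requirements', 'formal review']

A compositional approach would instead keep each variant's activities in separate modules and assemble them; the trade-offs between the two are exactly the modularity, traceability and granularity perspectives compared in the studies.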

Relevance:

10.00%

Publisher:

Abstract:

The study, carried out in the environment of the Maracajaú reef and the São Roque channel, had as its main objective to analyze the characteristics of the locally active sediments as expressed in the grains, through field collection of sediments, technical processing and data analysis. Data processing addressed three main aspects: biotic composition, calcium carbonate concentration and particle size of the sediment. Differences between the sediments of the reefs and of the channel were observed. The contribution of calcareous algae to carbonate production stood out, with some influence of foraminifera near the coast. The particle-size distribution yielded significant results for the understanding of the local sedimentary deposits. The results showed a carbonate environment, with a predominance of calcareous algae, associated with unconsolidated sediments of coarse grain size, besides the presence of rhodoliths in all samples. The fragmentation of biotic components and the prevalence of elliptical rhodoliths with little or no branching indicate an environment of high hydrodynamic energy. This work is a further contribution to the understanding of active local sedimentology in reef environments, in particular that of the Maracajaú reef, by virtue of its complex ecosystem composed of a diversity of wild fauna and flora still little studied in Brazil, in contrast with the accelerated growth of extraction and use of natural resources, which often causes irreversible impacts on the environment.
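The particle-size side of such an analysis reduces to a simple computation, sketched below in Python with invented sieve data (the study's actual openings and weights are not given): sieve fractions are converted to weight percentages per Wentworth class and the sample labelled by its dominant class.

# Minimal sketch of a grain-size summary from sieve fractions.

WENTWORTH = [(2.0, "gravel"), (0.5, "coarse sand"),
             (0.25, "medium sand"), (0.0625, "fine sand"), (0.0, "mud")]

def classify(opening_mm):
    # Return the Wentworth class for a sieve opening (lower bounds in mm).
    for lower_bound, label in WENTWORTH:
        if opening_mm >= lower_bound:
            return label
    return "mud"

def grain_size_summary(fractions):
    # fractions: dict of sieve opening (mm) -> retained weight (g)
    total = sum(fractions.values())
    percents = {}
    for opening, weight in fractions.items():
        label = classify(opening)
        percents[label] = percents.get(label, 0) + 100 * weight / total
    dominant = max(percents, key=percents.get)
    return percents, dominant

sample = {4.0: 12.0, 1.0: 55.0, 0.3: 20.0, 0.1: 10.0, 0.05: 3.0}
print(grain_size_summary(sample))  # dominated by coarse sand, as in the study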

Relevance:

10.00%

Publisher:

Abstract:

Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)

Relevance:

10.00%

Publisher:

Abstract:

Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)

Relevance:

10.00%

Publisher:

Abstract:

Schistosomiasis is one of the most important parasitic infections in humans, occurring in many tropical and subtropical countries. Currently, the control of schistosomiasis rests on a single drug, praziquantel, which is effective against adult worms but not the larval stages. Recent studies have shown that piplartine, an amide isolated from plants of the genus Piper (Piperaceae), has interesting antischistosomal properties against Schistosoma mansoni adult worms. Here, we report the in vitro antischistosomal activity of piplartine on S. mansoni schistosomula of different ages (3 h old and 1, 3, 5, and 7 days old), and examine alterations of the tegumental surface of worms by means of confocal laser scanning microscopy. Piplartine at a concentration of 7.5 µM caused the death of all schistosomula within 120 h. The lethal effect occurred in a dose-dependent manner and was also dependent on the age of the parasite. Microscopy observation revealed extensive tegumental destruction, including blebbing, granularity, and a shorter body length. This report provides the first evidence that piplartine is able to kill schistosomula of different ages and reinforces that piplartine is a promising compound for the development of new schistosomicidal agents.

Relevance:

10.00%

Publisher:

Abstract:

The increasing precision of current and future experiments in high-energy physics requires a likewise increase in the accuracy of the calculation of theoretical predictions, in order to find evidence for possible deviations from the generally accepted Standard Model of elementary particles and interactions. Calculating the experimentally measurable cross sections of scattering and decay processes to a higher accuracy directly translates into including higher order radiative corrections in the calculation. The large number of particles and interactions in the full Standard Model results in an exponentially growing number of Feynman diagrams contributing to any given process in higher orders. Additionally, the appearance of multiple independent mass scales makes even the calculation of single diagrams non-trivial. For over two decades now, the only way to cope with these issues has been to rely on the assistance of computers. The aim of the xloops project is to provide the necessary tools to automate the calculation procedures as far as possible, including the generation of the contributing diagrams and the evaluation of the resulting Feynman integrals. The latter is based on the techniques developed in Mainz for solving one- and two-loop diagrams in a general and systematic way using parallel/orthogonal space methods. These techniques involve a considerable amount of symbolic computations. During the development of xloops it was found that conventional computer algebra systems were not a suitable implementation environment. For this reason, a new system called GiNaC has been created, which allows the development of large-scale symbolic applications in an object-oriented fashion within the C++ programming language. This system, which is now also in use for other projects besides xloops, is the main focus of this thesis. The implementation of GiNaC as a C++ library sets it apart from other algebraic systems. Our results prove that a highly efficient symbolic manipulator can be designed in an object-oriented way, and that having a very fine granularity of objects is also feasible. The xloops-related parts of this work consist of a new implementation, based on GiNaC, of functions for calculating one-loop Feynman integrals that already existed in the original xloops program, as well as the addition of supplementary modules belonging to the interface between the library of integral functions and the diagram generator.
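The design idea of a fine granularity of objects can be sketched without GiNaC itself: every node of an expression tree is a small object that knows how to differentiate and print itself. The Python sketch below only transliterates that concept for illustration; it is emphatically not GiNaC's C++ API, and all class names are invented.

# Minimal sketch of a fine-grained object-oriented symbolic manipulator.

class Expr:
    def __add__(self, other): return Add(self, other)
    def __mul__(self, other): return Mul(self, other)

class Num(Expr):
    def __init__(self, value): self.value = value
    def diff(self, var): return Num(0)
    def __str__(self): return str(self.value)

class Sym(Expr):
    def __init__(self, name): self.name = name
    def diff(self, var): return Num(1 if self is var else 0)
    def __str__(self): return self.name

class Add(Expr):
    def __init__(self, a, b): self.a, self.b = a, b
    def diff(self, var): return Add(self.a.diff(var), self.b.diff(var))
    def __str__(self): return f"({self.a} + {self.b})"

class Mul(Expr):
    def __init__(self, a, b): self.a, self.b = a, b
    def diff(self, var):   # product rule
        return Add(Mul(self.a.diff(var), self.b), Mul(self.a, self.b.diff(var)))
    def __str__(self): return f"({self.a} * {self.b})"

x = Sym("x")
expr = x * (x + Num(2))
print(expr.diff(x))   # ((1 * (x + 2)) + (x * (1 + 0)))

The efficiency claim of the thesis is that this kind of many-small-objects design, which looks costly, can be made highly efficient in C++; the sketch only conveys the structure, not the performance engineering.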

Relevance:

10.00%

Publisher:

Abstract:

Complex network analysis has turned out to be a very promising field of research, as testified by the many research projects and works that span different fields. Such analyses have usually focused on characterizing a single aspect of the system, and a study that considers several informative axes along which a network evolves has been lacking. We propose a new multidimensional analysis that is able to inspect networks in the two most important dimensions, space and time. To achieve this goal, we studied each dimension separately and investigated how the variation of the constituting parameters drives changes in the network as a whole. Focusing on the space dimension, we characterized spatial alteration in terms of abstraction levels. We propose a novel algorithm that, by applying a fuzziness function, can reconstruct networks at different levels of detail. We verified that statistical indicators depend strongly on the granularity with which a system is described and on the class of networks. Keeping the space axis fixed, we then isolated the dynamics behind the network evolution process. We detected new instincts that trigger social network utilization and spread the adoption of novel communities. We formalized this enhanced social network evolution by adopting special nodes (called sirens) that, thanks to their ability to attract new links, are able to construct efficient connection patterns. We simulated the dynamics of the system by considering three well-known growth models. Applying this framework to real and synthetic networks, we showed that the sirens, even when used for a limited time span, effectively shrink the time needed to bring a network to a mature state. In order to provide a concrete context for our findings, we formalized the cost of setting up such an enhancement and provide the best combinations of system parameters, such as the number of sirens, the time span of utilization and the attractiveness.
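A minimal Python sketch can convey the space-dimension idea, using grid snapping as a stand-in for the thesis' fuzziness function (the actual algorithm is not reproduced here, and all coordinates and edges are invented): nodes are merged into cells of the chosen granularity, edges are merged accordingly, and the resulting statistics change with the level of detail.

# Minimal sketch of reconstructing a spatial network at coarser granularity.

def coarse_grain(positions, edges, cell):
    # positions: node -> (x, y); edges: set of (u, v); cell: granularity
    group = {n: (int(x // cell), int(y // cell)) for n, (x, y) in positions.items()}
    merged = {(group[u], group[v]) for u, v in edges if group[u] != group[v]}
    return set(group.values()), merged

positions = {"a": (0.2, 0.1), "b": (0.8, 0.4), "c": (3.1, 3.3), "d": (3.4, 3.9)}
edges = {("a", "b"), ("b", "c"), ("c", "d")}
for cell in (0.5, 2.0):   # finer vs coarser description of the same system
    nodes, links = coarse_grain(positions, edges, cell)
    print(cell, len(nodes), len(links))  # node/edge counts depend on granularity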

Relevance:

10.00%

Publisher:

Abstract:

Thermal effects are rapidly gaining importance in nanometer heterogeneous integrated systems. Increased power density, coupled with spatio-temporal variability of chip workload, causes lateral and vertical temperature non-uniformities (variations) in the chip structure. The assumption of a uniform temperature for a large circuit leads to inaccurate determination of key design parameters. To improve design quality, we need precise estimation of temperature at detailed spatial resolution, which is very computationally intensive. Consequently, thermal analysis of designs needs to be done at multiple levels of granularity. To further investigate the flow of chip/package thermal analysis, we exploit the Intel Single Chip Cloud Computer (SCC) and propose a methodology for calibrating the SCC on-die temperature sensors. We also develop an infrastructure for online monitoring of the SCC temperature sensor readings and the SCC power consumption. With the thermal simulation tool in hand, we propose MiMAPT, an approach for analyzing delay, power and temperature in digital integrated circuits. MiMAPT integrates seamlessly into industrial front-end and back-end chip design flows. It accounts for temperature non-uniformities and self-heating while performing analysis. Furthermore, we extend the temperature-variation-aware analysis of designs to 3D MPSoCs with Wide-I/O DRAM. We reduce the DRAM refresh power by considering the lateral and vertical temperature variations in the 3D structure and adapting the per-DRAM-bank refresh period accordingly. We develop an advanced virtual platform which models the performance, power, and thermal behavior of a 3D-integrated MPSoC with Wide-I/O DRAMs in detail. Moving towards real-world multi-core heterogeneous SoC designs, a reconfigurable heterogeneous platform (ZYNQ) is exploited to further study the performance and energy efficiency of various CPU-accelerator data sharing methods in heterogeneous hardware architectures. A complete hardware accelerator featuring clusters of OpenRISC CPUs, with dynamic address remapping capability, is built and verified on real hardware.
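The per-bank refresh adaptation can be sketched as follows in Python. The 64 ms base retention and the rough halving of retention above 85 °C follow common DRAM datasheet practice; the per-bank temperatures are invented, and this is a sketch of the general policy, not the thesis' exact algorithm.

# Minimal sketch of temperature-adaptive per-bank DRAM refresh.

BASE_RETENTION_MS = 64.0    # retention time typically guaranteed at <= 85 C

def refresh_period_ms(temp_c):
    # Assume retention roughly halves per 10 C step above 85 C (conservative).
    if temp_c <= 85.0:
        return BASE_RETENTION_MS
    steps = (temp_c - 85.0) / 10.0
    return BASE_RETENTION_MS / (2 ** steps)

bank_temperatures = {0: 72.4, 1: 88.1, 2: 95.0, 3: 101.3}  # from thermal model
for bank, temp in bank_temperatures.items():
    print(f"bank {bank}: {temp:5.1f} C -> refresh every "
          f"{refresh_period_ms(temp):5.1f} ms")

The saving comes from not refreshing every bank at the hottest bank's rate: cool banks keep the long base period, and only the banks that the 3D thermal model flags as hot pay the shorter refresh interval.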

Relevance:

10.00%

Publisher:

Abstract:

The aim of this study is to investigate some of the molecular mechanisms contributing to the pathogenesis of osteoarthritis (OA), and in particular to the senescence of articular chondrocytes. It focuses on understanding molecular events downstream of GSK3β inactivation or dependent on the activity of IKKα, a kinase that does not belong to the phenotype of healthy articular chondrocytes. Moreover, the potential of some nutraceuticals for scavenging ROS, and thus reducing oxidative stress, DNA damage, and chondrocyte senescence, has been evaluated in vitro. The in vitro LiCl-mediated GSK3β inactivation resulted in increased mitochondrial ROS production, which impacted cellular proliferation, with a transient S-phase arrest, increased SA-β-gal and PAS staining, and increased cell size and granularity. ROS are also responsible for the increased occurrence of two major oxidative lesions: 1) double-strand breaks, tagged by γH2AX, which associate with activation of GADD45β and p21, and 2) 8-oxo-dG adducts, which associate with increased IKKα and MMP-10 expression. The pattern observed in vitro was confirmed in cartilage from OA patients. IKKα dramatically affects the intensity of the DNA damage response induced by oxidative stress (H2O2 exposure) in chondrocytes, as evidenced by silencing strategies. At early time points, a higher percentage of γH2AX-positive cells and more foci are observed in IKKα-KD cells, but IKKα-KD cells proved to recover almost completely after 24 hours with respect to their controls. Telomere attrition is also reduced in IKKα-KD cells. Finally, the MSH6 and MLH1 genes are up-regulated in IKKα-KD cells but not in control cells. Hydroxytyrosol and spermidine show a great ROS-scavenging capacity in vitro. Both treatments revert the H2O2-dependent increase of cell death, γH2AX-foci formation and senescence, suggesting an ability to improve cell homeostasis. These data indicate that nutraceuticals represent a promising option in OA management, for both therapeutic and preventive purposes.

Relevance:

10.00%

Publisher:

Abstract:

The upgrade of the Mainz Mikrotron (MAMI) electron accelerator facility in 2007, which raised the beam energy up to 1.5 GeV, gives the opportunity to study strangeness production channels through electromagnetic processes. The Kaon Spectrometer (KAOS), managed by the A1 Collaboration, enables the efficient detection of the kaons associated with strangeness electroproduction. Used as a single-arm spectrometer, it can be combined with the existing high-resolution spectrometers for exclusive measurements in the kinematic domain accessible to them.

For studying hypernuclear production in the ^A_Z(e,e'K^+) ^A_Λ(Z-1) reaction, the detection of electrons at very forward angles is needed. Therefore, the use of KAOS as a double-arm spectrometer for the simultaneous detection of kaons and electrons is mandatory. Thus, the electron arm has to be provided with a new detector package, with high counting-rate capability and high granularity for a good spatial resolution. To this end, a new state-of-the-art scintillating fiber hodoscope has been developed as an electron detector.

The hodoscope is made of two planes with a total of 18432 scintillating double-clad fibers of 0.83 mm diameter. Each plane is formed by 72 modules. Each module is formed from a 60° slanted multi-layer bundle, where 4 fibers of a tilted column are connected to a common readout. The readout is made with 32 channels of linear-array multianode photomultipliers. Signal processing makes use of newly developed double-threshold discriminators. The discriminated signal is sent in parallel to dead-time-free time-to-digital modules and to logic modules for triggering purposes.

Two fiber modules were tested with a carbon beam at GSI, showing a time resolution of 220 ps (FWHM) and a position residual of 270 µm (FWHM) with a detection efficiency ε > 99%.

The characterization of the spectrometer arm has been achieved through simulations calculating the transfer matrix of track parameters from the fiber detector focal plane to the primary vertex. This transfer matrix has been calculated to first order using beam transport optics and has been checked by quasielastic scattering off a carbon target, where the full kinematics is determined by measuring the recoil proton momentum. The reconstruction accuracy for the emission parameters at the quasielastic vertex was found to be on the order of 0.3% in the first tests performed.

The design, construction, commissioning, testing and characterization of the fiber hodoscope are presented in this work, which was carried out at the Institut für Kernphysik of the Johannes Gutenberg-Universität Mainz.
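The first-order reconstruction step can be sketched in a few lines of Python with NumPy: focal-plane coordinates measured by the fiber hodoscope are mapped back to the vertex parameters with the inverse of the first-order transport matrix. The matrix elements below are invented placeholders, not the KAOS optics, and the coordinate set is reduced to (x, theta, delta) for brevity.

# Minimal sketch of first-order transfer-matrix track reconstruction.
import numpy as np

# First-order transport: (x, theta, delta)_focal = T @ (x, theta, delta)_target
T = np.array([[-0.40,  1.80, 2.10],     # x; dispersion in the last column
              [-0.55, -0.90, 0.35],     # theta
              [ 0.00,  0.00, 1.00]])    # delta (relative momentum) unchanged

T_inv = np.linalg.inv(T)

def reconstruct(focal_coords):
    # focal_coords: measured (x [m], theta [rad], delta) at the focal plane
    return T_inv @ np.asarray(focal_coords)

x_t, theta_t, delta_t = reconstruct([0.012, -0.004, 0.01])
print(f"target: x = {x_t:.4f} m, theta = {theta_t:.4f} rad, delta = {delta_t:.4f}")

Checking such a matrix against quasielastic scattering, as described above, amounts to comparing the reconstructed vertex parameters with kinematics fixed independently by the recoil proton measurement.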

Relevance:

10.00%

Publisher:

Abstract:

Oncocytomas are defined as tumors containing in excess of 50% large mitochondrion-rich cells, irrespective of histogenesis and dignity. Along the central neuraxis, oncocytomas are distinctly uncommon but relevant to the differential diagnosis of neoplasia marked by prominent cytoplasmic granularity. We describe an anaplastic ependymoma (WHO grade III) with a prevailing oncocytic component that was surgically resected from the right fronto-insular region of a 43-year-old female. Preoperative imaging showed a fairly circumscribed, partly cystic, contrast-enhancing mass of 2 cm × 2 cm × 1.7 cm. Histology revealed a biphasic neoplasm wherein conventional ependymal features coexisted with plump epithelioid cells replete with brightly eosinophilic granules. Whereas both components displayed an overtly ependymal immunophenotype, including positivity for S100 protein and GFAP, as well as "dot-like" staining for EMA, the oncocytic population also tended to react intensely with the antimitochondrial antibody 113-1. Conversely, failure to bind CD68 indicated absence of significant lysosomal storage. Negative reactions for both pan-cytokeratin (MNF 116) and low molecular weight cytokeratin (CAM 5.2), as well as synaptophysin and thyroglobulin, further assisted in ruling out metastatic carcinoma. In addition to confirming the presence of "zipper-like" intercellular junctions and microvillus-bearing cytoplasmic microlumina, electron microscopy allowed the pervasive accumulation of mitochondria in tumor cells to be directly visualized. A previously undocumented variant, oncocytic ependymoma, is felt to add a reasonably relevant novel item to the differential diagnosis of granule-bearing central nervous system neoplasia, in particular oncocytic meningioma, granular cell astrocytoma, as well as metastatic deposits by oncocytic malignancies from extracranial sites.

Relevance:

10.00%

Publisher:

Abstract:

Software must be constantly adapted to changing requirements. The time scale, abstraction level and granularity of adaptations may vary from short-term, fine-grained adaptation to long-term, coarse-grained evolution. Fine-grained, dynamic and context-dependent adaptations can be particularly difficult to realize in long-lived, large-scale software systems. We argue that, in order to effectively and efficiently deploy such changes, adaptive applications must be built on an infrastructure that is not just model-driven, but is both model-centric and context-aware. Specifically, this means that high-level, causally-connected models of the application and the software infrastructure itself should be available at run-time, and that changes may need to be scoped to the run-time execution context. We first review the dimensions of software adaptation and evolution, and then we show how model-centric design can address the adaptation needs of a variety of applications that span these dimensions. We demonstrate through concrete examples how model-centric and context-aware designs work at the level of application interface, programming language and runtime. We then propose a research agenda for a model-centric development environment that supports dynamic software adaptation and evolution.
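The notion of scoping a change to the run-time execution context can be sketched in Python, in the spirit of context-oriented programming (the paper's own infrastructure spans interface, language and runtime; the Logger example and helper below are invented): a behavioural variation is active only inside a context and the base behaviour is restored on exit.

# Minimal sketch of a context-scoped, dynamically deployed adaptation.
from contextlib import contextmanager

class Logger:
    def log(self, msg):
        print(f"INFO: {msg}")

@contextmanager
def adaptation(cls, method_name, variation):
    # Swap in the new behaviour for the duration of the context,
    # then roll the change back, so the adaptation stays scoped.
    original = getattr(cls, method_name)
    setattr(cls, method_name, variation)
    try:
        yield
    finally:
        setattr(cls, method_name, original)

logger = Logger()
logger.log("normal operation")                    # INFO: normal operation
with adaptation(Logger, "log",
                lambda self, msg: print(f"DEBUG: {msg}")):
    logger.log("inside the adapted context")      # DEBUG: inside the adapted context
logger.log("context exited")                      # INFO: context exited

A model-centric infrastructure in the paper's sense would additionally keep a causally connected model of the running application available, so that such variations can be reasoned about and deployed at run-time rather than patched in by hand.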