923 results for Extensions
Abstract:
Item response theory (IRT) comprises a set of statistical models that are useful in many fields, especially when there is interest in studying latent variables (or latent traits). Usually such latent traits are assumed to be random variables and a convenient distribution is assigned to them; a very common choice has been the standard normal. Recently, Azevedo et al. [Bayesian inference for a skew-normal IRT model under the centred parameterization, Comput. Stat. Data Anal. 55 (2011), pp. 353-365] proposed modelling the latent trait distribution with a skew-normal distribution under the centred parameterization (SNCP), as studied in [R. B. Arellano-Valle and A. Azzalini, The centred parametrization for the multivariate skew-normal distribution, J. Multivariate Anal. 99(7) (2008), pp. 1362-1382]. This approach allows one to represent any asymmetric behaviour of the latent trait distribution. They also developed a Metropolis-Hastings within Gibbs sampling (MHWGS) algorithm based on the density of the SNCP and showed that it recovers all parameters properly; their results indicated that, in the presence of asymmetry, the proposed model and estimation algorithm perform better than the usual model and estimation methods. Our main goal in this paper is to propose another type of MHWGS algorithm based on a stochastic representation (hierarchical structure) of the SNCP studied in [N. Henze, A probabilistic representation of the skew-normal distribution, Scand. J. Statist. 13 (1986), pp. 271-275]. Our algorithm has only one Metropolis-Hastings step, in contrast to the algorithm developed by Azevedo et al., which has two such steps. This not only makes the implementation easier but also reduces the number of proposal densities to be used, which can be a problem in the implementation of MHWGS algorithms, as can be seen in [R.J. Patz and B.W. Junker, A straightforward approach to Markov Chain Monte Carlo methods for item response models, J. Educ. Behav. Stat. 24(2) (1999), pp. 146-178; R.J. Patz and B.W. Junker, The applications and extensions of MCMC in IRT: Multiple item types, missing data, and rated responses, J. Educ. Behav. Stat. 24(4) (1999), pp. 342-366; A. Gelman, G.O. Roberts, and W.R. Gilks, Efficient Metropolis jumping rules, Bayesian Stat. 5 (1996), pp. 599-607]. Moreover, we consider a modified beta prior (which generalizes the one considered in [3]) and a Jeffreys prior for the asymmetry parameter, study the sensitivity of these priors as well as the use of different kernel densities for this parameter, and assess the impact of the number of examinees, the number of items and the asymmetry level on parameter recovery. Results of the simulation study indicated that our approach performed as well as that of [3] in terms of parameter recovery, mainly under the Jeffreys prior. They also indicated that the asymmetry level has the highest impact on parameter recovery, even though this impact is relatively small. A real data analysis is presented jointly with the development of model-fit assessment tools, and the results are compared with those obtained by Azevedo et al. The results indicate that the hierarchical approach allows us to implement MCMC algorithms more easily, facilitates convergence diagnostics, and can be very useful for fitting more complex skew IRT models.
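For concreteness, the stochastic representation the algorithm builds on is Henze's: with U0 and U1 independent standard normals and delta in (-1, 1), Z = delta|U0| + sqrt(1 - delta^2) U1 is skew-normal, and conditionally on |U0| it is normal, which is what lets most parameters be updated by plain Gibbs steps. Below is a minimal sketch (assuming NumPy; `delta` and `n` are illustrative values, not taken from the paper) of drawing SNCP-standardized latent traits from this representation.

import numpy as np

def sample_sncp_traits(n, delta, rng=None):
    """Draw n latent traits via Henze's representation, standardized to
    mean 0 and variance 1 as in the centred parameterization (SNCP)."""
    rng = np.random.default_rng(rng)
    u0 = np.abs(rng.standard_normal(n))            # half-normal component |U0|
    u1 = rng.standard_normal(n)                    # independent normal component
    z = delta * u0 + np.sqrt(1.0 - delta**2) * u1  # skew-normal draw
    b = np.sqrt(2.0 / np.pi)                       # E|U0| = sqrt(2/pi)
    return (z - b * delta) / np.sqrt(1.0 - (b * delta)**2)

traits = sample_sncp_traits(10_000, delta=0.9)     # strongly right-skewed traits

Since E(Z) = b*delta and Var(Z) = 1 - (b*delta)^2, the returned traits have mean 0 and variance 1 regardless of the asymmetry level.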
Abstract:
We provide a detailed account of the spatial structure of the Brazilian sardine (Sardinella brasiliensis) spawning and nursery habitats, using ichthyoplankton data from nine surveys (1976-1993) covering the Southeastern Brazilian Bight (SBB). The spatial variability of sardine eggs and larvae was partitioned into predefined spatial-scale classes (broad scale, 200-500 km; medium scale, 50-100 km; and local scale, <50 km). The relationship between density distributions at both developmental stages and environmental descriptors (temperature and salinity) was also explored within these spatial scales. Spatial distributions of sardine eggs were mostly structured on medium and local scales, while larvae were characterized by broad- and medium-scale distributions. Broad- and medium-scale surface temperatures were positively correlated with sardine densities for both developmental stages. Correlations with salinity were predominantly negative and concentrated on a medium scale. Broad-scale structuring might be explained by mesoscale processes, such as pulsing upwelling events and Brazil Current meandering at the northern portion of the SBB, while medium-scale relationships may be associated with local estuarine outflows. The results indicate that processes favouring vertical stability might regulate the spatial extensions of suitable spawning and nursery habitats for the Brazilian sardine.
Abstract:
The aim of this study was to evaluate microbial growth on single-use vitrectomy probes reprocessed in healthcare practice. We investigated nine vitrectomy probes that had been reused and reprocessed using different methods. The samples were sectioned individually into 3.5 cm portions, totaling 979 sampling units (extensions, connectors and vitrectomy cutters), which were inoculated in culture medium and incubated at 37 °C for 14 days. The results showed microbial growth on 57 (5.8%) sampling units, 25 of which had been sterilized using ethylene oxide, 16 by hydrogen peroxide plasma, and 16 by low-temperature steam and formaldehyde. Seventeen microbial species were identified; the most prevalent were Micrococcus spp., coagulase-negative Staphylococcus, Pseudomonas spp., and Bacillus subtilis. The reuse of single-use vitrectomy probes was shown to be unsafe; therefore, this practice is not recommended.
Abstract:
PURPOSE. We previously demonstrated that most eyes have regionally variable extensions of Bruch's membrane (BM) inside the clinically identified disc margin (DM) that are clinically and photographically invisible. We studied the impact of these findings on DM- and BM opening (BMO)-derived neuroretinal rim parameters. METHODS. Disc stereo-photography and spectral domain optical coherence tomography (SD-OCT, 24 radial B-scans centered on the optic nerve head) were performed on 30 glaucoma patients and 10 age-matched controls. Photographs were colocalized to SD-OCT data such that the DM and BMO could be visualized in each B-scan. Three parameters were computed: (1) DM-horizontal rim width (HRW), the distance between the DM and the internal limiting membrane (ILM) along the DM reference plane; (2) BMO-HRW, the distance between the BMO and the ILM along the BMO reference plane; and (3) BMO-minimum rim width (MRW), the minimum distance between the BMO and the ILM. Sectors were ranked by rim width, rank-order correlations of the rankings were computed, and spatial concordance was measured as the angular distance between equivalently ranked sectors. RESULTS. The average DM position was external to the BMO in all quadrants except inferotemporally. There were significant sectoral differences among all three rim parameters. DM-HRW and BMO-HRW sector ranks were better correlated (median rho = 0.84) than DM-HRW and BMO-MRW (median rho = 0.55) or BMO-HRW and BMO-MRW (median rho = 0.60) ranks. Sectors with the narrowest BMO-MRW were infrequently the same as those with the narrowest DM-HRW or BMO-HRW. CONCLUSIONS. BMO-MRW quantifies the neuroretinal rim from a true anatomical outer border and accounts for its variable trajectory at the point of measurement. (Invest Ophthalmol Vis Sci. 2012;53:1852-1860) DOI:10.1167/iovs.11-9309
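A minimal sketch of the BMO-MRW definition stated above (an illustration only, not the authors' analysis software; `ilm` and `bmo` are hypothetical segmentation outputs in micrometres):

import numpy as np

def bmo_mrw(bmo, ilm):
    """Minimum rim width: the smallest Euclidean distance from the BMO
    point to any sampled point on the internal limiting membrane (ILM)."""
    return np.linalg.norm(ilm - np.asarray(bmo), axis=1).min()

# Hypothetical ILM segmentation for one radial B-scan, in micrometres.
ilm = np.array([[0.0, 300.0], [50.0, 220.0], [100.0, 160.0], [150.0, 140.0]])
print(bmo_mrw((120.0, 0.0), ilm))  # free-angle minimum, unlike plane-bound HRW

Because MRW is a free-angle minimum while the HRW variants are constrained to a fixed reference plane, the two families can rank the same sectors differently, which is the discordance reported above.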
Abstract:
Several extensions of the standard model predict the existence of new neutral spin-1 resonances associated with the electroweak symmetry breaking sector. Using data from ATLAS (with an integrated luminosity of L = 1.02 fb⁻¹) and CMS (with an integrated luminosity of L = 1.55 fb⁻¹) on the production of W+W- pairs through the process pp --> l(+)l'(-) + missing transverse energy, we place model-independent bounds on the masses, couplings, and widths of these new vector resonances. Our analyses show that the present data exclude new neutral vector resonances with masses up to 1-2.3 TeV, depending on their couplings and widths. We also demonstrate how to extend our analysis framework to different models with a specific example.
Abstract:
We analyse the interplay between the Higgs-to-diphoton rate and the constraints from electroweak precision measurements in extensions of the Standard Model with new uncolored charged fermions that do not mix with the ordinary ones. We also compute the pair-production cross sections for the lightest fermion and compare them with current bounds.
Abstract:
The forest-like characteristics of agroforestry systems create a unique opportunity to combine agricultural production with biodiversity conservation in human-modified tropical landscapes. The cacao-growing region in southern Bahia, Brazil, encompasses Atlantic forest remnants and large extensions of agroforests, locally known as cabrucas, and harbors several endemic large mammals. Based on the differences between cabrucas and forests, we hypothesized that: (1) non-native and non-arboreal mammals are more frequent, whereas exclusively arboreal and hunted mammals are less frequent in cabrucas than forests; (2) the two systems differ in mammal assemblage structure, but not in species richness; and (3) mammal assemblage structure is more variable among cabrucas than forests. We used camera-traps to sample mammals in nine pairs of cabruca-forest sites. The high conservation value of agroforests was supported by the presence of species of conservation concern in cabrucas, and similar species richness and composition between forests and cabrucas. Arboreal species were less frequently recorded, however, and a non-native and a terrestrial species adapted to open environments (Cerdocyon thous) were more frequently recorded in cabrucas. Factors that may overestimate the conservation value of cabrucas are: the high proportion of total forest cover in the study landscape, the impoverishment of large mammal fauna in forest, and uncertainty about the long-term maintenance of agroforestry systems. Our results highlight the importance of agroforests and forest remnants for providing connectivity in human-modified tropical forest landscapes, and the importance of controlling hunting and dogs to increase the value of agroforestry mosaics.
Abstract:
Among the soils of Mato Grosso do Sul, the Spodosols stand out in the Pantanal biome. Despite being recorded over considerable extensions, few studies have characterized and classified these soils. The purpose of this study was to characterize and classify soils in three areas of two physiographic types in the Taquari river basin: bay and flooded fields. Two trenches were opened in the bay area (P1 and P2) and two in the flooded field (P3 and P4). The third area (saline), with high sodium levels, was sampled for further studies. In the soils of both areas the sand fraction was predominant, with textures from sand to sandy loam and quartz as the main constituent. In the bay area, the soil organic carbon (OC) content in the surface layer (P1) was > 80 g kg⁻¹, and the horizon was diagnosed as a Histic epipedon. In the other profiles the surface horizons had low OC levels which, associated with other properties, classified them as Ochric epipedons. In the soils of the bay area (P1 and P2), the pH ranged from 5.0 to 7.5, associated with a dominance of Ca²⁺ and Mg²⁺, with base saturation above 50 % in some horizons. In the flooded fields (P3 and P4) the soil pH ranged from 4.9 to 5.9, H⁺ contents were high in the surface horizons (0.8-10.5 cmolc kg⁻¹), Ca²⁺ and Mg²⁺ contents ranged from 0.4 to 0.8 cmolc kg⁻¹, and base saturation was < 50 %. In the soils of the bay area (P1 and P2), iron (extracted by dithionite, Fed) and OC accumulated in the spodic horizon; in the P3 and P4 soils only Fed accumulated (in the subsurface layers). According to the criteria adopted by the Brazilian System of Soil Classification (SiBCS) at the subgroup level, the soils were classified as: P1, Organic Hydromorphic Ferrohumiluvic Spodosol; P2, Typical Orthic Ferrohumiluvic Spodosol; P3, Typical Hydromorphic Ferroluvic Spodosol; and P4, Arenic Orthic Ferroluvic Spodosol.
Abstract:
The aim of this study was to analyze the rat temporomandibular joint (TMJ) synovial membrane at different ages using light, scanning, and transmission electron microscopy. Under light microscopy, TMJ structures such as the condyle, capsule, and disk were observed, along with the collagen type and cell distribution of the synovial membrane. Under scanning electron microscopy, the synovial membrane surface exhibited a smooth aspect in young animals, and the number of folds increased with ageing. Transmission electron microscopy showed more synoviocytes in the synovial layer in the young group, and a great number of vesicles and dilated cisterns of rough endoplasmic reticulum in the aged group. In all three groups, a dense layer of collagen fibers in the synovial layer and cytoplasmic extensions were clearly seen. It was possible to conclude that the synovial membrane structures in the aged group showed alterations contributing to a decrease in joint lubrication and in the sliding between the disk and the joint surfaces. These characteristics affect the biomechanics of chewing and may cause the TMJ disorders currently observed in clinical practice. Microsc. Res. Tech. © 2012 Wiley Periodicals, Inc.
Abstract:
The thesis consists of three independent parts.

Part I: Polynomial amoebas. We study the amoeba of a polynomial, as defined by Gelfand, Kapranov and Zelevinsky. A central role in the treatment is played by a certain convex function which is linear in each complement component of the amoeba, which we call the Ronkin function. This function is used in two different ways. First, we use it to construct a polyhedral complex, which we call a spine, approximating the amoeba. Second, the Monge-Ampère measure of the Ronkin function has interesting properties which we explore. This measure can be used to derive an upper bound on the area of an amoeba in two dimensions. We also obtain results on the number of complement components of an amoeba, and consider possible extensions of the theory to varieties of codimension higher than 1.

Part II: Differential equations in the complex plane. We consider polynomials in one complex variable arising as eigenfunctions of certain differential operators, and obtain results on the distribution of their zeros. We show that in the limit when the degree of the polynomial approaches infinity, its zeros are distributed according to a certain probability measure. This measure has its support on the union of finitely many curve segments, and can be characterized by a simple condition on its Cauchy transform.

Part III: Radon transforms and tomography. This part is concerned with different weighted Radon transforms in two dimensions, in particular the problem of inverting such transforms. We obtain stability results for this inverse problem for rather general classes of weights, including weights of attenuation type with data acquisition limited to a 180 degree range of angles. We also derive an inversion formula for the exponential Radon transform, with the same restriction on the angle.
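For orientation, the convex function in question is the Ronkin function; in its standard normalization from the literature (the thesis may differ in constants):

\[
N_f(x) \;=\; \frac{1}{(2\pi i)^n}\int_{\operatorname{Log}^{-1}(x)} \frac{\log\lvert f(z)\rvert}{z_1\cdots z_n}\,dz_1\cdots dz_n,
\qquad \operatorname{Log}(z) = (\log\lvert z_1\rvert,\dots,\log\lvert z_n\rvert),
\]

so the amoeba of f is the image under Log of its zero set, and N_f is affine-linear exactly on each complement component of the amoeba, with gradient equal to an integer point of the Newton polytope of f.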
Abstract:
The nature of the dark matter in the Universe is one of the greatest mysteries in modern astronomy. The neutralino is a nonbaryonic dark matter candidate in minimal supersymmetric extensions of the standard model of particle physics. If the dark matter halo of our galaxy is made up of neutralinos, some would become gravitationally trapped inside massive bodies like the Earth. Their pairwise annihilation produces neutrinos that can be detected by neutrino experiments looking in the direction of the centre of the Earth. The AMANDA neutrino telescope, currently the largest in the world, consists of an array of light detectors buried deep in the Antarctic glacier at the geographic South Pole. The extremely transparent ice acts as a Cherenkov medium for muons passing the array, and using the timing information of detected photons it is possible to reconstruct the muon direction. A search has been performed for nearly vertically upgoing neutrino-induced muons in AMANDA-B10 data taken over the three-year period 1997-99. No excess above the atmospheric neutrino background expectation was found. Upper limits at the 90 % confidence level have been set on the annihilation rate of neutralinos at the centre of the Earth and on the muon flux induced by neutrinos created by the annihilation products.
Abstract:
Interaction protocols establish how different computational entities can interact with each other. The interaction can be aimed at the exchange of data, as in 'communication protocols', or oriented towards achieving some result, as in 'application protocols'. Moreover, with the increasing complexity of modern distributed systems, protocols are also used to control such complexity and to ensure that the system as a whole evolves with certain features. However, the extensive use of protocols has raised some issues, from the language for specifying them to the several aspects of their verification. Computational Logic provides models, languages and tools that can be effectively adopted to address these issues: its declarative nature can be exploited for a protocol specification language, while its operational counterpart can be used to reason upon such specifications. In this thesis we propose a proof-theoretic framework, called SCIFF, together with its extensions. SCIFF is based on Abductive Logic Programming and provides a formal specification language with a clear declarative semantics (based on abduction). The operational counterpart is given by a proof procedure that allows one to reason upon the specifications and to test the conformance of given interactions with respect to a defined protocol. Moreover, by suitably adapting the SCIFF framework, we propose solutions for addressing (1) the verification of protocol properties (the g-SCIFF framework) and (2) the a priori conformance verification of peers with respect to a given protocol (the AlLoWS framework). We also introduce an agent-based architecture, the SCIFF Agent Platform, where the same protocol specification can be used both to program the interacting peers and to ease their implementation task.
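As a toy illustration of the conformance-testing idea only (the SCIFF proof procedure itself works by abduction over a much richer language; the rule and trace below are invented), a trace conforms when every expectation raised by an observed event is fulfilled later in the trace:

from typing import Callable, List, Tuple

Event = Tuple[str, str, str]           # (performative, sender, receiver)
Rule = Callable[[Event], List[Event]]  # maps an event to the expectations it raises

def conforms(trace: List[Event], rules: List[Rule]) -> bool:
    # Every expectation raised at position i must be matched by a later event.
    for i, event in enumerate(trace):
        for rule in rules:
            for expected in rule(event):
                if expected not in trace[i + 1:]:
                    return False       # unfulfilled expectation: non-conformant
    return True

# Hypothetical protocol rule: a request from A to B expects a later answer from B to A.
request_reply: Rule = lambda e: [("answer", e[2], e[1])] if e[0] == "request" else []

print(conforms([("request", "alice", "bob"), ("answer", "bob", "alice")],
               [request_reply]))       # True: the expectation was fulfilled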
Abstract:
Traditional software engineering approaches and metaphors fall short when applied to areas of growing relevance such as electronic commerce, enterprise resource planning, and mobile computing: such areas, in fact, generally call for open architectures that may evolve dynamically over time so as to accommodate new components and meet new requirements. This is probably one of the main reasons why the agent metaphor and the agent-oriented paradigm are gaining momentum in these areas. This thesis deals with the engineering of complex software systems in terms of the agent paradigm, which is based on the notions of agent and of systems of interacting agents as fundamental abstractions for designing, developing and managing (typically distributed) software systems at runtime. However, today the engineer often works with technologies that do not support the abstractions used in the design of the systems; for this reason, research on methodologies becomes a central point of scientific activity.

Currently most agent-oriented methodologies are supported by small teams of academic researchers, and as a result most of them are at an early stage, still within the context of mostly "academic" approaches to agent-oriented systems development. Moreover, such methodologies are not well documented and are very often defined and presented by focusing only on specific aspects of the methodology. The role played by meta-models therefore becomes fundamental for comparing and evaluating methodologies: a meta-model specifies the concepts, rules and relationships used to define methodologies. Although it is possible to describe a methodology without an explicit meta-model, formalising the underpinning ideas of the methodology in question is valuable when checking its consistency or planning extensions or modifications. A good meta-model must address all the different aspects of a methodology, i.e. the process to be followed, the work products to be generated and those responsible for making all this happen. In turn, specifying the work products that must be developed implies defining the basic modelling building blocks from which they are built.

As a building block, the agent abstraction alone is not enough to fully model all the aspects related to multi-agent systems in a natural way. In particular, different perspectives exist on the role that the environment plays within agent systems; it is clear, at least, that all non-agent elements of a multi-agent system are typically considered part of the multi-agent system environment. The key role of the environment as a first-class abstraction in the engineering of multi-agent systems is today generally acknowledged in the multi-agent systems community, so the environment should be explicitly accounted for, working as a new design dimension for agent-oriented methodologies. At least two main ingredients shape the environment: environment abstractions (entities of the environment encapsulating some functions) and topology abstractions (entities of the environment that represent its logical or physical spatial structure). In addition, the engineering of non-trivial multi-agent systems requires principles and mechanisms for managing the complexity of the system representation. These principles lead to the adoption of a multi-layered description, which designers can use to provide different levels of abstraction over multi-agent systems.

The research in these fields has led to the formulation of a new version of the SODA methodology, in which environment abstractions and layering principles are exploited for engineering multi-agent systems.
Abstract:
The sustained demand for faster, more powerful chips has been met by the availability of chip manufacturing processes allowing for the integration of increasing numbers of computation units onto a single die. The resulting outcome, especially in the embedded domain, has often been called SYSTEM-ON-CHIP (SOC) or MULTI-PROCESSOR SYSTEM-ON-CHIP (MPSOC). MPSoC design brings to the foreground a large number of challenges, one of the most prominent of which is the design of the chip interconnection. With a number of on-chip blocks presently ranging in the tens, and quickly approaching the hundreds, the novel issue of how to best provide on-chip communication resources is clearly felt. NETWORKS-ON-CHIP (NOCS) are the most comprehensive and scalable answer to this design concern. By bringing large-scale networking concepts to the on-chip domain, they guarantee a structured answer to present and future communication requirements. The point-to-point connection and packet switching paradigms they involve are also of great help in minimizing wiring overhead and physical routing issues. However, as with any technology of recent inception, NoC design is still an evolving discipline. Several main areas of interest require deep investigation for NoCs to become viable solutions:
• The design of the NoC architecture needs to strike the best tradeoff among performance, features and the tight area and power constraints of the on-chip domain.
• Simulation and verification infrastructure must be put in place to explore, validate and optimize the NoC performance.
• NoCs offer a huge design space, thanks to their extreme customizability in terms of topology and architectural parameters. Design tools are needed to prune this space and pick the best solutions.
• Even more so given their global, distributed nature, it is essential to evaluate the physical implementation of NoCs to assess their suitability for next-generation designs and their area and power costs.
This dissertation focuses on all of the above points, by describing a NoC architectural implementation called ×pipes; a NoC simulation environment within a cycle-accurate MPSoC emulator called MPARM; and a NoC design flow consisting of a front-end tool for optimal NoC instantiation, called SunFloor, and a set of back-end facilities for the study of NoC physical implementations. This dissertation proves the viability of NoCs for current and upcoming designs, by outlining their advantages (along with a few tradeoffs) and by providing a full NoC implementation framework. It also presents some examples of additional extensions of NoCs, allowing e.g. for increased fault tolerance, and outlines where NoCs may find further application scenarios, such as in stacked chips.
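As a toy illustration of one point in the NoC design space discussed above (a generic textbook scheme, not the ×pipes architecture or the SunFloor tool), deterministic dimension-ordered XY routing on a 2D mesh:

def xy_route(src, dst):
    """List the mesh nodes a packet visits from src to dst, moving along
    the X dimension first and then along Y (deadlock-free on a mesh)."""
    (x, y), (dx, dy) = src, dst
    path = [(x, y)]
    while x != dx:                     # correct the X coordinate first
        x += 1 if dx > x else -1
        path.append((x, y))
    while y != dy:                     # then correct the Y coordinate
        y += 1 if dy > y else -1
        path.append((x, y))
    return path

# Hop count is a first-order latency proxy when pruning candidate topologies.
assert len(xy_route((0, 0), (2, 3))) - 1 == 5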