943 results for Approximate Bayesian computation, Posterior distribution, Quantile distribution, Response time data
Abstract:
Construction organizations typically deal with large volumes of project data containing valuable information. It is found that these organizations do not use these data effectively for planning and decision-making, for two reasons. First, the information systems in construction organizations are designed to support day-to-day construction operations. The data stored in these systems are often non-validated and non-integrated, and are available in a format that makes it difficult for decision makers to use them to make timely decisions. Second, the organizational structure and the IT infrastructure are often not compatible with the information systems, resulting in higher operational costs and lower productivity. These two issues were investigated in this research with the objective of developing systems that are structured for effective decision-making. A framework was developed to guide the storage and retrieval of validated and integrated data for timely decision-making, and to enable construction organizations to redesign their organizational structure and IT infrastructure to match information system capabilities. The research focused on construction owner organizations that were continuously involved in multiple construction projects. Action research and data warehousing techniques were used to develop the framework. One hundred and sixty-three construction owner organizations were surveyed to assess their data needs, data management practices and extent of use of information systems in planning and decision-making. For in-depth analysis, Miami-Dade Transit (MDT), which is in charge of all transportation-related construction projects in Miami-Dade County, was selected. A functional model and a prototype system were developed to test the framework.
The results revealed significant improvements in data management and decision-support operations that were examined through various qualitative (ease in data access, data quality, response time, productivity improvement, etc.) and quantitative (time savings and operational cost savings) measures. The research results were first validated by MDT and then by a representative group of twenty construction owner organizations involved in various types of construction projects.
Abstract:
The main goal of this dissertation was to study two- and three-nucleon Short Range Correlations (SRCs) in the high energy three-body breakup of the 3He nucleus in the 3He(e, e'NN)N reaction. SRCs are characterized by quantum fluctuations in nuclei during which constituent nucleons partially overlap with each other. A theoretical framework is developed within the Generalized Eikonal Approximation (GEA), which upgrades existing medium-energy methods that are inapplicable for high momentum and energy transfer reactions. High momentum and energy transfer is required to provide sufficient resolution for probing SRCs. GEA is a covariant theory which is formulated through effective Feynman diagrammatic rules. It allows self-consistent calculation of the single and double re-scattering amplitudes that are present in three-body breakup processes. The calculations were carried out in detail, and the analytical result for the differential cross section of the 3He(e, e'NN)N reaction was derived in a form applicable for programming and numerical calculations. The corresponding computer code has been developed, and the results of the computation were compared to published experimental data, showing satisfactory agreement over a wide range of missing momenta. In addition to the high energy approximation, this study exploited the exclusive nature of the process under investigation to gain more information about the SRCs. The description of the exclusive 3He(e, e'NN)N reaction was done using the formalism of the nuclear decay function, a practically unexplored quantity related to the conventional spectral function through integration over the phase space of the recoil nucleons. Detailed investigation showed that the decay function clearly exhibits the main features of two- and three-nucleon correlations.
Four highly practical types of SRCs in the 3He nucleus were discussed in great detail for different orders of the final state re-interactions, using the decay function as a unique identifying tool. The overall conclusion of this dissertation suggests that the investigation of the decay function opens up a completely new avenue in studies of short range nuclear properties.
Abstract:
Ocean acidification and warming will be most pronounced in the Arctic Ocean. Aragonite shell-bearing pteropods in the Arctic are expected to be among the first species to suffer from ocean acidification. Carbonate undersaturation in the Arctic will first occur in winter, and because this period is also characterized by low food availability, the overwintering stages of polar pteropods may develop into a bottleneck in their life cycle. The impacts of ocean acidification and warming on growth, shell degradation (dissolution), and mortality of two thecosome pteropods, the polar Limacina helicina and the boreal L. retroversa, were studied for the first time during the Arctic winter in the Kongsfjord (Svalbard). The abundance of L. helicina and L. retroversa varied from 23.5 to 120 ind/m2 and 12 to 38 ind/m2, and the mean shell size ranged from 920 to 981 µm and 810 to 823 µm, respectively. Seawater was aragonite-undersaturated at the overwintering depths of pteropods on two out of ten days of our observations. A 7-day experiment [temperature levels: 2 and 7 °C; pCO2 levels: 350, 650 (only for L. helicina) and 880 µatm] revealed a significant pCO2 effect on shell degradation in both species, and synergistic effects between temperature and pCO2 for L. helicina. A comparison of live and dead specimens kept under the same experimental conditions indicated that both species were capable of actively reducing the impacts of acidification on shell dissolution. Compared with a similar study from fall 2009, a higher vulnerability to increasing pCO2 and temperature during the winter season is indicated. Considering the species' winter phenology and the seasonal changes in carbonate chemistry in Arctic waters, negative climate change effects on Arctic thecosomes are likely to show up first during winter, possibly well before ocean acidification effects become detectable during the summer season.
Abstract:
This thesis is part of research on new materials for catalysis and gas sensors that are more active, sensitive and selective. The aim of this thesis was to develop and characterize cobalt ferrite in different morphologies, in order to study their influence on the electrical response and the catalytic activity, and to hierarchize these grains for greater gas diffusivity in the material. The powders were produced via hydrothermal and solvothermal routes, and were characterized by thermogravimetric analysis, X-ray diffraction, scanning electron microscopy, transmission electron microscopy (electron diffraction, high-resolution simulations), and energy dispersive spectroscopy. The catalytic and electrical properties were tested in the presence of CO and NO2 gases, the latter at different concentrations (1-100 ppm) and at different temperatures (room temperature to 350 °C). Nanooctahedra with an average size of 20 nm were obtained by the hydrothermal route. It was determined that the shape of the grains is mainly linked to the nature of the precipitating agent and the presence of OH ions in the reaction medium. By the solvothermal method, spherical CoFe2O4 powders were prepared with grain sizes of 8 and 20 nm. CoFe2O4 powders exhibit a strong response to small amounts of NO2 (10 ppm at 200 °C). The nanooctahedra have greater sensitivity than the spherical grains of the same size, and have shorter response and recovery times. These results were confirmed by modeling the kinetics of response and recovery of the sensor. Initial tests of catalytic activity in the oxidation of CO between 100 °C and 350 °C show that the size effect is predominant relative to the shape effect with respect to the conversion of the reaction. The morphology of the grains influences the rate of reaction: a higher reaction rate is obtained in the presence of nanooctahedra.
In order to improve the detection and catalytic properties of the material, we have developed a methodology for hierarchizing grains which involves the use of carbon-based templates.
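The abstract above reports confirming the sensor results "by modeling the kinetics of response and recovery" without specifying the model. A common choice for gas-sensor transients is a first-order exponential model; the sketch below assumes that form, with purely illustrative magnitudes and time constants (a smaller tau means the shorter response/recovery times reported for the nanooctahedra).

```python
# Hedged sketch: first-order response/recovery kinetics often used for
# gas sensors. The model choice and all numbers here are illustrative
# assumptions, not the thesis' actual fitted parameters.
import math

def response(t, s_max, tau):
    """First-order sensor response: s(t) = s_max * (1 - exp(-t / tau))."""
    return s_max * (1.0 - math.exp(-t / tau))

def recovery(t, s0, tau):
    """First-order recovery of the signal back toward the baseline."""
    return s0 * math.exp(-t / tau)

# A smaller time constant tau means a shorter response time.
print(round(response(30.0, s_max=100.0, tau=10.0), 1))
```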
Abstract:
Data visualization is widely used to facilitate the comprehension of information and to find relationships between data. One of the most widely used techniques for visualizing multivariate data (4 or more variables) is the 2D scatterplot. This technique associates each data item with a visual mark in the following way: two variables are mapped to Cartesian coordinates, so that a visual mark can be placed on the Cartesian plane; the other variables are mapped gradually to visual properties of the mark, such as size, color and shape, among others. As the number of variables to be visualized increases, the number of visual properties associated with the mark increases as well, and so does the complexity of the final visualization. However, increasing the complexity of the visualization does not necessarily imply a better visualization; sometimes it does the opposite, producing a visually polluted and confusing visualization. This problem is called visual properties overload. This work aims to investigate whether it is possible to work around the overload of the visual channel and improve insight into multivariate data visualized through a modification of the 2D scatterplot technique. In this modification, we map the variables from data items to multisensory marks. These marks are composed not only of visual properties but also of haptic properties, such as vibration, viscosity and elastic resistance. We believed that this approach could ease the insight process by transposing properties from the visual channel to the haptic channel. The hypothesis was verified through experiments in which we analyzed (a) the accuracy of the answers; (b) response time; and (c) the degree of personal satisfaction with the proposed approach. However, the hypothesis was not validated.
The results suggest that there is an equivalence between the investigated visual and haptic properties in all analyzed aspects, though in strictly numeric terms the multisensory visualization achieved better results in response time and personal satisfaction.
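The 2D-scatterplot mapping described above (two variables to coordinates, the rest to mark properties) can be sketched as follows. This is not the authors' implementation; the data items, the property ranges, and the choice of size and grayscale as the extra channels are invented for illustration.

```python
# Sketch of the multivariate 2D-scatterplot mapping: variables 1-2 go
# to Cartesian coordinates, variables 3-4 to visual properties of the
# mark (size and a grayscale value). All numbers are hypothetical.

def normalize(values, lo, hi):
    """Rescale a variable linearly into the [lo, hi] property range."""
    vmin, vmax = min(values), max(values)
    return [lo + (v - vmin) * (hi - lo) / (vmax - vmin) for v in values]

# Four-variable data items: (v1, v2, v3, v4)
items = [(1.0, 2.0, 10.0, 0.1), (2.0, 1.0, 30.0, 0.5), (3.0, 3.0, 20.0, 0.9)]
v1, v2, v3, v4 = zip(*items)

marks = [
    {"x": x, "y": y, "size": s, "gray": g}
    for x, y, s, g in zip(v1, v2, normalize(v3, 4.0, 12.0), normalize(v4, 0.0, 1.0))
]
print(marks[0])
```

In the haptic variant investigated by the thesis, properties such as vibration or viscosity would take the place of `size`/`gray` using the same kind of normalization.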
Abstract:
Cloud computing can be defined as a distributed computational model through which resources (hardware, storage, development platforms and communication) are shared as paid services, accessible with minimal management effort and interaction. A great benefit of this model is to enable the use of various providers (e.g., a multi-cloud architecture) to compose a set of services in order to obtain an optimal configuration for performance and cost. However, multi-cloud use is hindered by the problem of cloud lock-in: the dependency between an application and a cloud platform. It is commonly addressed by three strategies: (i) use of an intermediate layer that stands between consumers of cloud services and the provider; (ii) use of standardized interfaces to access the cloud; or (iii) use of models with open specifications. This work outlines an approach to evaluate these strategies. This evaluation found that, despite the advances made by these strategies, none of them actually solves the problem of cloud lock-in. In this sense, this work proposes the use of the Semantic Web to avoid cloud lock-in, where RDF models are used to specify the features of a cloud and are managed by SPARQL queries. In this direction, this work: (i) presents an evaluation model that quantifies the problem of cloud lock-in; (ii) evaluates cloud lock-in across three multi-cloud solutions and three cloud platforms; (iii) proposes using RDF and SPARQL for the management of cloud resources; (iv) presents the Cloud Query Manager (CQM), a SPARQL server that implements the proposal; and (v) compares three multi-cloud solutions with CQM in terms of response time and effectiveness in resolving cloud lock-in.
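A schematic illustration of the RDF/SPARQL idea above: cloud features are described as RDF triples and retrieved with a SPARQL query. The vocabulary (`:provides`, `:price`, `:responseMs`) is invented for this sketch and is not the CQM schema.

```sparql
# Hypothetical RDF data (Turtle) describing two cloud storage offers:
#   :cloudA  :provides :storage ;  :price 0.10 ;  :responseMs 80 .
#   :cloudB  :provides :storage ;  :price 0.07 ;  :responseMs 150 .
#
# Query: storage providers meeting a 100 ms response-time bound,
# cheapest first.
PREFIX : <http://example.org/cloud#>

SELECT ?provider ?price
WHERE {
  ?provider :provides :storage ;
            :price ?price ;
            :responseMs ?ms .
  FILTER (?ms <= 100)
}
ORDER BY ASC(?price)
```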
Abstract:
Multi-cloud applications are composed of services offered by multiple cloud platforms, where the user/developer has full knowledge of the use of those platforms. Using multiple cloud platforms avoids the following problems: (i) vendor lock-in, the dependency of an application on a particular cloud platform, which is harmful in the case of degradation or failure of platform services, or of price increases for service usage; (ii) degradation or failure of the application due to fluctuations in the quality of service (QoS) provided by some cloud platform, or due to a failure of any service. In a multi-cloud scenario, it is possible to replace a failed service, or one with QoS problems, with an equivalent service from another cloud platform. For an application to adopt a multi-cloud perspective, it is necessary to create mechanisms that are able to select which cloud services/platforms should be used in accordance with the requirements determined by the programmer/user. In this context, the major challenges in developing such applications include: (i) choosing which underlying services and cloud computing platforms should be used, based on the user requirements defined in terms of functionality and quality; (ii) the need to continually monitor dynamic information (such as response time, availability and price) related to cloud services, in addition to the wide variety of services; and (iii) the need to adapt the application if QoS violations affect user-defined requirements. This PhD thesis proposes an approach for the dynamic adaptation of multi-cloud applications, to be applied when a service becomes unavailable or when the requirements set by the user/developer indicate that another available multi-cloud configuration would meet them more efficiently. Thus, this work proposes a strategy composed of two phases.
The first phase consists of modeling the application, exploring the capacity to represent commonalities and variability proposed in the context of the Software Product Lines (SPL) paradigm. In this phase, an extended feature model is used to specify the cloud service configuration to be used by the application (commonalities) and the different possible providers for each service (variability). Furthermore, the non-functional requirements associated with cloud services are specified by properties in this model, describing dynamic information about these services. The second phase consists of an autonomic process based on the MAPE-K control loop, which is responsible for optimally selecting a multi-cloud configuration that meets the established requirements, and for performing the adaptation. The proposed adaptation strategy is independent of the programming technique used to perform the adaptation. In this work we implement the adaptation strategy using several programming techniques, such as aspect-oriented programming, context-oriented programming, and component- and service-oriented programming. Based on the proposed phases, we sought to assess: (i) whether the modeling process and the specification of non-functional requirements can ensure effective monitoring of user satisfaction; (ii) whether the optimal selection process presents significant gains compared to a sequential approach; and (iii) which techniques offer the best trade-off between development/modularity effort and performance.
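The selection step of a MAPE-K-style loop like the one described above can be sketched as a constrained choice over provider offers. This is an illustrative toy, not the thesis' actual selection algorithm or feature-model machinery; the provider names, prices, and the single response-time constraint are invented.

```python
# Illustrative sketch of multi-cloud configuration selection: for each
# required service, pick the provider offer that satisfies the user's
# response-time requirement at the lowest price. Data are hypothetical.

providers = {
    "storage": [
        {"name": "cloudA", "price": 0.10, "response_ms": 80},
        {"name": "cloudB", "price": 0.07, "response_ms": 150},
    ],
    "compute": [
        {"name": "cloudA", "price": 0.50, "response_ms": 40},
        {"name": "cloudC", "price": 0.30, "response_ms": 90},
    ],
}

def select_configuration(providers, max_response_ms):
    """Pick, per service, the cheapest offer meeting the QoS requirement."""
    config = {}
    for service, offers in providers.items():
        eligible = [o for o in offers if o["response_ms"] <= max_response_ms]
        config[service] = min(eligible, key=lambda o: o["price"])["name"]
    return config

print(select_configuration(providers, max_response_ms=100))
```

Re-running the selection whenever monitored QoS values change is what would trigger an adaptation in the loop.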
Abstract:
The maintenance and evolution of software systems has become a highly critical task over recent years, owing to the diversity and high demand of features, devices and users. Understanding and analyzing how new changes impact the quality attributes of the architecture of such systems is an essential prerequisite to avoid quality deterioration during their evolution. This thesis proposes an automated approach for analyzing variation of the performance quality attribute in terms of execution time (response time). It is implemented by a framework that adopts dynamic analysis and software repository mining techniques to provide an automated way of revealing potential sources (commits and issues) of performance variation in scenarios during the evolution of software systems. The approach defines four phases: (i) preparation, choosing the scenarios and preparing the target releases; (ii) dynamic analysis, determining the performance of scenarios and methods by computing their execution times; (iii) variation analysis, processing and comparing the dynamic analysis results for different releases; and (iv) repository mining, identifying issues and commits associated with the detected performance variation. Empirical studies were carried out to evaluate the approach from different perspectives. An exploratory study analyzed the feasibility of applying the approach to systems from different domains, to automatically identify source-code elements with performance variation and the changes that affected those elements during an evolution. This study analyzed three systems: (i) SIGAA, a web system for academic management; (ii) ArgoUML, a UML modeling tool; and (iii) Netty, a framework for network applications. Another study carried out an evolutionary analysis by applying the approach to multiple releases of Netty and of the web frameworks Wicket and Jetty.
In this study, 21 releases (seven of each system) were analyzed, totaling 57 scenarios. In summary, 14 scenarios with significant performance variation were found for Netty, 13 for Wicket and 9 for Jetty. Additionally, feedback was obtained from eight developers of these systems through an online form. Finally, in the last study, a performance regression model was developed to indicate the properties of commits that are most likely to cause performance degradation. Overall, 997 commits were mined, of which 103 were retrieved from degraded source-code elements and 19 from optimized ones, while 875 had no impact on execution time. The number of days before the release and the day of the week proved to be the most relevant variables of performance-degrading commits in our model. The area under the Receiver Operating Characteristic (ROC) curve of the regression model is 60%, which means that using the model to decide whether a commit will cause degradation is 10% better than a random decision.
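The variation-analysis phase, which compares dynamic-analysis results across releases, can be sketched as follows. The timings and the 10% significance threshold are illustrative assumptions, not the thesis' actual data or criteria.

```python
# Hedged sketch of the variation-analysis step: compare a scenario's
# execution times in two releases and flag a variation when the mean
# changes by more than a threshold. All numbers are hypothetical.
from statistics import mean

def performance_variation(times_old, times_new, threshold=0.10):
    """Return (relative_change, flagged) for one scenario across releases."""
    m_old, m_new = mean(times_old), mean(times_new)
    change = (m_new - m_old) / m_old
    return change, abs(change) > threshold

# Execution times (ms) of one scenario in two consecutive releases:
change, flagged = performance_variation([120, 118, 122, 121], [140, 139, 142, 138])
print(f"change = {change:+.1%}, significant = {flagged}")
```

A flagged scenario would then feed the repository-mining phase, which looks for the commits and issues behind the change.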
Abstract:
Salutogenesis is now accepted as part of the contemporary model of disease: an individual is affected not only by pathogenic factors in the environment, but also by those that promote well-being, or salutogenesis. Given that "environment" extends to include the built environment, the promotion of salutogenesis has become part of the architectural brief for contemporary healthcare facilities, drawing on an increasing evidence base. Salutogenesis is inextricably linked with the notion of person-environment "fit". MyRoom is a proposal for an integrated architectural and pervasive computing model which enhances psychosocial congruence by using real-time data indicative of the individual's physical status, enabling the environment of his/her room (colour, light, temperature) to adapt on an ongoing basis in response to bio-signals. This work is part of the PRTLI-IV funded programme NEMBES, investigating the use of embedded technologies in the built environment. Different care contexts require variations in the model, and iterative prototyping investigating use in different contexts will progressively lead to the development of a fully-integrated adaptive salutogenic single-room prototype.
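The kind of adaptive loop MyRoom describes, where a bio-signal drives a room parameter on an ongoing basis, can be sketched minimally as below. The mapping, the heart-rate signal, and the thresholds are invented purely for illustration; the actual proposal adapts colour, light, and temperature from richer physical-status data.

```python
# Purely illustrative sketch of a bio-signal-driven room adaptation:
# map a heart-rate reading onto a lighting level in [0.0, 1.0].
# The signal choice and the 60-100 bpm band are assumptions.

def adapt_lighting(heart_rate_bpm, lo=60, hi=100):
    """Clamp the reading into [lo, hi] and map it linearly onto a
    dimmer level: calm readings -> dim, elevated readings -> bright."""
    clamped = max(lo, min(hi, heart_rate_bpm))
    return (clamped - lo) / (hi - lo)

print(adapt_lighting(80))  # mid-range reading -> 0.5
```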
Abstract:
Data of amphibians, reptiles and birds surveyed from February 2016 to May 2016 in the UNESCO Sheka forest biosphere reserve are provided as an online open access data file.
Abstract:
The growth and development of the aragonitic CaCO3 otoliths of teleost fish could be vulnerable to processes resulting from ocean acidification. The potential effects of an increase in atmospheric CO2 on the calcification of the otoliths were investigated by rearing Atlantic cod Gadus morhua L. larvae in three pCO2 treatments, control (370 µatm), medium (1800 µatm) and high (4200 µatm), from March to May 2010. Increased otolith growth was observed in 7 to 46 d post hatch (dph) cod larvae at elevated pCO2 concentrations. The sagittae and lapilli were usually largest in the high pCO2 treatment, followed by the medium and control treatments. The greatest difference in mean otolith surface area (normalized to fish length) was for sagittae at 11 dph, with the medium and high treatments being 46 and 43% larger than the control group, respectively. There was no significant pCO2 effect on the shape of the otoliths, nor were there any trends in the fluctuating asymmetry, defined as the difference between the right and left sides, in relation to the increase in otolith growth from elevated pCO2.
Abstract:
Carbon dioxide and oxygen fluxes were measured in 0.2 m2 enclosures placed at the water-sediment interface in the SW lagoon of New Caledonia. Experiments, performed at several stations in a wide range of environments, were carried out both in darkness, to estimate respiration, and at ambient light, to assess the effects of primary production. The community respiratory quotient (CRQ = CO2 production rate/O2 consumption rate) and the community photosynthetic quotient (CPQ = gross O2 production rate/gross CO2 consumption rate) were calculated by functional regressions. The CRQ value, calculated from 61 incubations, was 1.14 (S.E. 0.05) and the CPQ value, obtained from 18 incubations, was 1.03 (S.E. 0.08). The linearity of the relationship between the O2 and the CO2 fluxes suggests that these values are representative for the whole lagoon.
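The CRQ above is the slope of CO2 production against O2 consumption obtained by functional regression. A common functional estimator is the geometric-mean (Model II) regression, whose slope is sd(y)/sd(x) with the sign of the covariance; the sketch below assumes that estimator, with hypothetical flux values (not the paper's data).

```python
# Sketch of a CRQ estimate via geometric-mean (Model II) functional
# regression. The flux values are hypothetical, in arbitrary units.
from statistics import mean, stdev

# Paired dark-incubation fluxes (hypothetical):
o2_consumption = [10.0, 14.5, 8.2, 12.1, 16.3, 9.7]
co2_production = [11.5, 16.0, 9.5, 13.8, 18.9, 11.0]

def functional_slope(x, y):
    """Geometric-mean regression slope: sd(y)/sd(x), signed by the covariance."""
    mx, my = mean(x), mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sign = 1 if cov >= 0 else -1
    return sign * stdev(y) / stdev(x)

crq = functional_slope(o2_consumption, co2_production)
print(f"CRQ = {crq:.2f}")  # slope of CO2 production vs. O2 consumption
```

Unlike ordinary least squares, this estimator treats both flux variables as subject to error, which is why Model II regressions are preferred for flux-flux relationships.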
Seawater carbonate chemistry and calcification during an experiment with a coral Porites lutea, 2004
Abstract:
Using living corals collected from Okinawan coral reefs, laboratory experiments were performed to investigate the relationship between coral calcification and the aragonite saturation state (Ω) of seawater at 25 °C. The calcification rate of a massive coral Porites lutea cultured in a beaker showed a linear increase with increasing Ωaragonite values (1.08-7.77) of seawater. The increasing trend of calcification rate (c) with Ω is expressed by the equation c = aΩ + b (a, b: constants). When Ω was larger than ~4, the coral samples calcified during nighttime, indicating evidence of dark calcification. This study strongly suggests that calcification of Porites lutea depends on the Ω of ambient seawater. A decrease in the saturation state of seawater due to increased pCO2 may decrease the reef-building capacity of corals by reducing their calcification rate.
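The linear relation c = aΩ + b above can be fitted by ordinary least squares; the sketch below does so with invented (Ω, c) pairs spanning roughly the reported Ω range, not the study's measurements.

```python
# Minimal least-squares fit of the linear calcification relation
# c = a*Omega + b described above. The data points are hypothetical,
# chosen only to span the reported saturation-state range.
from statistics import mean

omega = [1.1, 2.0, 3.2, 4.5, 5.9, 7.8]  # aragonite saturation state
calc = [0.4, 1.1, 2.0, 3.1, 4.0, 5.5]   # calcification rate (arbitrary units)

mx, my = mean(omega), mean(calc)
a = sum((x - mx) * (y - my) for x, y in zip(omega, calc)) / \
    sum((x - mx) ** 2 for x in omega)
b = my - a * mx
print(f"c = {a:.2f}*Omega + {b:.2f}")
```

A positive fitted slope a is what the study reports: calcification rises linearly as Ω increases.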
Abstract:
BACKGROUND: Proteins belonging to the serine protease inhibitor (serpin) superfamily play essential physiological roles in many organisms. In pathogens, serpins are thought to have evolved specifically to limit host immune responses by interfering with host immune-stimulatory signals. Serpins are less well characterised in parasitic helminths, although some are thought to be involved in mechanisms associated with host immune modulation. In this study, we cloned and partially characterised a secretory serpin from Schistosoma japonicum termed SjB6; these findings provide the basis for possible functional roles.
METHODS: The SjB6 gene was identified through database mining of our previously published microarray data and cloned, and detailed sequence and structural analyses and comparative modelling were carried out using various bioinformatics and proteomics tools. The gene transcription profile was determined by real-time PCR, and the expression of the native protein was determined by immunoblotting. An immunological profile of the recombinant protein produced in insect cells was determined by ELISA.
RESULTS: SjB6 contains an open reading frame of 1160 base pairs that encodes a protein of 387 amino acid residues. Detailed sequence analysis, comparative modelling and structural-based alignment revealed that SjB6 contains the essential structural motifs and consensus secondary structures typical of inhibitory serpins. The presence of an N-terminal signal sequence indicated that SjB6 is a secretory protein. Real-time data indicated that SjB6 is expressed exclusively in the intra-mammalian stage of the parasite life cycle with its highest expression levels in the egg stage (p < 0.0001). The native protein is approximately 60 kDa in size and recombinant SjB6 (rSjB6) was recognised strongly by sera from rats experimentally infected with S. japonicum.
CONCLUSIONS: The significantly high expression of SjB6 in schistosome eggs, when compared to other life cycle stages, suggests a possible association with disease pathology, while the strong reactivity of sera from experimentally infected rats against rSjB6 suggests that native SjB6 is released into host tissue and induces an immune response. This study presents a comprehensive demonstration of sequence and structural-based analysis of a secretory serpin from a trematode and suggests SjB6 may be associated with important functional roles in S. japonicum, particularly in parasite modulation of the host microenvironment.