Abstract:
One of the biggest challenges facing humanity is global warming and, consequently, climate change. Even though public awareness and investment in renewable energies have been increasing in numerous countries, fossil fuels are, and will continue to be in the near future, the main source of energy. Carbon capture and storage (CCS) is regarded as a serious measure to mitigate CO2 concentration. Briefly, CCS consists of capturing CO2 from the atmosphere or from stationary emission sources and transporting and storing it via mineral carbonation, in oceans or in geological media. The latter is referred to as carbon capture and geological storage (CCGS) and is considered the most promising of these solutions. It generally comprises a storage formation (e.g. a depleted oil reservoir or a deep saline aquifer) and a sealing formation (commonly termed the caprock in the oil industry). The present study concerns the injection of CO2 into deep aquifers; regardless of injection conditions, temperature gradients between the carbon dioxide and the storage formation are likely to occur. Should the CO2 temperature be lower than that of the storage formation, a contractive behaviour of the reservoir and caprock is expected. This can result in the opening of new paths or the re-opening of fractures, favouring leakage and compromising the CCGS project. During CO2 injection, coupled thermo-hydro-mechanical phenomena occur which, owing to their complexity, hamper the assessment of each one's relative influence. For this purpose, several analyses were carried out to evaluate these influences, focusing on the thermal contractive behaviour. It was concluded that, depending on the mechanical and thermal properties of the aquifer-seal pair, the sealing caprock can undergo significant decreases in effective stress.
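As a rough illustration of the thermal effect described above, the following minimal Python sketch estimates the horizontal stress change in a laterally constrained caprock cooled by injected CO2, using the standard linear thermoelastic relation delta_sigma = E * alpha * delta_T / (1 - nu); the material parameters and temperature drop are assumed values, not results from the thesis.

    # Thermoelastic sketch with assumed parameters (not from the thesis):
    # horizontal stress change of a laterally constrained caprock cooled by delta_T.
    E = 20e9          # Young's modulus of the caprock [Pa] (assumed)
    nu = 0.25         # Poisson's ratio [-] (assumed)
    alpha_T = 1.0e-5  # linear thermal expansion coefficient [1/K] (assumed)
    delta_T = -30.0   # cooling relative to the in-situ temperature [K] (assumed)

    # Under zero lateral strain, a temperature change delta_T alters the
    # horizontal stress by E * alpha_T * delta_T / (1 - nu).
    delta_sigma_h = E * alpha_T * delta_T / (1.0 - nu)  # [Pa]; negative means a stress drop

    print(f"Horizontal stress change due to cooling: {delta_sigma_h / 1e6:.1f} MPa")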
Abstract:
This thesis applies real options analysis to the valuation of an offshore oil exploration project, taking into consideration the several options typically faced by the management team of such projects. The real options process is developed under technical and price uncertainties, where a mean-reverting stochastic process is considered the most adequate description of the movement of the oil price through time. The valuation is carried out for two case scenarios: the first is a simplified approach intended to develop intuition for the concepts used, and the second is a more complete case that is solved using both binomial and trinomial processes to describe the oil price movement. The real options methodology proved capable of assessing and valuing the project's options and of overcoming the limited treatment of flexibility in common capital budgeting methodologies. The added value of applying real options is evident, but so is the method's increased complexity, which adversely influences its widespread implementation.
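As a hedged illustration of the price model mentioned above, the sketch below simulates one path of a one-factor mean-reverting process (Ornstein-Uhlenbeck on the log price) for the oil price; all parameter values are assumptions chosen only to show the mechanics, not the thesis's estimates.

    import numpy as np

    # Mean-reverting oil price sketch (assumed parameters, illustrative only).
    np.random.seed(0)
    S0 = 60.0      # current oil price [USD/bbl] (assumed)
    S_bar = 70.0   # long-run equilibrium price (assumed)
    eta = 0.3      # mean-reversion speed [1/year] (assumed)
    sigma = 0.25   # volatility (assumed)
    T, steps = 10.0, 120
    dt = T / steps

    x = np.log(S0)
    path = [S0]
    for _ in range(steps):
        # OU dynamics on x = ln S: dx = eta * (ln S_bar - x) dt + sigma dW
        x += eta * (np.log(S_bar) - x) * dt + sigma * np.sqrt(dt) * np.random.normal()
        path.append(np.exp(x))

    print(f"Simulated price after {T:.0f} years: {path[-1]:.2f} USD/bbl")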
Abstract:
Work-related musculoskeletal disorders (WMSD) have become one of the biggest health problems in the workplace and one of the main concerns of ergonomics; despite all the technical improvements, manual handling is still an important risk factor for WMSD. The current study was performed with the main objective of conducting an ergonomic analysis of a workplace in a food distribution company where products are packed onto pallets, an activity also called picking. The aim of the study is to identify whether the tasks performed by operators present any risk of WMSD and, if so, to suggest proposals for minimizing the associated effort. The ergonomic risk assessment methodologies applied initially were the Risk Reckoner and the Manual Handling Assessment Chart (MAC). Subsequently, in order to complement the analysis performed with these two methods and to allow an assessment of two important risk factors associated with this activity (work postures and load handling), two additional methodologies were selected: the Revised NIOSH Lifting Equation and the Rapid Entire Body Assessment (REBA). In all the approaches, palletizing at the lower levels was identified as the task that most penalizes workers with regard to the risk of developing WMSD. All methodologies identified risk levels that require an immediate or short-term ergonomic intervention to ensure the safety and health of the workers performing this activity. Implementing measures designed to eliminate or minimize the risk may involve allocating significant human and material resources, which it is increasingly necessary to manage efficiently. Taking into account the complexity and variability of the tasks involved, it is recommended that such a decision be preceded by a new study using more accurate risk assessment methodologies, such as those based on monitoring tools.
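For reference, the sketch below implements the Revised NIOSH Lifting Equation cited above in its metric form; the task geometry, the frequency and coupling multipliers (FM, CM) and the handled load are assumed values used purely for illustration.

    # Revised NIOSH Lifting Equation, metric form (H, V, D in cm, A in degrees).
    # FM and CM come from the published NIOSH tables; assumed values are used here.
    def rwl(H, V, D, A, FM, CM, LC=23.0):
        """Recommended Weight Limit [kg]."""
        HM = 25.0 / max(H, 25.0)          # horizontal multiplier
        VM = 1.0 - 0.003 * abs(V - 75.0)  # vertical multiplier
        DM = 0.82 + 4.5 / max(D, 25.0)    # distance multiplier
        AM = 1.0 - 0.0032 * A             # asymmetry multiplier
        return LC * HM * VM * DM * AM * FM * CM

    # Example: palletizing at a low level (all task values are assumptions)
    load = 15.0  # handled load [kg]
    limit = rwl(H=40, V=20, D=100, A=30, FM=0.85, CM=0.90)
    print(f"RWL = {limit:.1f} kg, Lifting Index = {load / limit:.2f}")

A Lifting Index above 1 flags lifting conditions that place some workers at increased risk, which is consistent with the penalizing low-level palletizing tasks identified above.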
Abstract:
Radiometric changes observed in multi-temporal optical satellite images play an important role in efforts to characterize selective-logging areas. The aim of this study was to analyze the multi-temporal behavior of spectral-mixture responses in satellite images of simulated selective-logging areas in the Amazon forest, considering red/near-infrared spectral relationships. Forest edges were used to infer the selective-logging infrastructure, using differently oriented edges in the transition between forest and deforested areas in the satellite images. TM/Landsat-5 images acquired on three dates with different solar-illumination geometries were used in the analysis. The method assumed that the radiometric responses of forest affected by selective logging and of forest edges in contact with recent clear-cuts are related. The spatial-frequency attributes of the red/near-infrared bands for edge areas were analyzed. Analysis of dispersion diagrams showed two groups of pixels that represent selective-logging areas. The size and radiometric-distance attributes representing these two groups were related to the solar-elevation angle. The results suggest that detection of timber exploitation areas is limited by the complexity of the selective-logging radiometric response. Thus, the accuracy of detecting selective logging can be influenced by the solar-elevation angle at the time of image acquisition. We conclude that images with lower solar-elevation angles are less reliable for delineation of selective-logging areas.
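As a simple illustration of the red/near-infrared relationships this kind of analysis relies on, the sketch below computes a common red/NIR ratio (NDVI) and a naive radiometric distance in the red/NIR plane for a few pixels; the reflectance values and the distance definition are assumptions, not data or methods from the study.

    import numpy as np

    # Toy red and NIR reflectances (e.g., TM/Landsat-5 bands 3 and 4); assumed values.
    red = np.array([0.04, 0.05, 0.10, 0.12, 0.06])
    nir = np.array([0.35, 0.33, 0.22, 0.20, 0.30])

    ndvi = (nir - red) / (nir + red)             # standard red/NIR ratio
    # Naive "radiometric distance" from a reference pixel (the first one here)
    dist = np.hypot(red - red[0], nir - nir[0])

    for i, (v, d) in enumerate(zip(ndvi, dist)):
        print(f"pixel {i}: NDVI = {v:.2f}, distance = {d:.3f}")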
Abstract:
This study compared tidepool fish assemblages within and among habitats at Iparana and Pecém beaches, State of Ceará, Northeast Brazil, using visual census techniques. A total of 8,914 fishes, representing 25 families and 43 species, were recorded. The most abundant taxon was Sparisoma spp., followed by Haemulon parra (Desmarest, 1823), Acanthurus chirurgus (Bloch, 1787) and Abudefduf saxatilis (Linnaeus, 1758). Haemulidae was the most abundant family in number of individuals, followed by Scaridae, Acanthuridae and Pomacentridae. Within- and between-site differences in species assemblages probably reflected environmental discontinuities and more localized features, such as pool-isolation episodes or environmental complexity, acting either in isolation or interactively. The Iparana locality was probably subject to greater fishing pressure and tourism than Pecém, a potential cause of the lower fish abundance and biodiversity observed there. We conclude that tidepool ichthyofauna may be quite variable between and within reef sites. Thus, observations taken from, or damage caused to, one area may not be generalized to, or mitigated by the protection of, adjacent sites.
Abstract:
In general terms, key sectors analysis aims at identifying the role, or impact, that the existence of a productive sector has in the economy. Quite a few measures, indicators and methodologies of varied complexity have been proposed in the literature, from multiplier sums to extraction methods, but not without debate about their properties and their information content. All of them, to our knowledge, focus exclusively on the interdependence effects that result from the input-output structure of the economy. By so doing, the simple input-output approach misses critical links beyond the interindustry ones. A productive sector's role is that of producing, but also that of generating and distributing income among primary factors as a result of production. Thus, when measuring a sector's role, the income-generating process cannot and should not be omitted if we want to better elucidate the sector's economic role. A simple way to make the missing income link explicit is to use the SAM (Social Accounting Matrix).
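To make the interindustry part of the argument concrete, the following minimal sketch computes classic Leontief output multipliers for an assumed three-sector economy; a SAM-based analysis would extend the accounting to the income-generation and distribution accounts, which is omitted here.

    import numpy as np

    # Assumed interindustry flows Z and gross outputs x for three sectors.
    Z = np.array([[20., 30., 10.],
                  [15., 10., 25.],
                  [10., 20.,  5.]])
    x = np.array([100., 120., 90.])

    A = Z / x                          # technical coefficients a_ij = z_ij / x_j
    L = np.linalg.inv(np.eye(3) - A)   # Leontief inverse (I - A)^-1

    # Column sums of L: the output multipliers used in simple key-sector analysis.
    print("Output multipliers:", np.round(L.sum(axis=0), 3))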
Abstract:
Neuroblastoma (NB) is a neural crest-derived childhood tumor characterized by a remarkable phenotypic diversity, ranging from spontaneous regression to fatal metastatic disease. Although the cancer stem cell (CSC) model provides a trail to characterize the cells responsible for tumor onset, the NB tumor-initiating cell (TIC) has not been identified. In this study, the relevance of the CSC model in NB was investigated by taking advantage of typical functional stem cell characteristics. A predictive association was established between self-renewal, as assessed by serial sphere formation, and clinical aggressiveness in primary tumors. Moreover, cell subsets gradually selected during serial sphere culture harbored increased in vivo tumorigenicity, highlighted only in an orthotopic microenvironment. A microarray time-course analysis of serial sphere passages from metastatic cells allowed us to specifically "profile" the NB stem cell-like phenotype and to identify CD133, ABC transporters, and WNT and NOTCH genes as sphere markers. On the basis of combined sphere-marker expression, at least two distinct tumorigenic cell subpopulations were identified, which were also shown to preexist in primary NB. However, sphere marker-mediated cell sorting of the parental tumor failed to recapitulate the TIC phenotype in the orthotopic model, highlighting the complexity of the CSC model. Our data support the NB stem-like cells as a dynamic and heterogeneous cell population strongly dependent on microenvironmental signals and add novel candidate genes as potential therapeutic targets in the control of high-risk NB.
Abstract:
This paper critically examines a number of issues relating to the measurement of tax complexity. It starts with an analysis of the concept of tax complexity, distinguishing tax design complexity from operational complexity. It considers the consequences and costs of complexity, and then examines the rationale for measuring complexity. Finally, it applies the analysis to an examination of an index of complexity developed by the UK Office of Tax Simplification (OTS).
Abstract:
OBJECTIVES: To document biopsychosocial profiles of patients with rheumatoid arthritis (RA) by means of the INTERMED and to correlate the results with conventional methods of disease assessment and health care utilization. METHODS: Patients with RA (n = 75) were evaluated with the INTERMED, an instrument for assessing case complexity and care needs. Based on their INTERMED scores, patients were compared with regard to severity of illness, functional status, and health care utilization. RESULTS: In cluster analysis, a 2-cluster solution emerged, with about half of the patients characterized as complex. Complex patients scoring especially high in the psychosocial domain of the INTERMED were disabled significantly more often and took more psychotropic drugs. Although the 2 patient groups did not differ in severity of illness and functional status, complex patients rated their illness as more severe on subjective measures and on most items of the Medical Outcomes Study Short Form 36. Complex patients showed increased health care utilization despite a similar biologic profile. CONCLUSIONS: The INTERMED identified complex patients with increased health care utilization, provided meaningful and comprehensive patient information, and proved to be easy to implement and advantageous compared with conventional methods of disease assessment. Intervention studies will have to demonstrate whether management strategies based on INTERMED profiles can improve treatment response and outcome of complex patients.
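As an illustration of the kind of 2-cluster analysis reported above, the sketch below runs k-means with two clusters on synthetic INTERMED-like domain scores; the number of patients, the domain structure and the score values are assumptions, not the study's data.

    import numpy as np
    from sklearn.cluster import KMeans

    # Synthetic scores for four INTERMED domains (biological, psychological,
    # social, health care); two assumed patient profiles.
    rng = np.random.default_rng(0)
    noncomplex = rng.normal(loc=3, scale=1.5, size=(40, 4))
    complex_cases = rng.normal(loc=8, scale=1.5, size=(35, 4))
    scores = np.vstack([noncomplex, complex_cases])

    labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(scores)
    print("Patients per cluster:", np.bincount(labels))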
Abstract:
Continuing developments in science and technology mean that the amount of information forensic scientists are able to provide for criminal investigations is ever increasing. The commensurate increase in complexity creates difficulties for scientists and lawyers with regard to evaluation and interpretation, notably with respect to issues of inference and decision. Probability theory, implemented through graphical methods, and specifically Bayesian networks, provides powerful methods to deal with this complexity. Extensions of these methods to elements of decision theory provide further support and assistance to the judicial system. Bayesian Networks for Probabilistic Inference and Decision Analysis in Forensic Science provides a unique and comprehensive introduction to the use of Bayesian decision networks for the evaluation and interpretation of scientific findings in forensic science, and for the support of decision-makers in their scientific and legal tasks. The book includes self-contained introductions to probability and decision theory; develops the characteristics of Bayesian networks, object-oriented Bayesian networks and their extension to decision models; features implementation of the methodology with reference to commercial and academically available software; presents standard networks and their extensions that can be easily implemented and that can assist in the reader's own analysis of real cases; provides a technique for structuring problems and organizing data based on methods and principles of scientific reasoning; contains a method for the construction of coherent and defensible arguments for the analysis and evaluation of scientific findings and for decisions based on them; is written in a lucid style, suitable for forensic scientists and lawyers with minimal mathematical background; and includes a foreword by Ian Evett. The clear and accessible style of this second edition makes this book ideal for all forensic scientists, applied statisticians and graduate students wishing to evaluate forensic findings from the perspective of probability and decision analysis. It will also appeal to lawyers and other scientists and professionals interested in the evaluation and interpretation of forensic findings, including decision making based on scientific information.
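The core updating step that Bayesian networks operationalize can be shown with a minimal sketch of odds-form Bayes' theorem applied to a pair of source-level propositions; the prior odds and conditional probabilities below are purely illustrative assumptions.

    # Odds-form Bayes' theorem: posterior odds = likelihood ratio * prior odds.
    prior_odds = 1 / 1000    # prior odds for the prosecution proposition (assumed)
    p_E_given_Hp = 0.99      # P(findings | prosecution proposition) (assumed)
    p_E_given_Hd = 0.001     # P(findings | defence proposition) (assumed)

    likelihood_ratio = p_E_given_Hp / p_E_given_Hd
    posterior_odds = likelihood_ratio * prior_odds
    posterior_prob = posterior_odds / (1 + posterior_odds)

    print(f"LR = {likelihood_ratio:.0f}, posterior probability = {posterior_prob:.3f}")

A full Bayesian network repeats this kind of propagation over many linked variables, which is what makes the graphical formalism useful for complex bodies of evidence.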
Abstract:
The study tested three analytic tools applied in SLA research (the T-unit, the AS-unit and the Idea-unit) against FL learners' monologic oral data. The objective was to analyse their effectiveness for the assessment of the complexity of learners' academic production in English. The data were learners' individual productions gathered during the implementation of a CLIL teaching sequence on Natural Sciences in a Catalan state secondary school. The analysis showed that only the AS-unit was easily applicable and highly effective in segmenting the data and obtaining complexity measures.
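As a minimal illustration of how complexity measures can be taken once the data have been segmented, the sketch below computes mean AS-unit length and clauses per AS-unit for a few hand-annotated units; the example units and clause counts are assumptions, not the study's data.

    # Toy AS-units with assumed clause counts (hand-annotated for illustration).
    as_units = [
        {"text": "plants need light and water to grow", "clauses": 2},
        {"text": "if you heat the water it evaporates", "clauses": 2},
        {"text": "this is the carbon cycle", "clauses": 1},
    ]

    words_per_unit = [len(u["text"].split()) for u in as_units]
    mean_length = sum(words_per_unit) / len(as_units)                  # words per AS-unit
    clauses_per_unit = sum(u["clauses"] for u in as_units) / len(as_units)

    print(f"Mean AS-unit length: {mean_length:.2f} words")
    print(f"Clauses per AS-unit: {clauses_per_unit:.2f}")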
Abstract:
Independent regulatory agencies are one of the main institutional features of the 'rising regulatory state' in Western Europe. Governments are increasingly willing to abandon their regulatory competencies and to delegate them to specialized institutions that are at least partially beyond their control. This article examines the empirical consistency of one particular explanation of this phenomenon, namely the credibility hypothesis, claiming that governments delegate powers so as to enhance the credibility of their policies. Three observable implications are derived from the general hypothesis, linking credibility and delegation to veto players, complexity and interdependence. An independence index is developed to measure agency independence, which is then used in a multivariate analysis where the impact of credibility concerns on delegation is tested. The analysis relies on an original data set comprising independence scores for thirty-three regulators. Results show that the credibility hypothesis can explain a good deal of the variation in delegation. The economic nature of regulation is a strong determinant of agency independence, but is mediated by national institutions in the form of veto players.
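As a hedged sketch of how an independence index and the subsequent multivariate test might look in code, the example below averages normalized institutional indicators into an index per agency and regresses it on assumed explanatory variables by ordinary least squares; the indicator coding, weighting and data are illustrative assumptions, not the article's actual index.

    import numpy as np

    # Rows: agencies; columns: institutional indicators coded in [0, 1] (assumed).
    indicators = np.array([
        [1.0, 0.75, 0.50, 1.0],
        [0.5, 0.25, 0.50, 0.0],
        [1.0, 1.00, 0.75, 0.5],
        [0.0, 0.25, 0.25, 0.5],
    ])
    independence = indicators.mean(axis=1)   # unweighted index per agency

    # Assumed explanatory variables: intercept, veto players, economic regulation dummy.
    X = np.column_stack([
        np.ones(4),
        [2, 4, 1, 5],
        [1, 0, 1, 0],
    ])
    beta, *_ = np.linalg.lstsq(X, independence, rcond=None)
    print("OLS coefficients (intercept, veto players, economic):", np.round(beta, 3))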
Abstract:
SUMMARY: Large sets of data, such as expression profiles from many samples, require analytic tools to reduce their complexity. The Iterative Signature Algorithm (ISA) is a biclustering algorithm. It was designed to decompose a large set of data into so-called 'modules'. In the context of gene expression data, these modules consist of subsets of genes that exhibit a coherent expression profile only over a subset of microarray experiments. Genes and arrays may be attributed to multiple modules and the level of required coherence can be varied resulting in different 'resolutions' of the modular mapping. In this short note, we introduce two BioConductor software packages written in GNU R: The isa2 package includes an optimized implementation of the ISA and the eisa package provides a convenient interface to run the ISA, visualize its output and put the biclusters into biological context. Potential users of these packages are all R and BioConductor users dealing with tabular (e.g. gene expression) data. AVAILABILITY: http://www.unil.ch/cbg/ISA CONTACT: sven.bergmann@unil.ch
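The alternating-threshold idea behind the ISA can be illustrated with a short, simplified Python sketch on a toy expression matrix; this is not the isa2 implementation, and the matrix size, the planted module, the seed set and the thresholds are all assumptions.

    import numpy as np

    rng = np.random.default_rng(1)
    E = rng.normal(size=(200, 30))   # toy genes x arrays matrix
    E[:20, :8] += 3.0                # planted module: 20 genes co-expressed in 8 arrays

    def zscore(v):
        return (v - v.mean()) / v.std()

    genes = np.zeros(200, dtype=bool)
    genes[5:15] = True               # candidate seed set (ISA is normally run from many seeds)

    for _ in range(20):
        array_score = zscore(E[genes].mean(axis=0))      # score arrays over the gene set
        arrays = array_score > 1.0                       # array threshold (assumed)
        gene_score = zscore(E[:, arrays].mean(axis=1))   # score genes over selected arrays
        genes = gene_score > 2.0                         # gene threshold (assumed)

    print(f"Module found: {genes.sum()} genes x {arrays.sum()} arrays")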
Abstract:
Clonally complex infections by Mycobacterium tuberculosis are increasingly accepted. Studies of their extent in epidemiological settings where the infective pressure is not high are scarce. Our study systematically searched for clonally complex infections (mixed infections by more than one strain and the simultaneous presence of clonal variants) by applying mycobacterial interspersed repetitive-unit (MIRU)-variable-number tandem-repeat (VNTR) analysis to M. tuberculosis isolates from two population-based samples of respiratory (703 cases) and respiratory-extrapulmonary (R+E) tuberculosis (TB) cases (71 cases) in a context of moderate TB incidence. Clonally complex infections were found in 11 (1.6%) of the respiratory TB cases and in 10 (14.1%) of those with R+E TB. Among the 21 cases with clonally complex TB, 9 were infected by 2 independent strains and the remaining 12 showed the simultaneous presence of 2 to 3 clonal variants. For the 10 R+E TB cases with clonally complex infections, compartmentalization (different compositions of strains/clonal variants in independent infected sites) was found in 9. All the strains/clonal variants were also genotyped by IS6110-based restriction fragment length polymorphism analysis, which split two MIRU-defined clonal variants, although in general it showed a lower discriminatory power in identifying the clonal heterogeneity revealed by MIRU-VNTR analysis. The comparative analysis of IS6110 insertion sites between coinfecting clonal variants showed differences in the genes coding for a cutinase, a PPE-family protein, and two conserved hypothetical proteins. Diagnostic delay, previous TB, risk of overexposure, and the clustered/orphan status of the involved strains were analyzed to propose possible explanations for the cases with clonally complex infections. Our study characterizes in detail all the clonally complex infections by M. tuberculosis found in a systematic survey and shows that these phenomena can occur to a greater extent than expected, even in an unselected population-based sample lacking high infective pressure.