901 results for probabilistic roadmap
Abstract:
Investigation of large, destructive earthquakes is challenged by their infrequent occurrence and the remote nature of geophysical observations. This thesis sheds light on the source processes of large earthquakes from two perspectives: robust and quantitative observational constraints through Bayesian inference for earthquake source models, and physical insights on the interconnections of seismic and aseismic fault behavior from elastodynamic modeling of earthquake ruptures and aseismic processes.
To constrain the shallow deformation during megathrust events, we develop semi-analytical and numerical Bayesian approaches to explore the maximum resolution of the tsunami data, with a focus on incorporating the uncertainty in the forward modeling. These methodologies are then applied to invert for the coseismic seafloor displacement field in the 2011 Mw 9.0 Tohoku-Oki earthquake using near-field tsunami waveforms and for the coseismic fault slip models in the 2010 Mw 8.8 Maule earthquake with complementary tsunami and geodetic observations. From posterior estimates of model parameters and their uncertainties, we are able to quantitatively constrain the near-trench profiles of seafloor displacement and fault slip. Similar characteristic patterns emerge during both events, featuring the peak of uplift near the edge of the accretionary wedge with a decay toward the trench axis, with implications for fault failure and tsunamigenic mechanisms of megathrust earthquakes.
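A minimal sketch of the kind of linear-Gaussian Bayesian inversion that underlies such posterior estimates, assuming a linear forward operator mapping uplift parameters to tsunami waveform samples; the operator, noise level, and prior below are synthetic placeholders, not the thesis's actual Green's functions or data.

```python
import numpy as np

# Synthetic linear forward problem: d = G m + noise
rng = np.random.default_rng(0)
n_data, n_model = 60, 12                      # waveform samples, uplift parameters
G = rng.normal(size=(n_data, n_model))        # placeholder forward operator
m_true = rng.normal(size=n_model)             # "true" seafloor uplift profile
sigma_d, sigma_m = 0.1, 1.0                   # data-noise and prior std. deviations
d = G @ m_true + sigma_d * rng.normal(size=n_data)

# Gaussian prior N(0, sigma_m^2 I) and likelihood N(Gm, sigma_d^2 I)
# give a closed-form Gaussian posterior over the model parameters.
Cd_inv = np.eye(n_data) / sigma_d**2
Cm_inv = np.eye(n_model) / sigma_m**2
post_cov = np.linalg.inv(G.T @ Cd_inv @ G + Cm_inv)
post_mean = post_cov @ G.T @ Cd_inv @ d

# Posterior standard deviations quantify how well each parameter is resolved.
post_std = np.sqrt(np.diag(post_cov))
print(post_mean.round(2), post_std.round(2))
```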
To understand the behavior of earthquakes at the base of the seismogenic zone on continental strike-slip faults, we simulate the interactions of dynamic earthquake rupture, aseismic slip, and heterogeneity in rate-and-state fault models coupled with shear heating. Our study explains the long-standing enigma of seismic quiescence on major fault segments known to have hosted large earthquakes by deeper penetration of large earthquakes below the seismogenic zone, where mature faults have well-localized creeping extensions. This conclusion is supported by the simulated relationship between seismicity and large earthquakes as well as by observations from recent large events. We also use the modeling to connect the geodetic observables of fault locking with the behavior of seismicity in numerical models, investigating how a combination of interseismic geodetic and seismological estimates could constrain the locked-creeping transition of faults and potentially their co- and post-seismic behavior.
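As background, a short numerical sketch of the standard Dieterich-Ruina rate-and-state friction law with the aging law for state evolution, the class of friction law such fault models build on; parameter values are purely illustrative, and the thesis's simulations additionally couple this with shear heating and elastodynamics.

```python
import numpy as np

# Standard Dieterich-Ruina rate-and-state friction with the aging law:
#   mu = mu0 + a*ln(V/V0) + b*ln(V0*theta/Dc),   d(theta)/dt = 1 - V*theta/Dc
mu0, a, b, Dc, V0 = 0.6, 0.010, 0.015, 1e-4, 1e-6   # illustrative values (SI units)

dt, n_steps = 0.05, 40_000                # 2000 s of imposed slip history
V = np.full(n_steps, V0)
V[n_steps // 2:] = 10 * V0                # impose a tenfold velocity step
theta = Dc / V0                           # start at steady state
mu = np.empty(n_steps)
for i in range(n_steps):
    mu[i] = mu0 + a * np.log(V[i] / V0) + b * np.log(V0 * theta / Dc)
    theta += dt * (1.0 - V[i] * theta / Dc)   # aging-law state evolution

# With b > a the fault is velocity-weakening: friction jumps up at the step
# (direct effect) and then evolves to a lower steady-state value.
print(round(mu[0], 4), round(mu.max(), 4), round(mu[-1], 4))
```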
Abstract:
For climate risk management, cumulative distribution functions (CDFs) are an important source of information. They are ideally suited to compare probabilistic forecasts of primary (e.g. rainfall) or secondary data (e.g. crop yields). Summarised as CDFs, such forecasts allow an easy quantitative assessment of possible, alternative actions. Although the degree of uncertainty associated with CDF estimation could influence decisions, such information is rarely provided. Hence, we propose Cox-type regression models (CRMs) as a statistical framework for making inferences on CDFs in climate science. CRMs were designed for modelling probability distributions rather than just mean or median values. This makes the approach appealing for risk assessments where probabilities of extremes are often more informative than central tendency measures. CRMs are semi-parametric approaches originally designed for modelling risks arising from time-to-event data. Here we extend this original concept beyond time-dependent measures to other variables of interest. We also provide tools for estimating CDFs and surrounding uncertainty envelopes from empirical data. These statistical techniques intrinsically account for non-stationarities in time series that might be the result of climate change. This feature makes CRMs attractive candidates to investigate the feasibility of developing rigorous global circulation model (GCM)-CRM interfaces for provision of user-relevant forecasts. To demonstrate the applicability of CRMs, we present two examples for El Niño/Southern Oscillation (ENSO)-based forecasts: the onset date of the wet season (Cairns, Australia) and total wet season rainfall (Quixeramobim, Brazil). This study emphasises the methodological aspects of CRMs rather than discussing merits or limitations of the ENSO-based predictors.
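A minimal illustration of turning a fitted Cox model into a forecast CDF, using the off-the-shelf lifelines proportional-hazards implementation rather than the paper's extended CRMs; the wet-season-onset data and the ENSO covariate below are synthetic placeholders.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(1)
n = 300
enso = rng.normal(size=n)                                     # synthetic ENSO index
onset_day = rng.exponential(scale=60 * np.exp(-0.3 * enso))   # days after season start
df = pd.DataFrame({"onset_day": onset_day,
                   "observed": 1,            # all onset dates observed (no censoring)
                   "enso": enso})

cph = CoxPHFitter()
cph.fit(df, duration_col="onset_day", event_col="observed")

# Forecast CDF of the onset date for a strong El Nino year (enso = +1.5):
surv = cph.predict_survival_function(pd.DataFrame({"enso": [1.5]}))
cdf = 1.0 - surv                 # P(onset has occurred by day t)
print(cdf.iloc[::30])            # print every 30th time point
```

An uncertainty envelope around such a forecast CDF could then be approximated, for instance, by refitting the model on bootstrap resamples of df.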
Abstract:
Decades of costly failures in translating drug candidates from preclinical disease models to human therapeutic use warrant reconsideration of the priority placed on animal models in biomedical research. Following an international workshop attended by experts from academia, government institutions, research funding bodies, and the corporate and nongovernmental organisation (NGO) sectors, in this consensus report, we analyse, as case studies, five disease areas with major unmet needs for new treatments. In view of the scientifically driven transition towards a human pathway-based paradigm in toxicology, a similar paradigm shift appears to be justified in biomedical research. There is a pressing need for an approach that strategically implements advanced, human biology-based models and tools to understand disease pathways at multiple biological scales. We present recommendations to help achieve this.
Abstract:
Event extraction from texts aims to detect structured information such as what has happened, to whom, where and when. Event extraction and visualization are typically considered two different tasks. In this paper, we propose a novel approach based on probabilistic modelling to jointly extract and visualize events from tweets, where both tasks benefit from each other. We model each event as a joint distribution over named entities, a date, a location and event-related keywords. Moreover, both tweets and event instances are associated with coordinates in the visualization space. The manifold assumption that the intrinsic geometry of tweets is a low-rank, non-linear manifold within the high-dimensional space is incorporated into the learning framework through a regularization term. Experimental results show that the proposed approach can effectively deal with both event extraction and visualization, and performs remarkably better than both the state-of-the-art event extraction method and a pipeline approach for event extraction and visualization.
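A toy sketch of the "event as a joint distribution" idea, assuming independence between the entity, date, location and keyword slots given the event; the actual model, its coupling with visualization coordinates, and the manifold regularisation are not reproduced here, and all values are invented.

```python
import numpy as np

# One latent event = a distribution over each structured slot.
event = {
    "entity":   {"@UN": 0.6, "@WHO": 0.4},
    "date":     {"2014-08-01": 0.7, "2014-08-02": 0.3},
    "location": {"Geneva": 0.8, "New York": 0.2},
    "keyword":  {"summit": 0.5, "health": 0.3, "talks": 0.2},
}

def tweet_log_likelihood(tweet, event, floor=1e-6):
    """Score a tweet's extracted slot values under one event's joint distribution."""
    return sum(np.log(event[slot].get(value, floor))
               for slot, value in tweet.items())

tweet = {"entity": "@WHO", "date": "2014-08-01",
         "location": "Geneva", "keyword": "health"}
print(tweet_log_likelihood(tweet, event))
```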
Abstract:
In restructured power systems, generation and commercialization became market activities, while transmission and distribution continue as regulated monopolies. As a result, the adequacy of the transmission network should be evaluated independently of the generation system. Having introduced the constrained fuzzy power flow (CFPF) as a suitable tool to quantify the adequacy of the transmission network to satisfy 'reasonable demands for the transmission of electricity' (as stated, for instance, in European Directive 2009/72/EC), the aim is now to show how this approach can be used in conjunction with probabilistic criteria in security analysis. Classical security analysis models of power systems consider the composite system (generation plus transmission): the state of system components is usually modeled with probabilities, while loads (and generation) are modeled by crisp numbers, probability distributions or fuzzy numbers. In the case of the CFPF, the failures of transmission network components are investigated. In this framework, probabilistic methods are used to model failures of the transmission system components, and possibility models are used to deal with 'reasonable demands'. The enhanced version of the CFPF model is applied to an illustrative case.
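A hedged sketch of how probabilistic component-failure models can be combined with a possibilistic description of a 'reasonable demand', assuming a single corridor of parallel lines and a triangular membership function; this illustrates the combination of the two kinds of model, not the CFPF formulation itself, and all numbers are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(2)
line_capacity = np.array([300.0, 300.0, 200.0])   # MW, hypothetical corridor
outage_prob   = np.array([0.02, 0.02, 0.05])      # forced-outage probabilities

def demand_membership(d, low=500.0, mode=600.0, high=700.0):
    """Triangular possibility distribution for a 'reasonable demand' in MW."""
    return np.clip(np.minimum((d - low) / (mode - low),
                              (high - d) / (high - mode)), 0.0, 1.0)

# Monte Carlo over component states; for each sampled state, check which
# demand levels the surviving capacity can serve, weighted by possibility.
n_samples = 20_000
demands = np.linspace(400.0, 750.0, 36)
membership = demand_membership(demands)
adequacy = np.zeros_like(demands)
for _ in range(n_samples):
    up = rng.random(line_capacity.size) > outage_prob
    available = (line_capacity * up).sum()
    adequacy += membership * (demands <= available)
adequacy /= n_samples
print(dict(zip(demands[::7].round(0), adequacy[::7].round(3))))
```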
Abstract:
Mobile and wireless networks have long exploited mobility predictions, focused on predicting the future location of given users, to perform more efficient network resource management. In this paper, we present a new approach in which predictions are provided as a probability distribution over the likelihood of moving to a set of future locations. This approach gives wireless services a greater amount of knowledge and enables them to perform more effectively. We present a framework for the evaluation of this new type of predictor, and develop two new predictors, HEM and G-Stat. We evaluate our predictors' accuracy in predicting future cells for mobile users using two large geolocation data sets, from MDC [11], [12] and Crawdad [13]. We show that our predictors can successfully predict with an average inaccuracy as low as 2.2% in certain scenarios.
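A minimal sketch of a distribution-returning next-cell predictor of the kind evaluated above, based on simple per-cell transition counts; this is an illustrative baseline, not the HEM or G-Stat predictors described in the paper.

```python
from collections import Counter, defaultdict

class TransitionPredictor:
    """Predict a probability distribution over a user's next cell
    from that user's observed cell-to-cell transition counts."""
    def __init__(self):
        self.counts = defaultdict(Counter)   # current cell -> Counter of next cells

    def update(self, current_cell, next_cell):
        self.counts[current_cell][next_cell] += 1

    def predict(self, current_cell):
        seen = self.counts[current_cell]
        total = sum(seen.values())
        if total == 0:
            return {}                        # no history for this cell yet
        return {cell: n / total for cell, n in seen.items()}

# Tiny trace: cells visited in order by one user.
trace = ["A", "B", "A", "B", "C", "A", "B", "A"]
p = TransitionPredictor()
for cur, nxt in zip(trace, trace[1:]):
    p.update(cur, nxt)
print(p.predict("A"))    # {'B': 1.0} -> this user always moved A -> B so far
```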
Abstract:
The primary objective of this work was to establish a technology roadmap for the application of Carbon Capture, Utilisation and Sequestration (CCUS) technologies in Portugal. To this end, the largest stationary industrial sources of CO2 emissions were identified, adopting as a criterion a minimum of 1×10^5 t CO2/year and restricting the analysis to the mainland territory. Based on the information gathered and on the most recent official data (for 2013), it was estimated that the volume of industrial CO2 emissions that could be captured in Portugal corresponds to about 47% of total industrial emissions, originating from three industrial sectors: cement production, pulp production and coal-fired power plants. Most of the large industrial emission sources are located along the country's coastline, concentrated between Aveiro and Sines. Given the country's geographical constraints and, above all, the advantage of an existing natural gas pipeline network with its associated support infrastructure, it was assumed that the most favourable scenario for transporting the captured CO2 would be the creation of a dedicated CO2 pipeline transport system. As a criterion for matching the proximity of CO2 emission sources with potential sites for geological storage of the captured streams, a maximum distance of 100 km was adopted, considered adequate given the size of the national territory and the characteristics of the national industrial fabric. A review was carried out of the CO2 capture technologies available, both commercially and at advanced demonstration stages, and an exploratory analysis was performed of the suitability of these different capture methods for each of the industrial sectors previously identified as having capturable CO2 emissions. With a view to better process integration, this preliminary analysis took into account the characteristics of the gas mixtures, as well as the corresponding industrial context and the production process from which they originate. The possibilities for industrial utilisation of the CO2 captured in the country were addressed generically, since identifying real opportunities for the use of captured CO2 streams requires matching the actual CO2 needs of potential industrial users, which in turn requires prior characterisation of the properties of those streams. This is a very specific type of analysis that presupposes the mutual interest of different stakeholders: CO2 emitters, transport operators and, above all, potential CO2 users, for example as a raw material for the synthesis of compounds, as a supercritical extraction solvent in the food or pharmaceutical industries, as a pH-correction agent in effluent treatment, for biofixation through photosynthesis, or in other possible applications identified for captured CO2. The final stage of this study assessed the possibilities for geological storage of the captured CO2 and involved identifying, in the national sedimentary basins, geological formations with characteristics recognised as good indications for permanent and safe CO2 storage. The methodology recommended by international organisations was followed, applying to the national situation selection and safety criteria that are well established.
The suitability of the pre-selected geological formations for CO2 storage will have to be confirmed by additional studies complementing the existing data on their geological characteristics and, more importantly, by laboratory tests and CO2 injection trials that can provide concrete information to estimate the CO2 sequestration and retention capacity of these formations and to establish the geological storage models needed to identify and estimate, in a concrete and objective way, the risks associated with CO2 injection and storage.
Abstract:
Bayesian Belief Networks (BBNs) are emerging as valuable tools for investigating complex ecological problems. In a BBN, the important variables in a problem are identified and causal relationships are represented graphically. Underpinning this is the probabilistic framework in which variables can take on a finite range of mutually exclusive states. Associated with each variable is a conditional probability table (CPT), showing the probability of a variable attaining each of its possible states conditioned on all possible combinations of its parents. Whilst the variables (nodes) are connected, the CPT attached to each node can be quantified independently. This allows each variable to be populated with the best data available, including expert opinion, simulation results or observed data. It also allows the information to be easily updated as better data become available.
This paper reports on the process of developing a BBN to better understand the initial rapid growth phase (initiation) of a marine cyanobacterium, Lyngbya majuscula, in Moreton Bay, Queensland. Anecdotal evidence suggests that Lyngbya blooms in this region have increased in severity and extent over the past decade. Lyngbya has been associated with acute dermatitis and a range of other health problems in humans. Blooms have been linked to ecosystem degradation and have also damaged commercial and recreational fisheries. However, the causes of blooms are as yet poorly understood.
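A small hand-rolled illustration of the CPT idea, using two hypothetical parent variables for bloom initiation; the real Lyngbya network's variables and probabilities are not reproduced here.

```python
from itertools import product

# Priors on two (hypothetical) parent nodes, each with two states.
p_nutrients = {"high": 0.3, "low": 0.7}
p_light     = {"high": 0.6, "low": 0.4}

# Conditional probability table: P(initiation = yes | nutrients, light).
cpt_initiation = {
    ("high", "high"): 0.80,
    ("high", "low"):  0.40,
    ("low",  "high"): 0.25,
    ("low",  "low"):  0.05,
}

# Marginal probability of bloom initiation, by enumerating parent states.
p_bloom = sum(p_nutrients[n] * p_light[l] * cpt_initiation[(n, l)]
              for n, l in product(p_nutrients, p_light))
print(round(p_bloom, 3))
```

Because each CPT is quantified independently, a table like cpt_initiation can be filled from expert elicitation, simulation output or observations, and replaced later without restructuring the rest of the network.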
Abstract:
“SOH see significant benefit in digitising its drawings and operation and maintenance manuals. Since SOH do not currently have digital models of the Opera House structure or other components, there is an opportunity for this national case study to promote the application of Digital Facility Modelling using standardised Building Information Models (BIM).” The digital modelling element of this project examined the potential of building information models for facility management, focusing on the following areas:
• The re-usability of building information for FM purposes
• BIM as an integrated information model for facility management
• Extendibility of the BIM to cope with business-specific requirements
• Commercial facility management software using standardised building information models
• The ability to add (organisation-specific) intelligence to the model
• A roadmap for SOH to adopt BIM for FM
The project has established that BIM – building information modelling – is an appropriate and potentially beneficial technology for the storage of integrated building, maintenance and management data for SOH. Based on the attributes of a BIM, several advantages can be envisioned: consistency in the data, intelligence in the model, multiple representations, and a source of information for intelligent programs and intelligent queries. The IFC – open building exchange standard – specification provides comprehensive support for asset and facility management functions, and offers new management, collaboration and procurement relationships based on sharing of intelligent building data. The major advantages of using an open standard are: information can be read and manipulated by any compliant software; reduced user “lock-in” to proprietary solutions; third-party software can be the “best of breed” to suit the process and scope at hand; standardised BIM solutions consider the wider implications of information exchange outside the scope of any particular vendor; information can be archived as ASCII files for archival purposes; and data quality can be enhanced, as the now single source of users’ information has improved accuracy, correctness, currency, completeness and relevance. SOH's current building standards have been successfully drafted for a BIM environment and are confidently expected to be fully developed when BIM is adopted operationally by SOH. There have been remarkably few technical difficulties in converting the House's existing conventions and standards to the new model-based environment. This demonstrates that the IFC model represents world practice for building data representation and management (see Sydney Opera House – FM Exemplar Project Report Number 2005-001-C-3, Open Specification for BIM: Sydney Opera House Case Study). Availability of FM applications based on BIM is in its infancy, but focused systems are already in operation internationally and show excellent prospects for implementation at SOH. In addition to the generic benefits of standardised BIM described above, the following FM-specific advantages can be expected from this new integrated facilities management environment: faster and more effective processes, controlled whole-of-life costs and environmental data, better customer service, a common operational picture for current and strategic planning, visual decision-making and a total ownership cost model.
Tests with partial BIM data – provided by several of SOH’s current consultants – show that the creation of a SOH complete model is realistic, but subject to resolution of compliance and detailed functional support by participating software applications. The showcase has demonstrated successfully that IFC based exchange is possible with several common BIM based applications through the creation of a new partial model of the building. Data exchanged has been geometrically accurate (the SOH building structure represents some of the most complex building elements) and supports rich information describing the types of objects, with their properties and relationships.
Abstract:
This Digital Modelling Report incorporates the previous research completed for the FM Exemplar Project utilising the Sydney Opera House as a case study. The research has demonstrated significant benefits in digitising design documentation and operational and maintenance manuals. Since Sydney Opera House does not have digital models of its structure, there is an opportunity to investigate the application of Digital Facility Modelling using standardised Building Information Models (BIM). The digital modelling research project has examined the potential of standardised building information models to develop a digital facility model supporting facilities management (FM). The focus of this investigation was on the following areas:
• The re-usability of standardised building information models (BIM) for FM purposes.
• The potential of BIM as an information framework acting as integrator for various FM data sources.
• The extendibility and flexibility of the BIM to cope with business-specific data and requirements.
• Commercial FM software using standardised building information models.
• The ability to add (organisation-specific) intelligence to the model.
• A roadmap for Sydney Opera House to adopt BIM for FM.
Abstract:
The report presents a methodology for whole-of-life-cycle cost analysis of alternative treatment options for bridge structures that require rehabilitation. The methodology was developed after reviewing current methods and establishing that a life cycle analysis based on a probabilistic risk approach has many advantages, including the essential ability to consider the variability of input parameters. The input parameters for the analysis are identified as initial cost; maintenance, monitoring and repair cost; user cost; and failure cost. The methodology uses Monte Carlo simulation to combine a number of probability distributions and establish the distribution of whole-of-life-cycle cost. In performing the simulation, the need for a powerful software package that works with a spreadsheet program was identified. After exploring several products on the market, the @RISK software was selected for the simulation. In conclusion, the report presents a typical decision-making scenario considering two alternative treatment options.
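A minimal Monte Carlo sketch of combining cost distributions into a whole-of-life-cycle cost distribution, written here in Python/NumPy rather than the @RISK spreadsheet add-in used in the report; the distributions and parameter values are illustrative only.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 100_000                                                 # Monte Carlo trials

# Illustrative input distributions (present-value costs, in $k).
initial     = rng.triangular(900, 1000, 1200, n)            # initial treatment cost
maintenance = rng.normal(30, 5, n).clip(min=0) * 20         # 20 years of annual cost
user        = rng.lognormal(mean=4.0, sigma=0.5, size=n)    # user delay costs
failure     = rng.binomial(1, 0.02, n) * rng.normal(5000, 1000, n)  # rare failure event

life_cycle_cost = initial + maintenance + user + failure

# Summarise the resulting whole-of-life-cycle cost distribution.
for q in (0.05, 0.50, 0.95):
    print(f"{int(q * 100):>2}th percentile: {np.quantile(life_cycle_cost, q):,.0f} $k")
print(f"mean: {life_cycle_cost.mean():,.0f} $k")
```

Running the same simulation for each alternative treatment option and comparing the resulting distributions (rather than single-point estimates) is what supports the decision-making scenario described above.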
Abstract:
In Australia, an average of 49 building and construction workers have been killed at work each year since 1997-98. Building/construction workers are more than twice as likely to be killed at work as the average worker across all Australian industries. The ‘Safer Construction’ project, funded by the CRC-Construction Innovation and led by a task force comprising representatives of construction clients, designers and constructors, developed a Guide to Best Practice for Safer Construction. The Guide, which was informed by research undertaken at RMIT University, Queensland University of Technology and Curtin University, establishes broad principles for the improvement of safety in the industry and provides a ‘roadmap’ for improvement based upon the lifecycle stages of a building/construction project. Within each project stage, best practices for the management of safety are identified. Each best practice is defined in terms of the recommended action, its key benefits, desirable outcomes, performance measures and leadership. ‘Safer Construction’ practices are identified from the planning to commissioning stages of a project. The ‘Safer Construction’ project represents the first time that key stakeholder groups in the Australian building/construction industry have worked together to articulate best practice and establish an appropriate basis for allocating (and sharing) responsibility for project safety performance.
Abstract:
The digital modelling research stream of the Sydney Opera House FM Exemplar Project has demonstrated significant benefits in digitising design documentation and operational and maintenance manuals. Since Sydney Opera House did not have digital models of its structure, there was an opportunity to investigate the application of digital modelling using standardised Building Information Models (BIM) to support facilities management (FM). The focus of this investigation was on the following areas:
• the re-usability of standardised BIM for FM purposes
• the potential of BIM as an information framework acting as integrator for various FM data sources
• the extendibility and flexibility of the BIM to cope with business-specific data and requirements
• commercial FM software using standardised BIM
• the ability to add (organisation-specific) intelligence to the model
• a roadmap for Sydney Opera House to adopt BIM for FM.