973 results for Risk - Mathematical models


Relevance:

80.00%

Publisher:

Abstract:

Mathematical models are useful tools for the simulation, evaluation, optimal operation, and control of solar cells and proton exchange membrane fuel cells (PEMFCs). To identify the model parameters of these two types of cells efficiently, a biogeography-based optimization algorithm with mutation strategies (BBO-M) is proposed. BBO-M uses the structure of the biogeography-based optimization (BBO) algorithm, into which both a mutation operator motivated by the differential evolution (DE) algorithm and chaos theory are incorporated to improve the global search capability of the algorithm. Numerical experiments were conducted on ten benchmark functions with 50 dimensions, and the results show that BBO-M produces high-quality solutions and converges quickly. The proposed BBO-M was then applied to model parameter estimation for the two types of cells. The experimental results clearly demonstrate the power of the proposed BBO-M in estimating the model parameters of both solar cells and fuel cells.
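The flavor of the scheme can be conveyed by a toy biogeography-based optimizer with a DE-style mutation on the sphere benchmark. This is a minimal sketch of the general BBO-M idea, not the authors' exact algorithm: population size, rates, and the scale factor are assumed, and the chaos component is omitted.

```python
import random

def sphere(x):
    """Benchmark objective f(x) = sum(x_i^2); global minimum 0 at the origin."""
    return sum(v * v for v in x)

def bbo_m(obj, dim=5, pop_size=20, iters=200, bounds=(-5.0, 5.0), seed=1):
    """Toy BBO with a DE-inspired mutation (hedged sketch of BBO-M).

    Habitats are ranked by fitness; worse habitats immigrate features
    from better ones, and an occasional DE/rand/1-style perturbation
    replaces BBO's purely random mutation.
    """
    rng = random.Random(seed)
    lo, hi = bounds
    pop = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(pop_size)]
    F = 0.5  # DE scale factor (assumed value)
    for _ in range(iters):
        pop.sort(key=obj)                  # index 0 = best habitat
        for i in range(1, pop_size):       # best habitat is kept as-is
            cand = pop[i][:]
            mu = i / (pop_size - 1)        # immigration rate grows with rank
            for d in range(dim):
                if rng.random() < mu:      # migrate a feature from a fitter habitat
                    cand[d] = pop[rng.randrange(i)][d]
            if rng.random() < 0.1:         # DE-style mutation on one dimension
                r1, r2, r3 = rng.sample(range(pop_size), 3)
                d = rng.randrange(dim)
                cand[d] = min(hi, max(lo, pop[r1][d] + F * (pop[r2][d] - pop[r3][d])))
            if obj(cand) < obj(pop[i]):    # greedy replacement
                pop[i] = cand
    return min(pop, key=obj)

best = bbo_m(sphere)
```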

Relevance:

80.00%

Publisher:

Abstract:

Within a defined legal framework, the Italian central health system dictates the standards for hospitalization to local care units, which are in turn allowed to establish their own effectiveness criteria. The appropriateness of the hospitalization decision is therefore predetermined at patient admission, whereas its effectiveness relies on the ex post patient well-being resulting from the complex system of reciprocal relations between patients and healthcare agents at the ward level. We consider the outcomes in geriatric wards belonging to the national health system, with respect both to patients' traits at the individual level and to ward/hospital settings. The risk that models the healthcare outcome is accordingly adjusted for covariates at the different levels of analysis (Goldstein & Spiegelhalter, 1996), allowing us to differentiate among outcomes in terms of the hospitalization structure and, where appropriate, of territorial aggregation.
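A multilevel risk adjustment of this kind can be caricatured as a two-level logistic model with a ward-level random intercept. The coefficients and effects below are purely illustrative assumptions, not estimates from the study.

```python
import math

def outcome_probability(age, ward_effect, beta_age=0.04, intercept=-5.0):
    """Two-level logistic risk model (illustrative coefficients):

        logit(p) = intercept + beta_age * age + u_ward

    where u_ward is a ward-level random intercept capturing the
    hospitalization structure, in the spirit of multilevel risk
    adjustment (Goldstein & Spiegelhalter, 1996).
    """
    logit = intercept + beta_age * age + ward_effect
    return 1.0 / (1.0 + math.exp(-logit))

# Same 80-year-old patient in an average ward vs a higher-risk ward:
p_avg = outcome_probability(80, ward_effect=0.0)
p_high = outcome_probability(80, ward_effect=0.8)
```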

Relevance:

80.00%

Publisher:

Abstract:

Whole genome sequencing (WGS) technology holds great promise as a tool for the forensic epidemiology of bacterial pathogens. It is likely to be particularly useful for studying the transmission dynamics of an observed epidemic involving a largely unsampled 'reservoir' host, as for bovine tuberculosis (bTB) in British and Irish cattle and badgers. BTB is caused by Mycobacterium bovis, a member of the M. tuberculosis complex that also includes the aetiological agent for human TB. In this study, we identified a spatio-temporally linked group of 26 cattle and 4 badgers infected with the same Variable Number Tandem Repeat (VNTR) type of M. bovis. Single-nucleotide polymorphisms (SNPs) between sequences identified differences that were consistent with bacterial lineages being persistent on or near farms for several years, despite multiple clear whole herd tests in the interim. Comparing WGS data to mathematical models showed good correlations between genetic divergence and spatial distance, but poor correspondence to the network of cattle movements or within-herd contacts. Badger isolates showed between zero and four SNP differences from the nearest cattle isolate, providing evidence for recent transmissions between the two hosts. This is the first direct genetic evidence of M. bovis persistence on farms over multiple outbreaks with a continued, ongoing interaction with local badgers. However, despite unprecedented resolution, directionality of transmission cannot be inferred at this stage. Despite the often notoriously long timescales between time of infection and time of sampling for TB, our results suggest that WGS data alone can provide insights into TB epidemiology even where detailed contact data are not available, and that more extensive sampling and analysis will allow for quantification of the extent and direction of transmission between cattle and badgers. © 2012 Biek et al.
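At their core, the SNP comparisons described above reduce to pairwise Hamming distances between aligned sequences. The fragment below sketches this on hypothetical toy fragments (the real analysis works on whole genomes, and the isolate names are made up).

```python
def snp_distance(seq_a: str, seq_b: str) -> int:
    """Count single-nucleotide differences between two aligned sequences."""
    if len(seq_a) != len(seq_b):
        raise ValueError("sequences must be aligned to equal length")
    return sum(1 for a, b in zip(seq_a, seq_b) if a != b)

# Hypothetical aligned core-genome fragments for three isolates.
isolates = {
    "cow_1":    "ACGTACGTAC",
    "cow_2":    "ACGTACGTAT",
    "badger_1": "ACGAACGTAT",
}

# Pairwise distance matrix, as used to relate cattle and badger isolates.
names = list(isolates)
for i, a in enumerate(names):
    for b in names[i + 1:]:
        print(a, b, snp_distance(isolates[a], isolates[b]))
```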

Relevance:

80.00%

Publisher:

Abstract:

OBJECTIVE: To assess the efficiency of alternative monitoring services for people with ocular hypertension (OHT), a glaucoma risk factor.

DESIGN: Discrete event simulation model comparing five alternative care pathways: treatment at OHT diagnosis with minimal monitoring; biennial monitoring (primary and secondary care) with treatment if baseline predicted 5-year glaucoma risk is ≥6%; monitoring and treatment aligned to National Institute for Health and Care Excellence (NICE) glaucoma guidance (conservative and intensive).

SETTING: UK health services perspective.

PARTICIPANTS: Simulated cohort of 10 000 adults with OHT (mean intraocular pressure (IOP) 24.9 mm Hg (SD 2.4)).

MAIN OUTCOME MEASURES: Costs, glaucoma detected, quality-adjusted life years (QALYs).

RESULTS: Treating at diagnosis was the least costly and least effective in avoiding glaucoma and progression. Intensive monitoring following NICE guidance was the most costly and effective. However, considering a wider cost-utility perspective, biennial monitoring was less costly and provided more QALYs than NICE pathways, but was unlikely to be cost-effective compared with treating at diagnosis (£86 717 per additional QALY gained). The findings were robust to risk thresholds for initiating monitoring but were sensitive to treatment threshold, National Health Service costs and treatment adherence.

CONCLUSIONS: For confirmed OHT, glaucoma monitoring more frequently than every 2 years is unlikely to be efficient. Primary treatment and minimal monitoring (assessing treatment responsiveness (IOP)) could be considered; however, further data to refine glaucoma risk prediction models and to value patient preferences for treatment are needed. Consideration of innovative and affordable service redesign focused on treatment responsiveness, rather than more glaucoma testing, is recommended.
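A discrete event simulation of this kind can be sketched in a few lines. The hazards, horizon, and pathway logic below are illustrative assumptions, not the model's calibrated inputs, and costs/QALYs are omitted; only the conversion-to-glaucoma endpoint is simulated.

```python
import random

def simulate(pathway, n=2000, horizon=15.0, seed=7):
    """Toy discrete event simulation of OHT care pathways.

    Illustrative assumptions: untreated annual glaucoma conversion
    hazard 3%, halved by IOP-lowering treatment; 'biennial_monitoring'
    starts treatment at the first monitoring visit; 'treat_at_diagnosis'
    treats everyone immediately; 'minimal' never treats.
    Returns the fraction of the cohort converting within the horizon.
    """
    rng = random.Random(seed)
    conversions = 0
    for _ in range(n):
        treated = (pathway == "treat_at_diagnosis")
        t = 0.0
        while t < horizon:
            hazard = 0.015 if treated else 0.03
            dt = rng.expovariate(hazard)      # time to next conversion event
            if pathway == "biennial_monitoring" and not treated:
                next_visit = 2.0 - (t % 2.0)  # time to next biennial visit
                if dt > next_visit:
                    t += next_visit
                    treated = True            # risk threshold met -> treat
                    continue
            t += dt
            if t < horizon:
                conversions += 1
            break
    return conversions / n

for p in ("treat_at_diagnosis", "biennial_monitoring", "minimal"):
    print(p, simulate(p))
```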

Relevance:

80.00%

Publisher:

Abstract:

Dematerialization of the economy is one path toward sustainable development, insofar as it eliminates or reduces the use of natural resources, doing more with less. The intensification of technological processes is one way to dematerialize the economy: more compact, more efficient systems consume fewer resources. In the specific case of systems involving heat exchange, intensification reduces the exchange area and the amount of working fluid which, beyond any other advantages of miniaturization, is an undeniable contribution to the sustainability of society through scientific and technological development. Nanofluids have been developed in response to these challenges of modern society, contributing to the innovation of products and systems and answering problems posed at the level of the basic sciences. The literature is unanimous in identifying their potential as heat-transfer fluids, given their high thermal conductivity; however, the lack of rigor underlying their preparation techniques, and of a systematic knowledge of their physical properties supported by duly validated physico-mathematical models, keeps industrial deployment far from feasible. In this work, the thermal conductivity of water-based nanofluids loaded with carbon nanotubes was studied systematically, with a view to identifying the physical mechanisms responsible for heat conduction in the fluid and developing a general model that can determine this property reliably, with the rigor required in engineering. To this end, methods for the rigorous and reproducible preparation of this type of nanofluid are presented, together with the methodologies considered most important for assessing its stability, thereby ensuring the rigor of the production technique.
Colloidal stability is established rigorously, taking into account quantifiable parameters such as the absence of agglomeration, phase separation, and deterioration of nanoparticle morphology. Once the preparation method was secured, a parametric analysis was carried out, producing an experimental database that gives a central, global view of the relative influence of the different control factors affecting the thermophysical properties. Among these, the study emphasized thermal conductivity, with the following control factors selected: base fluid, temperature, particle size, and nanoparticle concentration. Experimentally, particle size and nanoparticle concentration were found to have the strongest influence on the nanofluid's thermal conductivity. With the confidence conferred by a solid database and by knowledge of the relative contribution of each control factor to the heat-transfer process, a general physico-mathematical model was developed and validated that allows the thermal conductivity of nanofluids to be determined reliably.
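For context, the classical Maxwell model gives a baseline effective thermal conductivity for a dilute suspension of spherical particles; it notoriously underpredicts the enhancement measured for carbon nanotubes, which is part of why a dedicated model is developed in the work above. The property values below are rough assumptions.

```python
def maxwell_k_eff(k_base, k_particle, phi):
    """Maxwell's classical effective-conductivity model for a dilute
    suspension of spherical particles at volume fraction phi:

    k_eff = k_f * (k_p + 2 k_f + 2 phi (k_p - k_f)) / (k_p + 2 k_f - phi (k_p - k_f))
    """
    num = k_particle + 2 * k_base + 2 * phi * (k_particle - k_base)
    den = k_particle + 2 * k_base - phi * (k_particle - k_base)
    return k_base * num / den

# Water base fluid (~0.61 W/m.K) with CNTs (assumed ~3000 W/m.K) at 0.5 vol%:
# Maxwell predicts only ~1.5% enhancement, far below typical CNT measurements.
k = maxwell_k_eff(0.61, 3000.0, 0.005)
print(k)
```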

Relevance:

80.00%

Publisher:

Abstract:

Climate change is expected to have a large impact on the morphology of estuaries and coastal systems. These morphological changes will in turn drive changes in the biological compartments of the systems and ultimately in their ecosystems. Sea level rise is one of the main factors controlling these changes. Morphologic changes can be better understood with the use of long-term morphodynamic mathematical models.

Relevance:

80.00%

Publisher:

Abstract:

Final project submitted for the degree of Master in Electrical Engineering

Relevance:

80.00%

Publisher:

Abstract:

Dissertation submitted for the degree of Master in Electrical Engineering, branch of Industrial Automation and Electronics

Relevance:

80.00%

Publisher:

Abstract:

Dissertation submitted for the degree of Master in Electrical Engineering in the branch of Industrial Automation and Electronics

Relevance:

80.00%

Publisher:

Abstract:

Mathematical models and statistical analysis are key instruments in soil science research, as they can describe and/or predict the current state of a soil system. These tools allow us to explore the behavior of soil-related processes and properties, as well as to generate new hypotheses for future experimentation. A good model and analysis of soil property variation, permitting us to draw sound conclusions and to estimate spatially correlated variables at unsampled locations, clearly depends on the amount and quality of data and on the robustness of the techniques and estimators. The quality of the data, in turn, obviously depends on a competent data collection procedure and capable laboratory analytical work. Following the standard soil sampling protocols available, soil samples should be collected according to key factors such as a convenient spatial scale, landscape homogeneity (or non-homogeneity), land color, soil texture, land slope, and solar exposure. Obtaining good-quality data from forest soils is predictably expensive, as it is labor intensive and demands substantial manpower and equipment both in field work and in laboratory analysis. Moreover, the sampling scheme to be used for data collection in forest terrain is not simple to design, as the chosen sampling strategies depend strongly on soil taxonomy. In fact, a sampling grid cannot be followed if rocks are found at the intended collection depth, if no soil is found at all, or if large trees bar the collection. Consequently, a proficient design of a soil sampling campaign in forest terrain is not always a simple process and sometimes represents a truly huge challenge. In this work, we present some difficulties that occurred during two experiments on forest soil conducted to study the spatial variation of some physical-chemical soil properties.
Two different sampling protocols were considered for monitoring two types of forest soils located in NW Portugal: umbric regosol and lithosol. Two different pieces of equipment were also used for sample collection: a manual auger and a shovel. Both scenarios were analyzed, and the results have led us to conclude that monitoring forest soil for mathematical and statistical investigation requires a data collection procedure compatible with established protocols, but a pre-defined grid assumption often fails when the variability of the soil property is not spatially uniform. In that case, the sampling grid should be conveniently adapted from one part of the landscape to another, and this fact should be taken into account in the mathematical procedure.
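When the sampling is too sparse or irregular to fit a variogram reliably, a simple inverse-distance-weighted (IDW) estimate is a common fallback for predicting a spatially correlated property at unsampled locations. The coordinates and pH values below are hypothetical.

```python
import math

def idw_estimate(samples, x, y, power=2.0):
    """Inverse-distance-weighted estimate of a soil property at (x, y).

    samples: list of (x, y, value) tuples from an irregular (non-grid)
    sampling scheme. A simple stand-in for kriging when the variogram
    cannot be reliably fitted from sparse forest data.
    """
    num = den = 0.0
    for sx, sy, v in samples:
        d = math.hypot(x - sx, y - sy)
        if d < 1e-12:
            return v            # exactly at a sample point
        w = 1.0 / d ** power
        num += w * v
        den += w
    return num / den

# Hypothetical pH measurements where the planned grid could not be kept:
samples = [(0, 0, 4.8), (10, 0, 5.2), (0, 10, 5.0), (12, 11, 5.6)]
print(idw_estimate(samples, 5, 5))
```

Because IDW is a convex combination of the observed values, the estimate always stays within their range, which makes it a robust (if smooth) first look at spatial variation.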

Relevance:

80.00%

Publisher:

Abstract:

In Angola, only about 30% of the population has access to electricity, a figure that falls below 10% in the more remote rural areas. The problem is aggravated by the fact that, in most cases, the existing infrastructure is damaged or has not kept pace with regional development. This is particularly true of the Angolan capital, Luanda, which, although the smallest province of Angola, currently has the highest population density. With a population of around 5 million, not only are failures of the electricity supply frequent, but a considerable share of municipalities have yet to be reached by the grid at all. The government of Angola, in its effort to grow and to exploit the country's enormous potential, has defined the energy sector as one of the critical factors for the sustainable development of the country, making it one of the priority axes up to 2016. There are clear objectives for the rehabilitation and expansion of the electricity sector's infrastructure: increasing the country's installed capacity and creating an adequate national grid, with the aim not only of improving the quality and reliability of the existing network but also of extending it. This dissertation consisted of gathering real data on Luanda's electricity distribution network, analyzing and planning the most pressing expansion needs, choosing viable locations for new substations, modeling the real problem appropriately, and proposing an optimal solution for expanding the existing network. After analyzing different mathematical models for the distribution-network expansion problem found in the literature, a mixed-integer linear programming (MILP) model was adopted and proved adequate.
Once the model was developed, it was solved using the Analytic Solver and CPLEX optimization packages. To validate the results, the network solution was implemented in the PowerWorld 8.0 OPF simulator, software that allows the operation of the power-flow system to be simulated.
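The flavor of the siting model can be conveyed by a tiny exhaustive stand-in: choose which candidate substations to build so that build cost plus distance-weighted connection cost is minimal, subject to a capacity limit per substation. The dissertation solves the real instance as a MILP with Analytic Solver/CPLEX; all numbers and names below are made up for illustration.

```python
from itertools import combinations

def cheapest_expansion(demands, sites, build_cost, unit_line_cost, capacity):
    """Exhaustive stand-in for the MILP substation-siting model.

    demands: {node: (x, y, load)}; sites: {site: (x, y)}.
    Each demand node connects to its nearest built substation; a plan is
    feasible if no substation exceeds `capacity`. Returns (cost, sites).
    """
    def dist(a, b):
        return ((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2) ** 0.5

    best = (float("inf"), None)
    names = list(sites)
    for r in range(1, len(names) + 1):
        for chosen in combinations(names, r):
            cost = build_cost * len(chosen)
            load = {s: 0.0 for s in chosen}
            feasible = True
            for node, (x, y, l) in demands.items():
                s = min(chosen, key=lambda c: dist(sites[c], (x, y)))
                load[s] += l
                cost += unit_line_cost * l * dist(sites[s], (x, y))
                if load[s] > capacity:
                    feasible = False
                    break
            if feasible and cost < best[0]:
                best = (cost, chosen)
    return best

demands = {"A": (0, 0, 5), "B": (8, 0, 6), "C": (8, 8, 4)}
sites = {"S1": (1, 1), "S2": (8, 4)}
print(cheapest_expansion(demands, sites, build_cost=10.0,
                         unit_line_cost=0.5, capacity=12.0))
```

Real instances replace the enumeration with binary build/assignment variables in a MILP solver, since the subset count grows exponentially with the number of candidate sites.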

Relevance:

80.00%

Publisher:

Abstract:

We develop a new coinfection model for the hepatitis C virus (HCV) and the human immunodeficiency virus (HIV). We consider treatment for both diseases, screening, unawareness and awareness of HIV infection, and condom use. We study the local stability of the disease-free equilibria for the full model and for the two submodels (HCV-only and HIV-only). We sketch bifurcation diagrams for different parameters, such as the probabilities that a contact will result in an HIV or an HCV infection. We present numerical simulations of the full model in which the HIV, HCV, and double endemic equilibria can be observed. We also show numerically the qualitative changes in the dynamical behavior of the full model as relevant parameters vary. We extrapolate the results from the model to actual measures that could be implemented to reduce the number of infected individuals.
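A drastically reduced version of such a coinfection model can be integrated in a few lines. The four compartments and all parameter values below are illustrative assumptions; screening, awareness classes, and condom use are omitted.

```python
def simulate_coinfection(beta_hiv=0.3, beta_hcv=0.4, recovery_hcv=0.1,
                         days=400, dt=0.1):
    """Euler integration of a much-reduced HIV/HCV coinfection model.

    Compartments (fractions of a closed population): S susceptible,
    H HIV-only, C HCV-only, B coinfected. Mass-action incidence, HCV
    cleared by treatment at rate recovery_hcv, no recovery from HIV,
    no demography. Parameter values are illustrative, not the paper's.
    """
    S, H, C, B = 0.99, 0.005, 0.005, 0.0
    for _ in range(int(days / dt)):
        hiv_force = beta_hiv * (H + B)          # force of HIV infection
        hcv_force = beta_hcv * (C + B)          # force of HCV infection
        dS = -(hiv_force + hcv_force) * S + recovery_hcv * C
        dH = hiv_force * S - hcv_force * H + recovery_hcv * B
        dC = hcv_force * S - hiv_force * C - recovery_hcv * C
        dB = hcv_force * H + hiv_force * C - recovery_hcv * B
        S += dS * dt
        H += dH * dt
        C += dC * dt
        B += dB * dt
    return S, H, C, B

S, H, C, B = simulate_coinfection()
print(S, H, C, B)
```

The four derivatives sum to zero, so the total population is conserved; with these parameters the HIV epidemic (no recovery) eventually dominates, which is the kind of qualitative behavior the bifurcation analysis in the paper characterizes rigorously.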

Relevance:

80.00%

Publisher:

Abstract:

The globalization of financial systems over the years has driven a growing need for banking supervision of financial institutions. The Basel Committee on Banking Supervision has played a crucial role in this area, establishing principles through its accords among the national regulatory and supervisory authorities of the world's largest economies. In 1988, the Basel Accord (Basel I) was created by the Committee on Banking Supervision to harmonize banking supervision standards. This accord established solvency minimums for the international banking system in order to reinforce its soundness and stability. With the development of new economic powers and new regulatory needs, the new Capital Accord, Basel II, was published in June 2004. It aimed to make capital requirements more risk-sensitive, to promote supervisory review (Pillar II) and market discipline (Pillar III), and to encourage each institution's capacity to measure and manage its own risk. In September 2010, Basel III, with adoption scheduled through 2019, reinforced these measures by creating a more solid regulatory and supervisory framework for credit institutions. It is in this context that the Risk Assessment Model (Modelo de Avaliação de Risco, MAR) for the banking sector emerged. In Portugal, the MAR aims to evaluate the risk profile of credit institutions subject to the supervision of the Banco de Portugal, and to present the risk profile and the soundness of the financial situation of each credit institution. This work assesses the emergence and characterization of this model and identifies the variables to be taken into account in risk assessment models at the qualitative and quantitative level.
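The solvency minimum at the heart of the Basel accords can be expressed in two lines: risk-weight the exposures, then compare eligible capital with 8% of the total. The portfolio below is fictitious; the 0%/35%/100% weights echo the Basel II standardised approach for sovereigns, residential mortgages, and corporates.

```python
def risk_weighted_assets(exposures):
    """Sum of exposure amounts times their risk weights, e.g. 0.0 for
    sovereigns, 0.35 for residential mortgages, 1.0 for unrated
    corporates under a standardised approach (illustrative weights)."""
    return sum(amount * weight for amount, weight in exposures)

def capital_adequacy_ratio(own_funds, rwa):
    """Solvency (capital adequacy) ratio = eligible capital / RWA."""
    return own_funds / rwa

# Fictitious balance sheet: 100 sovereign, 200 mortgages, 150 corporate.
rwa = risk_weighted_assets([(100.0, 0.0), (200.0, 0.35), (150.0, 1.0)])
car = capital_adequacy_ratio(20.0, rwa)
print(rwa, car, car >= 0.08)  # Basel minimum total capital ratio is 8%
```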

Relevance:

80.00%

Publisher:

Abstract:

Cerebral metabolism is compartmentalized between neurons and glia. Although glial glycolysis is thought to largely sustain the energetic requirements of neurotransmission, while oxidative metabolism takes place mainly in neurons, this hypothesis is a matter of debate. The compartmentalization of cerebral metabolic fluxes can be determined by (13)C nuclear magnetic resonance (NMR) spectroscopy upon infusion of (13)C-enriched compounds, especially glucose. Rats under light α-chloralose anesthesia were infused with [1,6-(13)C]glucose, and (13)C enrichment in brain metabolites was measured by (13)C NMR spectroscopy with high sensitivity and spectral resolution at 14.1 T. This allowed us to determine (13)C enrichment curves of amino acid carbons with high reproducibility and to reliably estimate cerebral metabolic fluxes (mean error of 8%). We further found that TCA cycle intermediates are not required for flux determination in mathematical models of brain metabolism. The neuronal tricarboxylic acid cycle rate (V(TCA)) and neurotransmission rate (V(NT)) were 0.45 ± 0.01 and 0.11 ± 0.01 μmol/g/min, respectively. Glial V(TCA) was found to be 38 ± 3% of total cerebral oxidative metabolism, accounting for more than half of neuronal oxidative metabolism. Furthermore, the glial anaplerotic pyruvate carboxylation rate (V(PC)) was 0.069 ± 0.004 μmol/g/min, i.e., 25 ± 1% of the glial TCA cycle rate. These results support a role for glial cells as active partners of neurons during synaptic transmission, beyond glycolytic metabolism.
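The turnover logic behind such flux estimates can be caricatured with a single well-mixed pool: the enrichment of an amino-acid carbon rises toward its plateau at a rate set by the flux over the pool size. The V(TCA) value comes from the abstract; the pool size and plateau are assumed for illustration, and the real analysis fits multi-compartment models.

```python
import math

def enrichment_curve(t, v_tca, pool_size, plateau=0.5):
    """Fractional 13C enrichment of a single well-mixed amino-acid pool
    fed by the TCA cycle: turnover rate k = V_TCA / pool_size and
    enrichment(t) = plateau * (1 - exp(-k * t)). A one-pool caricature
    of the compartmental models actually fitted to the NMR time courses.
    """
    k = v_tca / pool_size
    return plateau * (1.0 - math.exp(-k * t))

# Neuronal V_TCA = 0.45 umol/g/min (from the study); an assumed ~9 umol/g
# glutamate pool gives a turnover time of 20 min.
curve = [enrichment_curve(t, 0.45, 9.0) for t in (0, 10, 30, 120)]
print(curve)
```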

Relevance:

80.00%

Publisher:

Abstract:

An analytical model for bacterial accumulation in a discrete fracture has been developed. The transport and accumulation processes incorporated into the model include advection, dispersion, rate-limited adsorption, rate-limited desorption, irreversible adsorption, attachment, detachment, growth, and first-order decay in both the sorbed and aqueous phases. An analytical solution in Laplace space is derived and numerically inverted. The model is implemented in the code BIOFRAC, which is written in Fortran 90. The model is derived for two phases: Phase I, where adsorption-desorption are dominant, and Phase II, where attachment-detachment are dominant. Phase I ends when enough bacteria have accumulated to fully cover the substratum. The model for Phase I was verified by comparison to the Ogata-Banks solution, and the model for Phase II by comparison to a non-homogeneous version of the Ogata-Banks solution. After verification, a sensitivity analysis on the input parameters was performed by varying one input parameter while all others were fixed and observing the impact on the shape of the curve describing bacterial concentration versus time. Increasing the fracture aperture allows more transport and thus more accumulation, which shortens Phase I. The larger the bacteria, the faster the substratum is covered. Increasing the adsorption rate was observed to lengthen Phase I. Contrary to the assumption of uniform biofilm thickness, accumulation starts from the inlet, and the bacterial concentration in the aqueous phase moving toward the outlet declines, slowing the accumulation at the outlet. Increasing the desorption rate shortens Phase I, speeding up the accumulation. It was also observed that Phase II lasts longer than Phase I. Increasing the attachment rate lengthens the accumulation period.
High detachment rates speed up the transport. The growth and decay rates have no significant effect on transport, although increases in the concentrations in both the aqueous and sorbed phases are observed. Irreversible adsorption can stop accumulation completely if its values are high.
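Numerical inversion of a Laplace-space solution, as described above, is commonly done with the Gaver-Stehfest algorithm. The sketch below applies it to a transform pair with a known inverse rather than to the BIOFRAC solution itself, which is not reproduced here.

```python
from math import factorial, log, exp

def stehfest_coefficients(n):
    """Stehfest weights V_k for numerical Laplace inversion (n even)."""
    V = []
    for k in range(1, n + 1):
        s = 0.0
        for j in range((k + 1) // 2, min(k, n // 2) + 1):
            s += (j ** (n // 2) * factorial(2 * j)) / (
                factorial(n // 2 - j) * factorial(j) * factorial(j - 1)
                * factorial(k - j) * factorial(2 * j - k))
        V.append((-1) ** (k + n // 2) * s)
    return V

def invert_laplace(F, t, n=12):
    """Approximate f(t) from its Laplace transform F(s) via Stehfest:
    f(t) ~ (ln 2 / t) * sum_k V_k * F(k ln 2 / t)."""
    V = stehfest_coefficients(n)
    a = log(2.0) / t
    return a * sum(V[k - 1] * F(k * a) for k in range(1, n + 1))

# Check on a known transform pair: F(s) = 1/(s+1)  <->  f(t) = exp(-t).
print(invert_laplace(lambda s: 1.0 / (s + 1.0), 1.0), exp(-1.0))
```

Stehfest works well for smooth, non-oscillatory solutions such as breakthrough curves; for sharp fronts, larger n trades accuracy against floating-point cancellation in the alternating weights.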