910 results for Support Decision System
Abstract:
The characterization and grading of glioma tumors via image-derived features, for diagnosis, prognosis, and assessment of treatment response, has been an active research area in medical image computing. This paper presents a novel method for automatic detection and classification of glioma from conventional T2-weighted MR images. Automatic detection of the tumor was established using a newly developed method called the Adaptive Gray level Algebraic set Segmentation Algorithm (AGASA). Statistical features were extracted from the detected tumor texture using first-order statistics and gray level co-occurrence matrix (GLCM) based second-order statistical methods. Statistical significance of the features was determined by a t-test and its corresponding p-value. A decision system was developed for grade detection of glioma using the selected features and their p-values. The detection performance of the decision system was validated using the receiver operating characteristic (ROC) curve. The diagnosis and grading of glioma using this non-invasive method can contribute promising results to medical image computing.
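As a rough illustration of the feature-extraction and validation steps described above, the following is a minimal Python sketch, not the authors' AGASA pipeline: it computes first-order and GLCM-based second-order texture features with scikit-image on synthetic patches, tests feature significance with a t-test, and summarizes discriminative power with the ROC AUC.

```python
# Minimal sketch of GLCM texture features + t-test + ROC validation.
# Not the paper's AGASA method; patches are synthetic stand-ins for tumor ROIs.
import numpy as np
from skimage.feature import graycomatrix, graycoprops   # 'greycomatrix' in older scikit-image
from scipy.stats import ttest_ind
from sklearn.metrics import roc_auc_score

def texture_features(patch):
    """First-order statistics plus GLCM-based second-order features for one uint8 patch."""
    glcm = graycomatrix(patch, distances=[1], angles=[0], levels=256,
                        symmetric=True, normed=True)
    return {
        "mean": patch.mean(),                                # first-order
        "variance": patch.var(),
        "contrast": graycoprops(glcm, "contrast")[0, 0],     # second-order (GLCM)
        "homogeneity": graycoprops(glcm, "homogeneity")[0, 0],
        "energy": graycoprops(glcm, "energy")[0, 0],
    }

rng = np.random.default_rng(0)
low_grade  = [texture_features(rng.integers(0, 120, (32, 32), dtype=np.uint8)) for _ in range(20)]
high_grade = [texture_features(rng.integers(80, 256, (32, 32), dtype=np.uint8)) for _ in range(20)]

for name in low_grade[0]:
    a = np.array([f[name] for f in low_grade])
    b = np.array([f[name] for f in high_grade])
    _, p = ttest_ind(a, b)                                   # feature significance
    labels = np.r_[np.zeros(len(a)), np.ones(len(b))]
    auc = roc_auc_score(labels, np.r_[a, b])                 # ROC-based validation
    print(f"{name:12s} p = {p:.3g}  AUC = {auc:.2f}")
```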
Abstract:
VisGeo is a software tool to support decision making concerning risk studies and risk communication; it is based on free software technology and developed within the Geographical Visualization approach. (...)
Abstract:
Wydział Chemii (Faculty of Chemistry)
Abstract:
Multiple versions of information, and the problems associated with them, are well documented in both academic research and industry best practice. Many solutions have proposed a single version of the truth, with Business Intelligence (BI) being adopted by many organizations. BI, however, is largely based on the collection of data and the processing and presentation of information to meet different stakeholders' requirements. This paper reviews Enterprise Intelligence, which promises to support decision-making based on a defined strategic understanding of the organization's goals and a unified version of the truth.
Abstract:
Information architecture (IA) is defined as the high-level information requirements of an organisation. It is applied in areas such as information systems development, enterprise architecture, business process management, and organisational change management. Still, the lack of methods and theories prevents information architecture from becoming a distinct discipline. Healthcare organisations are typically seen as information-intensive, even more so in a pervasive healthcare environment. Pervasive healthcare aims to provide healthcare services to anyone, anywhere, and at any time by incorporating mobile devices and wireless networks. Information architecture therefore plays an important role in information provisioning within the context of pervasive healthcare, in order to support decision making and communication between clinicians and patients. Organisational semiotics is a socio-technical approach that considers information through the norms and activities performed within an organisation prior to pervasive healthcare implementation. This paper proposes a conceptual design of an information architecture for pervasive healthcare. It is illustrated with a scenario of mental health patient monitoring.
Abstract:
There are now many reports of imaging experiments with small cohorts of typical participants that precede large-scale, often multicentre studies of psychiatric and neurological disorders. Data from these calibration experiments are sufficient to make estimates of statistical power and predictions of sample size and minimum observable effect sizes. In this technical note, we suggest how previously reported voxel-based power calculations can support decision making in the design, execution and analysis of cross-sectional multicentre imaging studies. The choice of MRI acquisition sequence, distribution of recruitment across acquisition centres, and changes to the registration method applied during data analysis are considered as examples. The consequences of modification are explored in quantitative terms by assessing the impact on sample size for a fixed effect size and detectable effect size for a fixed sample size. The calibration experiment dataset used for illustration was a precursor to the now complete Medical Research Council Autism Imaging Multicentre Study (MRC-AIMS). Validation of the voxel-based power calculations is made by comparing the predicted values from the calibration experiment with those observed in MRC-AIMS. The effect of non-linear mappings during image registration to a standard stereotactic space on the prediction is explored with reference to the amount of local deformation. In summary, power calculations offer a validated, quantitative means of making informed choices on important factors that influence the outcome of studies that consume significant resources.
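As a small, generic illustration of the two trade-offs the note quantifies (sample size for a fixed effect size, and detectable effect size for a fixed sample size), the sketch below uses statsmodels' two-sample t-test power calculator; the numbers are illustrative and are not the voxel-based values from the calibration experiment or MRC-AIMS.

```python
# Generic per-comparison power calculations with statsmodels; illustrative numbers only.
from statsmodels.stats.power import TTestIndPower

solver = TTestIndPower()

# Sample size per group needed to detect Cohen's d = 0.5 at alpha = 0.05 with 80% power.
n_per_group = solver.solve_power(effect_size=0.5, alpha=0.05, power=0.8)
print(f"required n per group for d = 0.5: {n_per_group:.1f}")

# Smallest effect size detectable with a fixed 30 participants per group.
min_d = solver.solve_power(nobs1=30, alpha=0.05, power=0.8)
print(f"detectable d with n = 30 per group: {min_d:.2f}")
```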
Abstract:
Understanding complex social-ecological systems, and anticipating how they may respond to rapid change, requires an approach that incorporates environmental, social, economic, and policy factors, usually in a context of fragmented data availability. We employed fuzzy cognitive mapping (FCM) to integrate these factors in the assessment of future wildfire risk in the Chiquitania region, Bolivia. In this region, dealing with wildfires is becoming increasingly challenging due to reinforcing feedbacks between multiple drivers. We conducted semi-structured interviews and constructed different FCMs in focus groups to understand the regional dynamics of wildfire from diverse perspectives. We used FCM modelling to evaluate possible adaptation scenarios in the context of future drier climatic conditions. Scenarios also considered a possible failure to respond in time to the emergent risk. This approach showed great potential to support decision-making for risk management. It helped identify key forcing variables and generated insights into potential risks and trade-offs of different strategies. All scenarios showed increased wildfire risk in the event of more droughts. The ‘Hands-off’ scenario resulted in amplified impacts driven by intensifying trends, particularly affecting agricultural production. The ‘Fire management’ scenario, which adopted a bottom-up approach to improve controlled burning, showed fewer trade-offs between wildfire risk reduction and production than the ‘Fire suppression’ scenario. The findings highlighted the importance of considering strategies that involve all actors who use fire, and the need to nest these strategies for a more systemic approach to managing wildfire risk. The FCM model could be used as a decision-support tool and serve as a ‘boundary object’ to facilitate collaboration and the integration of different forms of knowledge and perceptions of fire in the region. This approach also has the potential to support decisions in other dynamic frontier landscapes around the world that face an increased risk of large wildfires.
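For readers unfamiliar with FCM scenario analysis, here is a toy Python sketch of the underlying mechanics: concept activations are iterated through a weighted causal matrix with a sigmoid activation while scenario drivers are clamped. The concepts, weights, and clamped values are invented for illustration and do not reproduce the study's calibrated maps.

```python
# Toy fuzzy cognitive map iteration; weights and concepts are invented, not the study's model.
import numpy as np

concepts = ["drought", "uncontrolled_burning", "fire_management",
            "wildfire_risk", "agricultural_production"]
# W[i, j]: causal influence of concept i on concept j, in [-1, 1].
W = np.array([
    [0.0,  0.3,  0.0,  0.6, -0.3],   # drought
    [0.0,  0.0,  0.0,  0.7,  0.0],   # uncontrolled burning
    [0.0, -0.6,  0.0, -0.4,  0.2],   # fire management
    [0.0,  0.0,  0.0,  0.0, -0.7],   # wildfire risk
    [0.0,  0.2,  0.0,  0.0,  0.0],   # agricultural production
])

def run_fcm(clamped, steps=30, lam=2.0):
    """Iterate x <- sigmoid(x + x @ W), holding scenario drivers (clamped) fixed."""
    x = np.full(len(concepts), 0.5)
    for _ in range(steps):
        x = 1.0 / (1.0 + np.exp(-lam * (x + x @ W)))
        for idx, value in clamped.items():
            x[idx] = value
    return dict(zip(concepts, x.round(2)))

# Drier climate with no management ('Hands-off') vs. active fire management.
print(run_fcm({0: 1.0, 2: 0.0}))
print(run_fcm({0: 1.0, 2: 1.0}))
```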
Abstract:
Nowadays there is great demand and pressure in selecting and prioritizing investment alternatives to leverage companies' long-term growth. In parallel, the global environment is increasingly uncertain, which implies that the choices made by these companies must adapt to new market demands and, above all, must preserve the growth direction the companies are aiming for. In this turbulent context, the traditional tools used for decision making, for selecting and setting priorities, are the economic-financial analyses represented by Net Present Value, Internal Rate of Return, and Payback. Although these are robust and consistent methods for evaluating investment projects, they focus on only one aspect (the financial one), and companies today operate in environments that require a broader approach, encompassing other views and dimensions not captured by financial studies. In other words, when a project portfolio analysis is aligned with strategic planning, a multicriteria approach is needed, one that involves quantitative and qualitative indicators and provides decision makers with complete, standardized information on all projects, since these initiatives are not homogeneous: each has its own particularities and, above all, is at a different stage of maturity. In addition, it is clear that the project selection and prioritization process needs a systematization that gives this decision and this portfolio greater stability and more reliable information. In this work, therefore, a multivariate analysis was developed, more specifically through the use of decision-support systems. Criteria beyond the economic-financial ones were chosen to support the selection and prioritization of projects in meeting the strategic objectives of the organization and its stakeholders.
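A minimal sketch of the kind of multicriteria scoring such a decision-support approach relies on, combining a financial indicator with qualitative criteria in a normalized weighted sum; the criteria, weights, and project scores below are hypothetical and do not reproduce the model developed in the work.

```python
# Hypothetical weighted-scoring ranking of projects across financial and qualitative criteria.
import numpy as np

criteria = ["npv", "strategic_fit", "risk", "maturity"]      # risk: lower is better
weights  = np.array([0.4, 0.3, 0.2, 0.1])
benefit  = np.array([True, True, False, True])               # direction of preference

projects = {
    "Project A": [12.0, 4, 2, 3],
    "Project B": [ 8.0, 5, 1, 4],
    "Project C": [15.0, 2, 4, 2],
}

scores = np.array(list(projects.values()), dtype=float)
# Min-max normalise each criterion and flip cost criteria so higher is always better.
norm = (scores - scores.min(axis=0)) / (scores.max(axis=0) - scores.min(axis=0))
norm[:, ~benefit] = 1.0 - norm[:, ~benefit]

for name, score in sorted(zip(projects, norm @ weights), key=lambda kv: -kv[1]):
    print(f"{name}: {score:.2f}")
```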
Abstract:
In innovation networks based on information exchange, the orchestrating agent appropriates the information of the peripheral actors, generates innovation, and distributes it back as added value. Its role is to promote stability in the network by keeping its growth rate non-negative. In the credit and fraud analysis markets, for example, bureaus act as orchestrating agents, concentrating the historical information about the population provided by their clients and offering products that support decision making. Assuming that all companies in the ecosystem are rational agents, game theory becomes an appropriate tool for studying product pricing as a mechanism for promoting network stability. This work seeks to identify the relationship between different pricing structures set by the orchestrating agent and the stability and efficiency of the innovation network. Since the power of the network comes from the joint strength of its members, the innovation it generates varies with each peripheral agent's individual decision to contract the orchestrating agent at the price it sets. By defining a simplified theoretical game in which different agents decide whether or not to connect to the network under the different price structures set by the orchestrating agent, the study analyzes the equilibrium conditions and concludes that the Nash equilibrium implies a scenario of network stability. One conclusion is that, to maximize the network's innovation power, the price each agent pays to use the network should be directly proportional to the financial benefit it obtains from the innovation the network generates. The study also presents a computational simulation of a fictitious market to demonstrate the observed effects numerically. Through its conclusions, the work fills a gap in the literature on innovation networks with monopolist orchestrating agents regarding the pricing of network use, serving as input for decision makers when offering or demanding the network's services.
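As a numerical illustration of the pricing insight stated above (not the dissertation's actual game or simulation), the toy sketch below lets each peripheral agent decide in isolation whether to join the network, comparing a flat price with a price proportional to the agent's benefit from the generated innovation; all benefits and parameters are invented.

```python
# Toy participation model: an agent joins iff its benefit from the network covers its price.
import numpy as np

rng = np.random.default_rng(1)
benefit = rng.uniform(1.0, 10.0, size=50)        # hypothetical benefit each agent derives

def simulate(price_of):
    """Each agent best-responds in isolation: join iff benefit >= its price."""
    price = price_of(benefit)
    joined = benefit >= price
    return joined.sum(), price[joined].sum()

n_flat, rev_flat = simulate(lambda b: np.full_like(b, 5.0))   # same price for everyone
n_prop, rev_prop = simulate(lambda b: 0.5 * b)                # price proportional to benefit

print(f"flat price:         {n_flat}/50 agents join, orchestrator revenue {rev_flat:.1f}")
print(f"proportional price: {n_prop}/50 agents join, orchestrator revenue {rev_prop:.1f}")
```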
Abstract:
Noise mapping has been used as an instrument for the assessment of environmental noise, helping to support decision making on urban planning. In Brazil, urban noise is not yet recognized as a major environmental problem by the government. Moreover, cities that have databases capable of driving acoustic simulations with advanced noise mapping systems are rare. This study sought an alternative method of noise mapping through the use of geoprocessing, which is feasible for the Brazilian reality and for other developing countries. The area chosen for the study was the central zone of the city of Sorocaba, located in São Paulo State, Brazil. The proposed method was effective in the spatial evaluation of the equivalent sound pressure level. The results showed an urban area with high noise levels that exceed the legal standard, posing a threat to the welfare of the population.
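As a pointer to the quantity being mapped, the sketch below shows the standard energy-average used to obtain the equivalent sound pressure level (Leq) from a series of readings; the sample values are invented, and the geoprocessing/interpolation step is only indicated in a comment.

```python
# Equivalent sound pressure level (Leq) from short-term readings; values are illustrative.
import numpy as np

def leq(levels_db):
    """Energy-average a series of sound pressure levels, in dB."""
    levels_db = np.asarray(levels_db, dtype=float)
    return 10.0 * np.log10(np.mean(10.0 ** (levels_db / 10.0)))

samples_at_point = [62.1, 65.4, 70.2, 68.8, 63.0]   # hypothetical dB(A) readings at one point
print(f"Leq = {leq(samples_at_point):.1f} dB(A)")

# In a geoprocessing workflow, Leq values at many measurement points would then be
# interpolated in a GIS (e.g. IDW or kriging) to build the noise map and compared
# against the legal limits for each zoning class.
```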
Abstract:
The semiarid region of Rio Grande do Norte (the "semiárido potiguar") presents a rather discrepant situation. It has one of the highest densities of artificial lakes in the world, yet the policy of building dams does not solve the problem of water scarcity, given that the dams have not demonstrated the ability to guarantee priority human supply during periods of severe drought and fail to meet the widespread demand that exists in the semiarid region. This work aims to present the optimal allocation of water, according to multiple uses and the limited availability of water resources in the reservoir, based on the simulation of its operation, applying decision-support techniques and the performance evaluation of alternative water uses. The Santa Cruz reservoir, the second largest in RN with a storage capacity of approximately 600 million cubic meters, located about 20 km from the town of Apodi in RN, was conceived as a way to promote economic development in the region as well as to supply water to nearby towns. The techniques used are the ACQUANET network flow simulation model and a set of performance indicators. The results showed that the reservoir has the capacity to serve the flow of up to 3.83 m³/s required by existing uses without compromising them. However, it was also observed that if all anticipated future demands are implemented, failures will occur in meeting some uses.
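A rough mass-balance sketch of the kind of reservoir operation simulation and time-based reliability indicator the abstract refers to; this is not the ACQUANET network-flow model, the inflow series is synthetic, and only the ~600 hm³ capacity and the 3.83 m³/s demand are taken from the abstract.

```python
# Toy monthly mass-balance simulation of a reservoir with a time-based reliability indicator.
import numpy as np

CAPACITY_HM3 = 600.0                                  # ~600 million m3, from the abstract
SECONDS_PER_MONTH = 2.63e6
demand_hm3 = 3.83 * SECONDS_PER_MONTH / 1e6           # 3.83 m3/s as a monthly volume (~10 hm3)

rng = np.random.default_rng(7)
inflow_hm3 = rng.gamma(shape=2.0, scale=6.0, size=240)  # hypothetical 20-year monthly inflows

storage, failures = 0.5 * CAPACITY_HM3, 0
for inflow in inflow_hm3:
    storage = min(storage + inflow, CAPACITY_HM3)     # fill the reservoir, spill any excess
    supplied = min(storage, demand_hm3)
    failures += supplied < demand_hm3                 # month counts as a failure if demand is not met
    storage -= supplied

reliability = 1.0 - failures / len(inflow_hm3)
print(f"monthly demand: {demand_hm3:.1f} hm3, time-based reliability: {reliability:.2%}")
```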
Abstract:
Postbloom fruit drop (PFD) of citrus is caused by Colletotrichum acutatum, which infects the petals of citrus flowers and produces orange-brown lesions that induce the abscission of young fruitlets and the retention of calyces. Proper timing of fungicide applications is essential for good disease control. Different systems for timing fungicide applications to control PFD in a major citrus-growing region in southern São Paulo state, Brazil, were evaluated from 1999 to 2002. The following programs were compared to an unsprayed control using counts of diseased flowers, persistent calyces, or fruit: (i) a phenology-based program currently recommended in Brazil, with one application at early bloom and another at peak bloom; (ii) the Florida PFD model; (iii) the postbloom fruit drop-fungicide application decision system (PFD-FAD), a new computer-assisted decision method; and (iv) grower's choice. In 1999, no disease developed, sprays applied with the phenology-based program had no effect, and the Florida PFD model saved two sprays compared with the phenology-based program. In 2000, PFD was moderate, and the phenology-based and grower's choice treatments had a significantly lower number of persistent calyces and higher fruit numbers than the control, but no differences were found between those treatments and the PFD model. In 2001, PFD was severe, with considerable yield loss. The PFD model, the phenology-based program, and the grower's choice reduced flower blight and the number of persistent calyces and improved fruit yields with two to three applications, but the PFD-FAD achieved comparable yields with only one spray. In 2002, the disease was mild, with no yield loss, and the Florida PFD model and the PFD-FAD each saved one spray compared with the other systems. The PFD model and the PFD-FAD were equally effective for timing fungicide applications to control PFD in Brazil. Scouting of trees is simpler with the PFD-FAD; therefore, this system is recommended and should eliminate unnecessary sprays and reduce costs for growers.
Abstract:
Includes bibliography
Abstract:
For intricate automotive systems that enclose several components, such as gearboxes, an important aspect of the design is defining the correct assembly parameters. A proper assembly can ensure optimized operating conditions, and therefore the components can achieve a longer life. In the case of the support bearings used in front-axle lightweight differentials, the assembly preload is a major factor in the adequate performance of the system. During the design phase it is imperative to define reference values for this preload so that the application meets its requirements. However, with the assistance of computer simulations, it is possible to determine an optimum operating condition, i.e. an optimum preload, which increases the system's reliability. This paper presents a study on the influence of preload on the rating life of tapered roller bearings applied to lightweight front-axle differentials, evaluating how preload affects several key parameters such as rating life and displacement of components, taking into account the flexibility of the surrounding differential housing. Copyright © 2012 SAE International.
Abstract:
Objectives: Primary failure of tooth eruption (PFE) is a rare autosomal-dominant disease characterized by severe lateral open bite as a consequence of incomplete eruption of posterior teeth. Heterozygous mutations in the parathyroid hormone 1 receptor (PTH1R) gene have been shown to cause PFE likely due to protein haploinsufficiency. To further expand on the mutational spectrum of PFE-associated mutations, we report here on the sequencing results of the PTH1R gene in 70 index PFE cases. Materials and methods: Sanger sequencing of the PTH1R coding exons and their immediate flanking intronic sequences was performed with DNA samples from 70 index PFE cases. Results: We identified a total of 30 unique variants, of which 12 were classified as pathogenic based on their deleterious consequences on PTH1R protein while 16 changes were characterized as unclassified variants with as yet unknown effects on disease pathology. The remaining two variants represent common polymorphisms. Conclusions: Our data significantly increase the number of presently known unique PFE-causing PTH1R mutations and provide a series of variants with unclear pathogenicity which will require further in vitro assaying to determine their effects on protein structure and function. Clinical relevance: Management of PTH1R-associated PFE is problematic, in particular when teeth are exposed to orthodontic force. Therefore, upon clinical suspicion of PFE, molecular DNA testing is indicated to support decision making for further treatment options. © 2013 Springer-Verlag Berlin Heidelberg.