899 results for "information bottleneck method"
Abstract:
Company X develops a laboratory information system (LIS) called System Y. The information system has a two-tier database architecture consisting of a production database and a historical database. A database constitutes the backbone of an information system, which makes its design very important: a poorly designed database can cause major problems within an organization. The two databases in System Y are poorly modeled, particularly the historical database. The cause of the poor modeling was unclear concepts, which have remained in the database and in the company organization and caused a general confusion of concepts. The split database architecture itself has evolved into a bottleneck and is the cause of many problems during the development of System Y. Company X is investigating the possibility of integrating the historical database with the production database. The goal of our thesis is to conduct a consequence analysis of such an integration and its effects on System Y, and to create a new design for the integrated database. We will also examine and describe the practical effects of confusion of concepts on a database's conceptual design. To achieve the goal of the thesis, five method steps were performed: a preliminary study of the organization, a change analysis, a consequence analysis, and an investigation of the conceptual design of the database. These steps helped identify the changes necessary for the organization, a new design proposal for an integrated database, the impact of the proposed design, and a number of effects of conceptual confusion on the database.
Abstract:
Background: The Internet is used globally in many areas, and current research shows that pregnant women use the Internet to search for information about pregnancy and childbirth. A pregnancy raises many questions, and women search for information to gain reassurance that everything is normal with their own health and with the baby. Aim: The aim is to examine pregnant women's Internet-based information seeking regarding pregnancy and childbirth. Method: Descriptive cross-sectional study. Data were collected through a questionnaire survey and analyzed with a descriptive and comparative approach. Results: Almost all women in the survey used the Internet to search for information about pregnancy and childbirth. The women searched to a great extent for information about fetal development, diet/nutrition, and length of pregnancy. Highly educated women applied different credibility criteria to websites than women with less education. Conclusion: Midwives should be aware of pregnant women's use of the Internet and should form their own view of the information that pregnant women read online. Our study shows that midwives' recommendations of websites carry great weight with pregnant women.
Abstract:
Background: Diabetes mellitus is a chronic disease associated with suffering and loss of quality of life. Self-care is crucial to reducing its negative consequences, yet fewer than half of all diabetes patients achieve good self-care, partly because of limited knowledge about diabetes and poor self-care adherence. Information and communication technology (ICT) was introduced into diabetes care to improve clinical outcomes and quality of life for patients with type 2 diabetes. Aim: To describe how information and communication technology can promote remote self-care for patients with type 2 diabetes mellitus. Method: Literature review; articles were retrieved from CINAHL, PubMed, and Web of Science. Fifteen articles were included, with quantitative, qualitative, and mixed-method designs. Results: The results showed that ICT such as Internet-, computer-, and mobile-phone-based self-care programs promoted self-care in patients with type 2 diabetes through increased knowledge, increased awareness, increased motivation, and improved lifestyle changes in diet and exercise. Conclusion: ICT tools can ease the daily challenges of patients with type 2 diabetes, since they close the knowledge gap and increase patients' awareness of and motivation for self-care.
Abstract:
Drinking water distribution networks are exposed to the risk of malicious or accidental contamination. Several levels of response are conceivable. One of them consists of installing a sensor network to monitor the system in real time. Once a contamination has been detected, it is also important to take appropriate counter-measures. In the SMaRT-OnlineWDN project, this relies on modeling to predict both hydraulics and water quality. Using an online model makes it possible to identify the contaminant source and to simulate the contaminated area. The objective of this paper is to present the SMaRT-OnlineWDN experience and research results for hydraulic state estimation with a sampling frequency of a few minutes. A least-squares problem with bound constraints is formulated to adjust demand class coefficients to best fit the observed values at a given time. The criterion is a Huber function, which limits the influence of outliers. A Tikhonov regularization is introduced to take prior information on the parameter vector into account. The Levenberg-Marquardt algorithm, which uses derivative information to limit the number of iterations, is then applied. Confidence intervals for the state prediction are also given. The results are presented and discussed on real networks in France and Germany.
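The estimation step described in this abstract (bounded robust least squares with a Huber criterion and Tikhonov regularization) can be sketched with standard tools. The sensitivity matrix, observations, and weights below are illustrative stand-ins, not SMaRT-OnlineWDN data. Note that SciPy's `lm` (Levenberg-Marquardt) solver supports neither bounds nor robust losses, so the derivative-based `trf` trust-region method plays that role in this sketch.

```python
import numpy as np
from scipy.optimize import least_squares

# Toy network: sensor readings are a linear mix of demand-class coefficients.
rng = np.random.default_rng(0)
H = rng.uniform(0.5, 2.0, size=(12, 3))        # sensitivity of 12 sensors to 3 demand classes
x_true = np.array([1.0, 0.8, 1.2])
observed = H @ x_true + rng.normal(0, 0.02, 12)
observed[3] += 1.5                              # inject one outlier
x_prior = np.ones(3)                            # prior information on the parameter vector
lam = 0.1                                       # Tikhonov weight (illustrative)

def residuals(x):
    # data misfit stacked with the Tikhonov regularization term
    return np.concatenate([H @ x - observed, np.sqrt(lam) * (x - x_prior)])

# The Huber loss limits the outlier's influence; bounds keep coefficients physical.
fit = least_squares(residuals, x0=x_prior, bounds=(0.0, 5.0),
                    loss="huber", f_scale=0.1, method="trf")
print(fit.x)
```

With the outlier downweighted by the Huber loss, the recovered coefficients stay close to `x_true` despite the corrupted reading.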
Abstract:
This article highlights the potential benefits of the Kohonen method for classifying rivers with similar characteristics when determining regional ecological flows with the ELOHA (Ecological Limits of Hydrologic Alteration) methodology. Many methodologies currently exist for classifying rivers; however, none of them combines the characteristics of the Kohonen method: (i) it provides the number of groups that actually underlie the information presented, (ii) it can be used for variable-importance analysis, (iii) it can display the classification process in two dimensions, and (iv) the clustering structure remains stable regardless of the parameters used in the model. To evaluate these potential benefits, 174 flow stations distributed along the great river basin "Magdalena-Cauca" (Colombia) were analyzed, and 73 variables were obtained for the classification process in each case. Six trials were run using different combinations of variables, and the results were validated against a reference classification obtained by Ingfocol in 2010, whose results were also framed using ELOHA guidelines. The validation showed that two of the tested models reproduced more than 80% of the reference classification in the first trial, meaning that more than 80% of the flow stations analyzed in both models formed invariant groups of streams.
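A Kohonen self-organizing map of the kind the article relies on can be sketched in a few lines of NumPy. The synthetic station descriptors below are placeholders for the 73 hydrologic variables; grid size, learning rate, and neighborhood width are illustrative choices, not the article's settings.

```python
import numpy as np

def train_som(data, grid=(4, 4), iters=2000, lr0=0.5, sigma0=2.0, seed=0):
    """Minimal Kohonen self-organizing map with online updates."""
    rng = np.random.default_rng(seed)
    # node coordinates on the 2-D map, used by the neighborhood function
    coords = np.array([(i, j) for i in range(grid[0]) for j in range(grid[1])], float)
    w = rng.normal(size=(grid[0] * grid[1], data.shape[1]))
    for t in range(iters):
        x = data[rng.integers(len(data))]
        bmu = np.argmin(((w - x) ** 2).sum(axis=1))         # best-matching unit
        lr = lr0 * np.exp(-t / iters)                        # decaying learning rate
        sigma = sigma0 * np.exp(-t / iters)                  # shrinking neighborhood
        d2 = ((coords - coords[bmu]) ** 2).sum(axis=1)
        h = np.exp(-d2 / (2 * sigma ** 2))                   # Gaussian neighborhood
        w += lr * h[:, None] * (x - w)
    return w, coords

# Two well-separated synthetic "flow regimes" standing in for station descriptors.
rng = np.random.default_rng(1)
stations = np.vstack([rng.normal(0, 0.3, (40, 5)), rng.normal(3, 0.3, (40, 5))])
w, coords = train_som(stations)
bmus = np.argmin(((stations[:, None, :] - w[None]) ** 2).sum(-1), axis=1)
print(sorted(set(bmus[:40])), sorted(set(bmus[40:])))
```

Stations from the two synthetic regimes map to disjoint regions of the grid, which is the property the classification exercise in the article exploits.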
Abstract:
This thesis makes three original contributions to the field of Decision Sciences. The first contribution explores the field of heuristics and biases. New variations of the Cognitive Reflection Test (CRT), a test that measures "the ability or disposition to resist reporting the response that first comes to mind", are provided. The original CRT (S. Frederick [2005] Journal of Economic Perspectives, v. 19:4, pp. 24-42) has items in which the response is immediate, and erroneous. It is shown that merely varying the numerical parameters of the problems produces large deviations in response. Not only are the final results affected by the proposed variations, but so is processing fluency. It seems that the magnitudes of the numbers serve as a cue to activate System 2-type reasoning. The second contribution explores Managerial Algorithmics Theory (M. Moldoveanu [2009] Strategic Management Journal, v. 30, pp. 737-763), an ambitious research program which states that managers display cognitive choices with a "preference towards solving problems of low computational complexity". An empirical test of this hypothesis is conducted, and the results show that this premise is not supported. A number of problems were designed with the intent of testing the predictions of managerial algorithmics against those of cognitive psychology. The results demonstrate (once again) that framing effects profoundly affect choice, and (an original insight) that managers are unable to distinguish between computational complexity problem classes. The third contribution explores a new approach to a computationally complex problem in marketing: the shelf space allocation problem (M-H Yang [2001] European Journal of Operational Research, v. 131, pp. 107-118). A new representation for a genetic algorithm is developed, and computational experiments demonstrate its feasibility as a practical solution method.
These studies lie at the interface of psychology and economics (with bounded rationality and the heuristics and biases programme), psychology, strategy, and computational complexity, and heuristics for computationally hard problems in management science.
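The genetic-algorithm approach to shelf space allocation can be sketched on a toy instance. The profits, facing widths, and GA settings below are invented for illustration; this is a generic integer encoding with tournament selection and one-point crossover, not the representation developed in the thesis or Yang's formulation.

```python
import numpy as np

rng = np.random.default_rng(2)
profit = np.array([4.0, 3.0, 5.0, 2.0])   # profit per facing (illustrative)
width  = np.array([2.0, 1.0, 3.0, 1.0])   # shelf width consumed per facing
SHELF = 10.0                               # total shelf width available
MAX_FACINGS = 6

def fitness(ind):
    used = (width * ind).sum()
    penalty = 50.0 * max(0.0, used - SHELF)   # infeasible layouts are penalized
    return (profit * ind).sum() - penalty

def evolve(pop_size=60, gens=200):
    # each individual is a vector of facing counts per product
    pop = rng.integers(0, MAX_FACINGS + 1, size=(pop_size, len(profit)))
    for _ in range(gens):
        scores = np.array([fitness(p) for p in pop])
        # binary tournament selection
        idx = rng.integers(0, pop_size, size=(pop_size, 2))
        winners = np.where(scores[idx[:, 0]] >= scores[idx[:, 1]], idx[:, 0], idx[:, 1])
        parents = pop[winners]
        # one-point crossover between consecutive parents
        children = parents.copy()
        for i in range(0, pop_size - 1, 2):
            c = rng.integers(1, len(profit))
            children[i, c:], children[i + 1, c:] = parents[i + 1, c:], parents[i, c:]
        # mutation: randomly reset one gene in ~30% of children
        mut = rng.random(pop_size) < 0.3
        genes = rng.integers(0, len(profit), size=pop_size)
        vals = rng.integers(0, MAX_FACINGS + 1, size=pop_size)
        children[mut, genes[mut]] = vals[mut]
        pop = children
    scores = np.array([fitness(p) for p in pop])
    return pop[scores.argmax()], scores.max()

best, score = evolve()
print(best, score)
```

For this tiny instance the optimum (profit 26) can be checked by hand, and the GA reliably finds feasible layouts close to it; the penalty term makes any overfilled shelf strictly worse than a feasible one.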
Abstract:
The stretch zone width (SZW) of 15-5PH steel CTOD specimens fractured at temperatures from -150 °C to +23 °C was measured based on focused images and 3D maps obtained by extended depth-of-field reconstruction from light microscopy (LM) image stacks. This LM-based method, despite its larger lateral resolution, seems to be as effective for quantitative SZW analysis as scanning electron microscopy (SEM) or confocal scanning laser microscopy (CSLM), permitting clear identification of stretch zone boundaries. Despite the poorer sharpness of the focused images, a robust linear correlation was established between fracture toughness (KC) and the SZW data of the tested 15-5PH specimens, measured at their center region. The method is an alternative for evaluating the boundaries of stretched zones at a lower cost of implementation and training, since topographic data from elevation maps can be associated with the reconstructed image, which preserves the original contrast and brightness information. Finally, the extended depth-of-field method is presented here as a valuable tool for failure analysis and a cheaper alternative for investigating rough or fractured surfaces compared to scanning electron or confocal light microscopes. Microsc. Res. Tech. 75:1155-1158, 2012. (C) 2012 Wiley Periodicals, Inc.
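The kind of linear SZW-toughness correlation the abstract reports can be illustrated with an ordinary least-squares fit. The data points below are invented for illustration and are not the published 15-5PH measurements.

```python
import numpy as np

# Hypothetical SZW / toughness pairs (illustrative units: um and MPa*sqrt(m)),
# constructed to behave roughly linearly like the reported correlation.
szw = np.array([12.0, 18.0, 25.0, 31.0, 40.0, 52.0])   # stretch zone width
kc  = np.array([35.0, 48.0, 61.0, 74.0, 95.0, 118.0])  # fracture toughness

slope, intercept = np.polyfit(szw, kc, 1)   # least-squares line Kc = a*SZW + b
r = np.corrcoef(szw, kc)[0, 1]              # Pearson correlation coefficient
print(f"Kc ~ {slope:.2f}*SZW + {intercept:.2f}, r = {r:.3f}")
```

A correlation coefficient close to 1 is what qualifies the fit as "robustly linear"; in practice one would also inspect residuals before trusting the calibration.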
Abstract:
A direct version of the boundary element method (BEM) is developed to model the stationary dynamic response of reinforced plate structures, such as reinforced panels in buildings, automobiles, and airplanes. The stationary dynamic fundamental solutions of thin plates and of the plane stress state are used to transform the governing partial differential equations into boundary integral equations (BIEs). Two sets of uncoupled BIEs are formulated, one for the in-plane state (membrane) and one for the out-of-plane state (bending). These uncoupled systems are joined to form a macro-element in which both membrane and bending effects are present. The association of these macro-elements can simulate thin-walled structures, including reinforced plate structures. In the present formulation, the BIEs are discretized by continuous and/or discontinuous linear elements, and four displacement integral equations are written for every boundary node. Modal data, that is, the natural frequencies and corresponding mode shapes of the reinforced plates, are obtained from information contained in the frequency response functions (FRFs). A specific example is presented to illustrate the versatility of the proposed methodology. Different configurations of the reinforcements are used to simulate simply supported and clamped boundary conditions for the plate structures. The procedure is validated by comparison with results obtained by the finite element method (FEM).
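The last step of that pipeline, reading natural frequencies off the FRF, can be sketched on a synthetic example. The two-mode receptance below uses invented frequencies and damping, not the reinforced-plate model, and simple peak-picking stands in for whatever modal-extraction procedure the paper actually uses.

```python
import numpy as np
from scipy.signal import find_peaks

# Synthetic receptance FRF of a 2-mode system: natural frequencies at
# 40 Hz and 95 Hz, 1% modal damping (all values illustrative).
f = np.linspace(1, 150, 3000)
w = 2 * np.pi * f

def mode(fn, zeta):
    wn = 2 * np.pi * fn
    return 1.0 / (wn**2 - w**2 + 2j * zeta * wn * w)

frf = mode(40.0, 0.01) + mode(95.0, 0.01)

# Natural frequencies appear as magnitude peaks of the FRF.
peaks, _ = find_peaks(np.abs(frf), prominence=1e-6)
print(f[peaks])
```

For lightly damped, well-separated modes the peak frequencies are very close to the undamped natural frequencies, which is why FRF peaks are a usable source of modal data.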
Abstract:
PLCs (Programmable Logic Controllers) perform control operations: they receive information from the environment, process it, and modify that same environment according to the results produced. They are commonly used in many industrial applications, from mass transport to the petroleum industry. As the complexity of these applications increases, and as many of them are safety-critical, the need arises to ensure that they are reliable. Testing and simulation are the de facto methods used in industry to do so, but they can leave flaws undiscovered. Formal methods can provide more confidence in an application's safety, since they permit mathematical verification. We make use of the B Method, which has been successfully applied to the formal verification of industrial systems, is supported by several tools, and can handle decomposition, refinement, and verification of correctness with respect to the specification. The method we developed and present in this work automatically generates B models from PLC programs and verifies them against safety constraints manually derived from the system requirements. The scope of our method is the set of PLC programming languages presented in the IEC 61131-3 standard, although we can also verify programs that are not fully compliant with the standard. Our approach aims to ease the integration of formal methods into industry by reducing the effort required to perform formal verification of PLCs.
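The essence of checking a safety constraint against a PLC scan cycle can be illustrated without B tooling. The mini program, safety property, and bounded exhaustive check below are all hypothetical stand-ins: a B model discharges such obligations symbolically by proof, whereas this sketch simply enumerates bounded input sequences.

```python
from itertools import product

# Hypothetical mini PLC logic (not the thesis's translator): the motor may run
# only while the guard is closed and no emergency stop is active. One scan cycle:
def scan(inputs, state):
    guard_closed, e_stop, start = inputs
    run = state["run"]
    if e_stop or not guard_closed:
        run = False          # safety interlock dominates
    elif start:
        run = True
    return {"run": run}

# Safety constraint, manually derived from the (assumed) requirements:
# the motor never runs while the guard is open or the e-stop is pressed.
def safe(inputs, state):
    guard_closed, e_stop, _ = inputs
    return not state["run"] or (guard_closed and not e_stop)

# Exhaustively explore all input sequences up to a bounded depth.
def check(depth=4):
    frontier = [{"run": False}]
    for _ in range(depth):
        nxt = []
        for state in frontier:
            for inputs in product([False, True], repeat=3):
                s2 = scan(inputs, state)
                if not safe(inputs, s2):
                    return False
                nxt.append(s2)
        frontier = nxt
    return True

print(check())
```

Here the check succeeds because the interlock branch always clears `run` before the start command is honored; the value of the formal approach is that it establishes this for unbounded executions rather than a bounded enumeration.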
Measurement of the top quark mass in the lepton plus jets final state with the matrix element method
Abstract:
We present a measurement of the top quark mass with the matrix element method in the lepton+jets final state. As the energy scale for calorimeter jets represents the dominant source of systematic uncertainty, the matrix element likelihood is extended by an additional parameter, defined as a global multiplicative factor applied to the standard energy scale. The top quark mass is obtained from a fit that yields the combined statistical and systematic jet energy scale (JES) uncertainty. Using a data set of 0.4 fb^-1 taken with the D0 experiment at Run II of the Fermilab Tevatron Collider, the mass of the top quark measured using topological information is m_top(l+jets, topo) = 169.2 +5.0/-7.4 (stat+JES) +1.5/-1.4 (syst) GeV, and when information about identified b jets is included, m_top(l+jets, b-tag) = 170.3 +4.1/-4.5 (stat+JES) +1.2/-1.8 (syst) GeV. The measurements yield a jet energy scale consistent with the reference scale.
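The idea of folding the JES into the likelihood as a profiled nuisance parameter can be shown on a drastically simplified toy. Everything below (one Gaussian observable per event, the constraint width, all numbers) is invented for illustration and is not the D0 analysis or the matrix element likelihood itself.

```python
import numpy as np

# Toy model: each event yields one reconstructed mass, scaled by a global
# jet-energy-scale (JES) factor. All values are illustrative, not D0 data.
rng = np.random.default_rng(3)
M_TRUE, JES_TRUE, SIGMA, N = 170.0, 1.0, 12.0, 400
xbar = np.mean(JES_TRUE * M_TRUE + rng.normal(0.0, SIGMA, N))

def nll(m, jes):
    # Gaussian data term (via the sample mean) plus a Gaussian constraint
    # tying the JES factor to the reference scale (assumed 1.00 +/- 0.02).
    return 0.5 * N * (xbar - jes * m) ** 2 / SIGMA**2 + 0.5 * ((jes - 1.0) / 0.02) ** 2

# Profile likelihood: minimize over the JES nuisance at each mass hypothesis,
# so the resulting interval combines the statistical and JES uncertainties.
masses = np.linspace(160.0, 180.0, 401)
jes_grid = np.linspace(0.90, 1.10, 401)
profile = np.array([min(nll(m, j) for j in jes_grid) for m in masses])
m_hat = masses[profile.argmin()]
interval = masses[profile - profile.min() <= 0.5]   # ~68% CL (stat+JES)
print(m_hat, interval[0], interval[-1])
```

Because the data constrain only the product jes*m, the interval is much wider than the purely statistical one, which is exactly the effect the extended likelihood is designed to quantify.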
Abstract:
A combined method for evaluating radon (Rn-222) and its progeny (Pb-214 and Bi-214) in water was developed, using inexpensive alpha scintillation counting and gamma-ray spectrometry with NaI(Tl) scintillation detectors. A groundwater sample collected at the Poços de Caldas alkaline massif in Brazil was analyzed with the technique in order to confirm its applicability, by comparing the volumetric activities obtained by the different methods. Similar volumetric activities were determined for Pb-214 and Bi-214 in the sample, compatible with the expected condition of radioactive equilibrium between these nuclides. The combined method was successfully used to analyze groundwater samples from the Guarani aquifer in São Paulo State, Brazil, and the results of the measurements indicated that Pb-214 and Bi-214 provide useful information for evaluating drinking water quality in terms of radiological aspects. This is because they are identified directly in the water samples, without requiring the assumption that transient equilibrium with their parent Rn-222 has been established. (c) 2005 Elsevier Ltd. All rights reserved.
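The comparison of volumetric activities at the heart of the method can be sketched as follows. The counting formula is the standard efficiency-corrected peak-count relation; the counts, efficiencies, and counting times are hypothetical, not the Guarani-aquifer measurements (only the gamma emission probabilities of the 352 keV Pb-214 and 609 keV Bi-214 lines are real nuclear data).

```python
# Volumetric activity from a gamma-ray peak:
#   A [Bq/L] = N_net / (efficiency * live_time * volume * gamma_yield)
# Real analyses would add decay and self-absorption corrections.
def volumetric_activity(net_counts, efficiency, live_time_s, volume_l, gamma_yield):
    return net_counts / (efficiency * live_time_s * volume_l * gamma_yield)

# Hypothetical counting data for one water sample:
a_pb214 = volumetric_activity(5200, 0.045, 3600, 1.0, 0.356)  # Pb-214, 352 keV line
a_bi214 = volumetric_activity(5600, 0.038, 3600, 1.0, 0.455)  # Bi-214, 609 keV line

# Radioactive equilibrium implies the two volumetric activities should agree.
ratio = a_pb214 / a_bi214
print(a_pb214, a_bi214, ratio)
```

An activity ratio close to 1, as here, is the signature of the equilibrium condition the paper verifies between Pb-214 and Bi-214.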