887 results for Level-Set Method
Abstract:
Aim of the thesis: The aim of this thesis is to describe and evaluate the change process of an organization in which an agency-type organization is being transformed into a state-enterprise form better suited to competitive markets. The thesis examines how the transition from a budget economy to an earnings-based economy affects the management of the organization under study, the attitudes and working practices of its personnel, and the functioning of the organization as a whole. The research problem is formulated as follows: what motivates and commits personnel who worked during the agency era to a change in which the "safe" agency model is gradually being replaced by the uncertainty of competitive markets? Research method: A quantitative survey was chosen as the research strategy and a questionnaire as the data collection method. The theoretical framework is built on earlier research dealing with the effects of change on business operations, commitment to change, change management, the position of the individual in change, and change in the public sector. Results and conclusions: Based on this study, the agency-era personnel of the organization under study are highly committed to their organization and highly motivated to work for its success in the future as well. What most motivates the agency-era personnel to work under the new conditions is the opportunity to learn new things and the sense of well-being produced by successfully completed work. The study also showed that the degree of organizational commitment and the attitude towards the change are directly proportional to organizational position: the higher a person's position in the organization, the more committed to the organization and the more positive towards the change he or she is. The organizational change has been a difficult process in the organization under study and has affected every individual in it. The personnel's workload, the demands placed on their work, internal competition, time pressure and, as a consequence of all of these, stress have increased since the agency era. Personnel resources have also had to be cut as a result of the change. Overall, the personnel regard the organizational change as a fairly positive matter, and it is seen to have improved the efficiency of the organization's operations.
Abstract:
Introduction. Genetic epidemiology focuses on the study of the genetic causes that determine health and disease in populations. To achieve this goal, a common strategy is to explore differences in genetic variability between diseased and non-diseased individuals. The usual markers of genetic variability are single nucleotide polymorphisms (SNPs), changes in a single base of the genome. The usual statistical approach in genetic epidemiology studies is a marginal analysis, in which each SNP is analyzed separately for association with the phenotype. Motivation. It has been observed that, for common diseases, single-SNP analysis is not very powerful for detecting causal genetic variants. In this work, we consider Gene Set Analysis (GSA) as an alternative to standard marginal association approaches. GSA aims to assess the overall association of a set of genetic variants with a phenotype and has the potential to detect subtle effects of variants in a gene or a pathway that might be missed when assessed individually. Objective. We present a new optimized implementation of a pair of gene set analysis methodologies for analyzing the individual evidence of SNPs in biological pathways. We perform a simulation study to explore the power of the proposed methodologies in a set of scenarios with different numbers of causal SNPs and different effect sizes. In addition, we compare the results with the usual single-SNP analysis method. Moreover, we show the advantage of using the proposed gene set approaches in the context of an Alzheimer's disease case-control study in which we explore the Reelin signaling pathway.
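As a hedged illustration of the general idea behind GSA, rather than the authors' optimized implementation, the sketch below aggregates per-SNP trend statistics over a SNP set and assesses the set-level statistic against a phenotype-permutation null; all function names and data shapes are assumptions.

```python
import numpy as np

def snp_trend_stat(genotypes, phenotype):
    """Per-SNP association score: N * r^2 between allele dosage (0/1/2)
    and a binary phenotype, i.e. the Cochran-Armitage trend statistic."""
    g = genotypes - genotypes.mean()
    y = phenotype - phenotype.mean()
    denom = (g * g).sum() * (y * y).sum()
    if denom == 0:                      # monomorphic SNP: no evidence
        return 0.0
    r = (g * y).sum() / np.sqrt(denom)
    return len(phenotype) * r ** 2

def set_statistic(G, y):
    """Aggregate evidence over all SNPs in the set (columns of G)."""
    return sum(snp_trend_stat(G[:, j], y) for j in range(G.shape[1]))

def gsa_pvalue(G, y, n_perm=1000, seed=0):
    """Set-level p-value from a phenotype-permutation null, which keeps
    the LD structure among the SNPs of the set intact."""
    rng = np.random.default_rng(seed)
    observed = set_statistic(G, y)
    null = [set_statistic(G, rng.permutation(y)) for _ in range(n_perm)]
    return (1 + sum(s >= observed for s in null)) / (n_perm + 1)
```

Permuting phenotypes rather than genotypes is the usual design choice here, because it preserves the correlation (LD) among the SNPs of the pathway while breaking any genotype-phenotype association.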
Abstract:
The semiclassical Wigner-Kirkwood ℏ expansion method is used to calculate shell corrections for spherical and deformed nuclei. The expansion is carried out up to fourth order in ℏ. A systematic study of Wigner-Kirkwood averaged energies is presented as a function of the deformation degrees of freedom. The shell corrections, along with the pairing energies obtained by using the Lipkin-Nogami scheme, are used in the microscopic-macroscopic approach to calculate binding energies. The macroscopic part is obtained from a liquid drop formula with six adjustable parameters. Considering a set of 367 spherical nuclei, the liquid drop parameters are adjusted to reproduce the experimental binding energies, which yields a root mean square (rms) deviation of 630 keV. It is shown that the proposed approach is indeed promising for the prediction of nuclear masses.
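For orientation, the ingredients named above fit together as follows (schematic, generic notation rather than the paper's own):

```latex
% Binding energy in the microscopic-macroscopic method:
E(Z,N,\mathrm{def}) \;=\; E_{\mathrm{LD}}(Z,N,\mathrm{def})
      \;+\; \delta E_{\mathrm{shell}} \;+\; E_{\mathrm{pair}},
\qquad
\delta E_{\mathrm{shell}} \;=\; \sum_{i\,\in\,\mathrm{occ}} \varepsilon_i
      \;-\; \widetilde{E}_{\mathrm{WK}} .
% Smooth part from the Wigner-Kirkwood expansion of the partition
% function, truncated at fourth order in hbar as in the abstract:
Z(\beta) \;=\; \mathrm{Tr}\, e^{-\beta\hat H}
     \;=\; Z_{\mathrm{TF}}(\beta)\,\bigl[\,1 + \hbar^{2}\chi_{2}(\beta)
      + \hbar^{4}\chi_{4}(\beta) + \mathcal{O}(\hbar^{6})\,\bigr].
```

Here E_LD is the six-parameter liquid drop energy fitted to the 367 spherical nuclei, E_pair the Lipkin-Nogami pairing energy, and Ẽ_WK the smoothly varying single-particle energy derived from Z(β); χ₂ and χ₄ are the standard ℏ² and ℏ⁴ gradient corrections to the Thomas-Fermi term Z_TF.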
Abstract:
Very large molecular systems can be calculated with the so-called CNDOL approximate Hamiltonians, which have been developed by avoiding oversimplifications and using only a priori parameters and formulas from the simpler NDO methods. A new diagonal monoelectronic term named CNDOL/21 shows great consistency and easier SCF convergence when used together with an appropriate function for charge repulsion energies derived from traditional formulas. Reliable a priori molecular orbitals and electron excitation properties can be obtained after configuration interaction of singly excited determinants, and the Hamiltonian, although simplified, retains its interpretative value. Tests with some unequivocal gas-phase maxima of simple molecules (benzene, furfural, acetaldehyde, hexyl alcohol, methylamine, 2,5-dimethyl-2,4-hexadiene, and ethyl sulfide) confirm the general quality of this approach in comparison with other methods. Calculations on large systems, such as porphine in the gas phase and a model of the complete retinal binding pocket in rhodopsin with 622 basis functions on 280 atoms at the quantum mechanical level, prove reliable, yielding a first allowed transition at 483 nm, very close to the known experimental value of 500 nm for the "dark state." In this important case, our model assigns a central role in this excitation to a charge transfer from the neighboring Glu(-) counterion to the retinaldehyde polyene chain. Tests with gas-phase maxima of some important molecules corroborate the reliability of the CNDOL/2 Hamiltonians.
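For reference, the "configuration interaction of single excited determinants" mentioned above is the standard singlet CIS eigenvalue problem, whose textbook matrix elements (generic, not specific to CNDOL) are:

```latex
\langle \Phi_i^a \,|\, \hat H - E_0 \,|\, \Phi_j^b \rangle
  \;=\; \delta_{ij}\,\delta_{ab}\,(\varepsilon_a - \varepsilon_i)
  \;+\; 2\,(ia|jb) \;-\; (ij|ab),
```

where i, j label occupied and a, b virtual molecular orbitals, ε are orbital energies, and (pq|rs) are two-electron repulsion integrals (which CNDOL approximates with its NDO-type charge repulsion formulas); the lowest eigenvalues give vertical excitation energies such as the 483 nm transition quoted above.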
Abstract:
As the development of integrated circuit technology continues to follow Moore's law, the complexity of circuits increases exponentially. Traditional hardware description languages such as VHDL and Verilog are no longer powerful enough to cope with this level of complexity and do not provide facilities for hardware/software co-design. Languages such as SystemC are intended to solve these problems by combining the expressive power of high-level programming languages with the hardware-oriented facilities of hardware description languages. To fully replace older languages in the design flow of digital systems, SystemC should also be synthesizable. The devices required by modern high-speed networks often share the tight constraints of embedded systems on, for example, size, power consumption and price, but also have very demanding real-time and quality-of-service requirements that are difficult to satisfy with general-purpose processors. Dedicated hardware blocks of an application-specific instruction set processor are one way to combine fast processing speed, energy efficiency, flexibility and a relatively short time-to-market. Common features can be identified in the network processing domain, making it possible to develop specialized but configurable processor architectures. One such architecture is TACO, which is based on the transport triggered architecture (TTA). The architecture offers a high degree of parallelism and modularity, and greatly simplified instruction decoding. For this M.Sc. (Tech.) thesis, a simulation environment for the TACO architecture was developed with SystemC 2.2, using an old version written in SystemC 1.0 as a starting point. The environment enables rapid design space exploration by providing facilities for hardware/software co-design and simulation, together with an extendable library of automatically configured reusable hardware blocks. Other topics covered are the differences between SystemC 1.0 and 2.2 from the viewpoint of hardware modeling, and the compilation of a SystemC model into synthesizable VHDL with the Celoxica Agility SystemC Compiler. As a test case for the environment, a simulation model of a processor for TCP/IP packet validation was designed and tested.
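As a hedged, toy-scale illustration of the transport triggered architecture idea behind TACO, in which instructions only transport data and writing to a trigger port fires an operation, here is a minimal interpreter; it is unrelated to the actual TACO/SystemC code and every name in it is hypothetical.

```python
class FunctionalUnit:
    """Toy TTA functional unit: writing the trigger port fires the op."""
    def __init__(self, op):
        self.op = op          # two-operand function computed by this FU
        self.operand = 0      # plain input register
        self.trigger = 0      # writing here starts the computation
        self.result = 0       # output register, read by later moves

    def write(self, port, value):
        if port == "operand":
            self.operand = value
        elif port == "trigger":         # the transport *triggers* execution
            self.trigger = value
            self.result = self.op(self.operand, self.trigger)

def run(moves, fus, regs):
    """Interpret a move program: each instruction only transports data."""
    for src, (dst_fu, dst_port) in moves:
        value = regs[src] if isinstance(src, str) else getattr(fus[src[0]], src[1])
        fus[dst_fu].write(dst_port, value)

# Compute r2 = r0 + r1 on a toy adder FU over a single move bus.
fus = {"alu": FunctionalUnit(lambda a, b: a + b)}
regs = {"r0": 2, "r1": 3}
run([("r0", ("alu", "operand")),     # move r0 -> alu.operand
     ("r1", ("alu", "trigger"))],    # move r1 -> alu.trigger (fires add)
    fus, regs)
regs["r2"] = fus["alu"].result       # move alu.result -> r2
print(regs["r2"])                    # 5
```

Because programs are just bus transports, adding or removing functional units does not change the instruction format, which is one reason TTA-style designs scale modularly.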
Abstract:
This paper reports the development of a method for the simultaneous determination of methylmercury (MeHg+) and inorganic mercury (iHg) species in seafood samples. The study focused on the extraction and quantification of MeHg+ (the most toxic species) by liquid chromatography coupled to on-line UV irradiation and cold vapour atomic fluorescence spectroscopy (LC-UV-CV-AFS), using 4 mol/L HCl as the extractant. The accuracy of the method was verified by analysing three certified reference materials and several spiked samples. The values found for total Hg and MeHg+ in the CRMs did not differ significantly from the certified values at the 95% confidence level, and spike recoveries between 85% and 97% were achieved for MeHg+. The detection limits (LODs) were 0.001 mg Hg/kg for total mercury, 0.0003 mg Hg/kg for MeHg+ and 0.0004 mg Hg/kg for iHg. The quantification limits (LOQs) were 0.003 mg Hg/kg for total mercury, 0.0010 mg Hg/kg for MeHg+ and 0.0012 mg Hg/kg for iHg. The precision for each mercury species, expressed as RSD, was 12% in all cases. Finally, the developed method was applied to 24 seafood samples of different origins and total mercury contents. The concentrations of total Hg, MeHg+ and iHg ranged from 0.07 to 2.33, 0.003 to 2.23 and 0.006 to 0.085 mg Hg/kg, respectively. The established analytical method yields results for mercury speciation in less than one hour, including both sample pretreatment and measurement.
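The LODs and LOQs quoted above are of the kind conventionally derived from blank variability and calibration sensitivity; a minimal sketch of that standard calculation, with purely illustrative numbers rather than the paper's data:

```python
import numpy as np

def lod_loq(blank_signals, slope):
    """Conventional IUPAC-style limits: LOD = 3*s_blank/slope and
    LOQ = 10*s_blank/slope, where s_blank is the standard deviation of
    replicate blank measurements and slope the calibration sensitivity."""
    s_blank = np.std(blank_signals, ddof=1)
    return 3 * s_blank / slope, 10 * s_blank / slope

# Hypothetical blank replicates (signal units) and a hypothetical
# calibration slope (signal units per mg Hg/kg); values are illustrative.
lod, loq = lod_loq([0.11, 0.13, 0.10, 0.12, 0.11], slope=105.0)
print(f"LOD = {lod:.4f} mg Hg/kg, LOQ = {loq:.4f} mg Hg/kg")
```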
Abstract:
The kinematics of the anatomical shoulder are analysed and modelled as a parallel mechanism similar to a Stewart platform. A new method is proposed to describe the shoulder kinematics with minimal coordinates and to resolve the indeterminacy. The minimal coordinates are defined from bony landmarks and the scapulothoracic kinematic constraints. Independent of one another, they uniquely characterise the shoulder motion. A humanoid mechanism with identical kinematic properties is then proposed. It is shown how minimal coordinates can be obtained for this mechanism and how the coordinates simplify both the motion-planning task and trajectory-tracking control. Lastly, the coordinates are also shown to have an application in the field of biomechanics, where they can be used to model the scapulohumeral rhythm.
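For context on the Stewart-platform analogy, the textbook inverse kinematics of such a parallel mechanism is closed-form; the generic sketch below (not the authors' shoulder-specific coordinates) computes leg lengths from a platform pose.

```python
import numpy as np

def leg_lengths(p, R, base_pts, platform_pts):
    """Inverse kinematics of a Stewart-type parallel mechanism: for a
    platform pose given by position p and rotation matrix R, the length
    of leg i connecting base anchor a_i to platform anchor b_i is
    ||p + R @ b_i - a_i||."""
    return [float(np.linalg.norm(p + R @ b - a))
            for a, b in zip(base_pts, platform_pts)]

# Toy pose: platform lifted 1.0 above the base, no rotation.
p, R = np.array([0.0, 0.0, 1.0]), np.eye(3)
base = [np.array([1.0, 0.0, 0.0]), np.array([-1.0, 0.0, 0.0])]
plat = [np.array([0.5, 0.0, 0.0]), np.array([-0.5, 0.0, 0.0])]
print(leg_lengths(p, R, base, plat))  # [~1.118, ~1.118]
```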
Abstract:
This work falls within the field of the prevention of parental neglect. Based on a theoretical-practical process, the intervention programme follows the line of parent-training programmes and therefore adopts a psycho-educational and community-based dynamic. The work starts from the idea that the family is the basis of human personal development and that the healthy development of minors depends largely on intra-family relationships. Its main objective is to offer a specific, stable and continuous service, linked to child and family care services, that works towards achieving behavioural change in families exhibiting neglectful parenting dynamics. The methodology combines different research techniques. For the documentation work, a bibliographic search was carried out around the concept of parental neglect, the actions undertaken at the institutional level were analysed, and a conceptual framework was established covering both the legal aspects of the different areas of the administration and the concepts that structure an intervention in the field of childhood and family. Direct communication with professionals from the social, health and educational sectors was also taken into account as an important and decisive part of the programme's execution process. The design of the programme follows a strategic planning methodology and includes a preliminary diagnosis, an action plan detailing the different phases of implementation of the proposed intervention, provision for evaluation mechanisms, and a detailed budget. The first part of the work reflects the seriousness and impact of parental neglect in our society, reviewing the historical evolution of the concept and making visible the problems associated with it. It also sets out the need to create programmes aimed at addressing this problem and reviews the legal framework that regulates childcare by public administrations. The second part proposes a specific programme aimed at addressing parental neglect from a perspective of re-education and behavioural change. This intervention project is located in the Can Rull neighbourhood of Sabadell, whose socio-economic specificities and institutional reality are described in detail.
Abstract:
This study compared the efficiency of various sampling materials for the collection and subsequent analysis of organic gunshot residues (OGSR). To the best of our knowledge, this is the first time that sampling devices have been investigated in detail for the subsequent quantitation of OGSR by LC-MS. Seven sampling materials, namely two "swab"-type and five "stub"-type collection materials, were tested. The investigation started with the development of a simple and robust LC-MS method able to separate and quantify molecules typically found in gunpowders, such as diphenylamine or ethylcentralite. The sampling materials were then systematically evaluated by first analysing blank extracts of the materials to check for potential interferences and to determine matrix effects. Based on these results, the best four materials, namely cotton buds, polyester swabs, a tape from 3M and PTFE, were compared in terms of collection efficiency in shooting experiments using a set of 9 mm Luger ammunition. The tape was found to recover the highest amounts of OGSR. As tape-lifting is the technique currently used routinely for inorganic GSR (IGSR), OGSR analysis might be implemented without modifying IGSR sampling and analysis procedures.
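Matrix effects of the kind screened for here are commonly quantified by comparing calibration slopes in blank-extract-spiked standards against solvent standards; a minimal sketch of that conventional calculation, with illustrative values rather than the study's data:

```python
import numpy as np

def matrix_effect_percent(conc, signal_matrix, signal_solvent):
    """ME% = (slope in blank-extract-spiked standards / slope in solvent
    standards - 1) * 100; in LC-MS, negative values indicate ion
    suppression and positive values signal enhancement."""
    slope_m = np.polyfit(conc, signal_matrix, 1)[0]
    slope_s = np.polyfit(conc, signal_solvent, 1)[0]
    return (slope_m / slope_s - 1) * 100

# Hypothetical diphenylamine calibrations (ng/mL vs. peak area):
conc = [1, 5, 10, 50, 100]
print(matrix_effect_percent(conc, [95, 480, 950, 4700, 9400],
                                  [100, 510, 1010, 5050, 10100]))
```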
Abstract:
Objective: To perform a comparative dosimetric analysis, based on computer simulations, of temporary balloon implants with 99mTc and balloon brachytherapy with high-dose-rate (HDR) 192Ir, as boosts to radiotherapy. We hypothesized that the two techniques would produce equivalent doses under pre-established conditions of activity and exposure time. Materials and Methods: Simulations of implants with 99mTc-filled and HDR 192Ir-filled balloons were performed with Siscodes/MCNP5, modeling in voxels a magnetic resonance imaging set from a young female patient. Spatial dose rate distributions were determined. In the dosimetric analysis of the protocols, the exposure time and the level of activity required were specified. Results: The 99mTc balloon presented weighted dose rates in the tumor bed of 0.428 cGy·h⁻¹·mCi⁻¹ at the balloon surface and 0.190 cGy·h⁻¹·mCi⁻¹ at 8-10 mm from the surface, compared with 0.499 and 0.150 cGy·h⁻¹·mCi⁻¹, respectively, for the HDR 192Ir balloon. An exposure time of 24 hours was required for the 99mTc balloon to produce a boost of 10.14 Gy with 1.0 Ci, whereas the HDR 192Ir balloon required only 24 minutes with a 10.0 Ci source to produce a boost of 5.14 Gy at the same reference point, or 10.28 Gy in two 24-minute fractions. Conclusion: Temporary 99mTc balloon implantation is an attractive option for adjuvant radiotherapy in breast cancer because of its availability, economic viability, and dosimetry similar to that of HDR 192Ir balloon implantation, which is the current standard in clinical practice.
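The contrast between the 24-hour 99mTc implant and the 24-minute HDR 192Ir fraction follows from integrating the dose rate over the decay of the source; schematically (generic physics, not the paper's Monte Carlo model):

```latex
D(T) \;=\; \int_0^{T} \dot d \, A_0\, e^{-\lambda t}\,\mathrm{d}t
     \;=\; \frac{\dot d\, A_0}{\lambda}\left(1 - e^{-\lambda T}\right),
\qquad \lambda = \frac{\ln 2}{T_{1/2}},
```

where ḋ is the dose rate per unit activity at the reference point (the cGy·h⁻¹·mCi⁻¹ values above) and A₀ the initial activity. For 99mTc (T½ ≈ 6.0 h) the exponential term matters over a 24-hour implant, whereas for 192Ir (T½ ≈ 73.8 days) the activity is essentially constant during a 24-minute fraction, so D ≈ ḋ A₀ T.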
Abstract:
This paper presents a first analysis of local electronic participatory experiences in Catalonia. The analysis is based on a database constructed and collected by the authors. The paper carries out an explanatory analysis of local initiatives in e-participation and offline participation, taking into account political variables (usually not considered in this kind of analysis) as well as the classical socio-economic variables that characterise municipalities. Hence, we add a quantitative analysis to the numerous case studies on local e-participation experiences. We have chosen Catalonia because it is one of the European regions with the most initiatives and one that has enjoyed considerable local governmental support for citizen participation initiatives since the 1980s. The paper offers a characterisation of these experiences and a first explanatory analysis, considering: i) the institutional context in which these experiences are embedded, ii) the characteristics of the citizen participation processes and mechanisms on-line, and iii) a set of explanatory variables composed of the population size, the political affiliation of the mayor, the electoral abstention rate, and the age, income and level of education in the municipality. The model that we present is explanatory for municipalities with more than 20,000 inhabitants but not for those with fewer than 20,000 inhabitants; indeed, the number of participatory activities developed by the latter municipalities is very low. Among all the variables, population size is the most influential. Political variables such as the political party of the mayor and the local abstention rate have a certain influence, but it has to be controlled for population size.
Abstract:
This article presents an analysis of local participatory experiences in Catalonia, both online and in person. The analysis is based on a database set up by the authors. The article carries out an explanatory analysis of local participatory initiatives (on- and offline) taking into account political variables (not usually considered in this kind of analysis) as well as the classical socio-economic variables that characterize municipalities. Hence, we add a quantitative analysis to the numerous case studies on local e-participation experiences. We have chosen Catalonia because it is one of the European regions with the most initiatives and has seen considerable local government support for citizen participation initiatives since the 1980s. The article offers a characterization of these experiences and an explanatory analysis, considering: (i) the institutional context in which these experiences are embedded, (ii) the citizen participation processes and mechanisms online, and (iii) a set of explanatory variables composed of the population size, the province to which the municipality belongs, the political tendency of the mayor, the electoral abstention rate, and the age, income, level of education, broadband connection and Internet use in the municipality. The model that we present is explanatory for municipalities with more than 20,000 inhabitants but not for those with fewer than 20,000 inhabitants; indeed, the majority of these smaller municipalities have not developed any participatory activities. Among all the variables, population size is the most influential and conditions the influence of other variables, such as the political party of the mayor, the local abstention rate and the province.
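A hedged sketch of the kind of explanatory model described, regressing a count of participatory experiences on the municipal covariates listed above; the column names and the choice of a Poisson count model are assumptions, and statsmodels' formula interface is just one standard way to fit it.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical municipal dataset; file name and columns are assumptions.
df = pd.read_csv("municipalities.csv")

# Count model for the number of participatory experiences per
# municipality, with population size, mayor's party, abstention rate
# and province as covariates, mirroring the variables in the abstract.
model = smf.poisson(
    "n_experiences ~ np.log(population) + C(mayor_party)"
    " + abstention_rate + C(province) + income + education",
    data=df,
).fit()
print(model.summary())
```

Fitting the model separately above and below the 20,000-inhabitant threshold is one way to reproduce the split reported in the abstract, where the covariates are explanatory only for the larger municipalities.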
Abstract:
This thesis deals with a hardware-accelerated Java virtual machine, named REALJava. The REALJava virtual machine is targeted at resource-constrained embedded systems. The goal is to attain increased computational performance with reduced power consumption. While these objectives are often seen as trade-offs, in this context both can be attained simultaneously by using dedicated hardware. The target level for the computational performance of the REALJava virtual machine is initially set to be as fast as the currently available full-custom ASIC Java processors. As a secondary goal, all components of the virtual machine are designed so that the resulting system can be scaled to support multiple co-processor cores. The virtual machine is designed using the hardware/software co-design paradigm. The partitioning between the two domains is flexible, allowing the resulting system to be customized; for instance, floating point support can be omitted from the hardware in order to decrease the size of the co-processor core. The communication between the hardware and software domains is encapsulated into modules. This allows the REALJava virtual machine to be easily integrated into any system simply by redesigning the communication modules. Besides the virtual machine and the related co-processor architecture, several performance-enhancing techniques are presented. These include techniques related to instruction folding, stack handling, method invocation, constant loading and control in the time domain. The REALJava virtual machine is prototyped on three different FPGA platforms. The original pipeline structure is modified to suit the FPGA environment. The performance of the resulting Java virtual machine is evaluated against existing Java solutions in the embedded systems field. The results show that the goals are attained, both in terms of computational performance and power consumption. The computational performance in particular is evaluated thoroughly, and the results show that REALJava is more than twice as fast as the fastest full-custom ASIC Java processor. In addition to standard Java virtual machine benchmarks, several new Java applications are designed both to verify the results and to broaden the spectrum of the tests.
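Instruction folding, one of the techniques listed, collapses common stack-shuffling bytecode sequences into a single internal operation; the toy pass below illustrates the general idea only and is not REALJava's actual folding logic.

```python
# Fold the canonical 4-instruction stack pattern
#   load A; load B; <binary op>; store C
# into one register-style internal operation, removing the intermediate
# stack traffic a naive stack machine would execute.
FOLDABLE_OPS = {"iadd", "isub", "imul"}

def fold(bytecodes):
    folded, i = [], 0
    while i < len(bytecodes):
        w = bytecodes[i:i + 4]
        if (len(w) == 4
                and w[0][0] == "iload" and w[1][0] == "iload"
                and w[2][0] in FOLDABLE_OPS and w[3][0] == "istore"):
            # four stack instructions become one folded three-register op
            folded.append((w[2][0] + "_rrr", w[0][1], w[1][1], w[3][1]))
            i += 4
        else:
            folded.append(bytecodes[i])
            i += 1
    return folded

# (iload 1; iload 2; iadd; istore 3)  ->  ('iadd_rrr', 1, 2, 3)
prog = [("iload", 1), ("iload", 2), ("iadd", None), ("istore", 3)]
print(fold(prog))
```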
Abstract:
Construction of multiple sequence alignments is a fundamental task in bioinformatics. Multiple sequence alignments are used as a prerequisite in many bioinformatics methods, and consequently the quality of such methods can depend critically on the quality of the alignment. However, automatic construction of a multiple sequence alignment for a set of remotely related sequences does not always provide biologically relevant alignments. Therefore, there is a need for an objective approach to evaluating the quality of automatically aligned sequences. The profile hidden Markov model is a powerful approach in comparative genomics. In the profile hidden Markov model, the symbol probabilities are estimated at each conserved alignment position. This can increase the dimension of the parameter space and cause an overfitting problem. These two research problems are both related to conservation. We have developed statistical measures for quantifying the conservation of multiple sequence alignments. Two types of methods are considered: those identifying conserved residues in an alignment position, and those calculating positional conservation scores. The positional conservation score was exploited in a statistical prediction model for assessing the quality of multiple sequence alignments. The residue conservation score was used as part of the emission probability estimation method proposed for profile hidden Markov models. The predicted alignment quality scores correlated highly with the true alignment quality scores, indicating that our method is reliable for assessing the quality of any multiple sequence alignment. The comparison of the emission probability estimation method with the maximum likelihood method showed that the number of estimated parameters in the model decreased dramatically, while the same level of accuracy was maintained. To conclude, we have shown that conservation can be successfully used in the statistical model for alignment quality assessment and in the estimation of emission probabilities in profile hidden Markov models.
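A minimal sketch of an entropy-based positional conservation score of the kind described; the authors' exact measures are not reproduced here, this is the common normalized Shannon entropy variant.

```python
import math
from collections import Counter

def positional_conservation(column, alphabet_size=20):
    """Score one alignment column in [0, 1]: 1 = perfectly conserved,
    0 = residues uniformly distributed over the alphabet. Gaps are
    simply ignored here, one of several possible conventions."""
    residues = [r for r in column if r != "-"]
    if not residues:
        return 0.0
    counts = Counter(residues)
    n = len(residues)
    entropy = -sum((c / n) * math.log2(c / n) for c in counts.values())
    return 1.0 - entropy / math.log2(alphabet_size)

# Toy alignment, one string per sequence; columns scored independently.
msa = ["MKLVA", "MKIVA", "MRLV-", "MKLV-"]
for i, col in enumerate(zip(*msa)):
    print(i, "".join(col), round(positional_conservation(col), 2))
```

A column-level score like this can then feed a position-weighted alignment quality predictor or, as in the emission probability application above, down-weight poorly conserved positions when estimating symbol probabilities.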