660 results for MATHEMATICAL PROGRAMS


Relevance: 20.00%

Abstract:

Biller-Andorno and Jüni (2014), in a widely debated commentary published in the May 22 issue of the New England Journal of Medicine, accept the estimate that mammography every 2 years from age 50 can decrease breast cancer mortality by 20%, that is, from five to four deaths per 1000 women over a 10-year period. Both the absolute and the relative risk of breast cancer death may vary depending on the baseline mortality rates in different populations and on the impact of screening mammography in reducing breast cancer mortality, which may well vary around the 20% estimate adopted. We accept, therefore, that there are still uncertainties in the absolute and relative impact of mammography screening on breast cancer mortality, given the different study designs and mammography intervals, the differences in populations, and the continuous improvements in technology (Warner, 2011; Independent UK Panel on Breast Cancer Screening, 2012). We also agree with the observation that mammography has an appreciable impact on breast cancer mortality (Bosetti et al., 2012), but clearly a much smaller one on total mortality.
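The 20% figure above is a relative risk reduction; the corresponding absolute reduction is much smaller. A minimal sketch of the arithmetic, using only the 5-in-1000 and 4-in-1000 figures cited in the commentary:

```python
# Breast cancer deaths per 1000 women over 10 years, per the cited figures.
deaths_without_screening = 5 / 1000
deaths_with_screening = 4 / 1000

# Relative risk reduction: 1 - 4/5 = 20%.
rrr = 1 - deaths_with_screening / deaths_without_screening

# Absolute risk reduction: 1 death per 1000 women (0.1 percentage points).
arr = deaths_without_screening - deaths_with_screening

# Number needed to screen for 10 years to avert one breast cancer death.
nns = 1 / arr
```

The same 20% relative reduction thus corresponds to an absolute reduction of only 0.1 percentage points, which is why the impact on total mortality is so much smaller.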


This master's thesis examines threaded programming at the upper hierarchy level of parallel programming, focusing in particular on Hyper-Threading technology. The thesis reviews the advantages and disadvantages of Hyper-Threading and its effects on parallel algorithms. The goal of the work was to understand the Hyper-Threading implementation of the Intel Pentium 4 processor and to enable its exploitation wherever it yields a performance benefit. Performance data was collected and analysed by running a large set of benchmarks under varying conditions (memory handling, compiler settings, environment variables, etc.). Two types of algorithms were examined: matrix operations and sorting. These applications have a regular memory-access pattern, which is a double-edged sword: it is an advantage in arithmetic-logic processing, but it also degrades memory performance. The reason is that modern processors have excellent raw performance when processing regular data, whereas the memory architecture is constrained by cache sizes and multiple buffers. Once the problem size exceeds a certain limit, the actual performance can drop to a fraction of the peak performance.
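As an illustration of the thread-level decomposition that Hyper-Threading is designed to exploit, the sketch below splits a dot product (the kernel of the matrix operations studied above) across worker threads. This is an illustrative Python sketch, not the thesis's benchmark code; note that CPython's global interpreter lock prevents an actual speed-up here, so only the decomposition pattern is shown.

```python
from concurrent.futures import ThreadPoolExecutor

def partial_dot(args):
    a, b, lo, hi = args
    # Each worker handles one contiguous slice -- the regular
    # memory-access pattern discussed above.
    return sum(a[i] * b[i] for i in range(lo, hi))

def threaded_dot(a, b, n_threads=2):
    n = len(a)
    step = (n + n_threads - 1) // n_threads
    chunks = [(a, b, i, min(i + step, n)) for i in range(0, n, step)]
    with ThreadPoolExecutor(max_workers=n_threads) as ex:
        # Combine the per-thread partial sums.
        return sum(ex.map(partial_dot, chunks))
```

In a compiled language with hardware threads, each chunk would run on its own logical core; the cache-size cliff described above shows up once the chunks no longer fit in the caches.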


This research examines the impacts of the Swiss reform of the allocation of tasks, which was accepted in 2004 and implemented in 2008 to "re-assign" responsibilities between the federal government and the cantons. The public tasks were redistributed according to the leading and fundamental principle of subsidiarity. Seven tasks came under exclusive federal responsibility; ten came under the control of the cantons; and twenty-two "common tasks" were allocated to both the Confederation and the cantons. For these common tasks it was not possible to separate management from implementation. To deal with nineteen of them, the reform introduced conventions-programs (CPs), public-law contracts signed by the Confederation with each canton. These CPs are generally valid for periods of four years (2008-11, 2012-15 and 2016-19, respectively); the third period is currently being prepared. Using principal-agent theory, I examine how contracts can improve political relations between a principal (the Confederation) and an agent (a canton). I also provide a first qualitative analysis of the impacts of these contracts on vertical cooperation and on the involvement of different actors, focusing on five CPs - protection of cultural heritage and conservation of historic monuments, encouragement of the integration of foreigners, economic development, protection against noise, and protection of nature and landscape - applied in five cantons, yielding twenty-five case studies.


In this paper we show how nonlinear preprocessing of speech signals with high noise levels, based on morphological filters, improves the performance of robust algorithms for pitch tracking (RAPT). This result is obtained with a very simple morphological filter; more sophisticated filters could improve the results further. Mathematical morphology is widely used in image processing and has a great number of applications. Almost all of its formulations derived in the two-dimensional framework are easily reformulated for the one-dimensional context.
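The one-dimensional case mentioned at the end is easy to make concrete. The fragment below implements erosion, dilation, and opening with a flat structuring element; the paper's actual filter is not specified here, so this is only a minimal illustrative sketch. An opening removes narrow positive spikes (e.g. impulsive noise) while preserving wider plateaus, which is why such preprocessing can help a pitch tracker like RAPT.

```python
def erode(x, k=3):
    # Flat structuring element of length k: sliding-window minimum
    # (window clamped at the signal borders).
    h = k // 2
    return [min(x[max(0, i - h):i + h + 1]) for i in range(len(x))]

def dilate(x, k=3):
    # Sliding-window maximum with the same flat structuring element.
    h = k // 2
    return [max(x[max(0, i - h):i + h + 1]) for i in range(len(x))]

def opening(x, k=3):
    # Morphological opening: erosion followed by dilation.
    # Removes positive features narrower than the structuring element.
    return dilate(erode(x, k), k)
```

For example, a single-sample spike of height 5 is removed by a length-3 opening, while a three-sample plateau survives intact.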


This work presents new, efficient Markov chain Monte Carlo (MCMC) simulation methods for statistical analysis in various modelling applications. When MCMC methods are used, the model is simulated repeatedly to explore the probability distribution describing the uncertainties in model parameters and predictions. In adaptive MCMC methods based on the Metropolis-Hastings algorithm, the proposal distribution needed by the algorithm learns from the target distribution as the simulation proceeds. Adaptive MCMC methods have been the subject of intensive research lately, as they open the way to substantially easier use of the methodology; the lack of user-friendly computer programs has been a main obstacle to wider acceptance of the methods. This work provides two new adaptive MCMC methods: DRAM and AARJ. The DRAM method has been built especially to work in high-dimensional and non-linear problems. The AARJ method is an extension of DRAM for model selection problems, where the mathematical formulation of the model is uncertain and we want to fit several different models to the same observations simultaneously. The methods were developed with the needs of modelling applications typical of the environmental sciences in mind, and the development work was pursued in several application projects. The applications presented in this work are: a winter-time oxygen concentration model for Lake Tuusulanjärvi and adaptive control of the aerator; a nutrition model for Lake Pyhäjärvi and lake management planning; and validation of the algorithms of the GOMOS ozone remote sensing instrument on board the European Space Agency's Envisat satellite, together with a study of the effects of aerosol model selection on the GOMOS algorithm.
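The adaptation idea can be illustrated with a stripped-down sketch: a one-dimensional random-walk Metropolis sampler whose proposal scale is tuned from the chain's own running variance. This is not the DRAM or AARJ algorithm of the work itself, only the basic adaptive-Metropolis principle they build on, with a standard normal as a toy target.

```python
import math
import random

def log_target(x):
    # Unnormalised log-density of a standard normal toy target.
    return -0.5 * x * x

def adaptive_metropolis(n_iter=2000, seed=0):
    random.seed(seed)
    x, scale, accepted = 0.0, 1.0, 0
    chain = [x]
    mean, m2 = 0.0, 0.0  # Welford running mean / sum of squared deviations
    for i in range(1, n_iter + 1):
        proposal = x + random.gauss(0.0, scale)
        # Metropolis accept/reject step.
        if random.random() < math.exp(min(0.0, log_target(proposal) - log_target(x))):
            x = proposal
            accepted += 1
        chain.append(x)
        delta = x - mean
        mean += delta / i
        m2 += delta * (x - mean)
        # After a short burn-in, tune the proposal scale to the chain's
        # empirical spread (2.38 is the classic 1-D optimal-scaling factor).
        if i > 100:
            scale = 2.38 * math.sqrt(m2 / i) + 1e-6
    return chain, accepted / n_iter
```

In the real DRAM method the proposal is a full covariance matrix adapted in the same spirit, with an additional delayed-rejection stage when a proposal is refused.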


The purpose of this research was to conduct a repeated cross-sectional study of class teachers who were studying in their 4th year or had graduated from the Faculty of Education, University of Turku, between 2000 and 2004. Specifically, seven research questions targeted the main purpose of the study: How do senior students and graduates of the class teacher education master's degree programme rate the importance, effectiveness, and quality of the training they received at the Faculty of Education? Are there significant differences in overall ratings of importance, effectiveness, and quality of training by year of graduation, sex, and age (for graduates) and by sex and age (for senior students)? Is there a significant relationship between respondents' overall ratings of importance and effectiveness and their overall ratings of the quality of the training and preparation they received? Are there significant differences between graduates and senior students regarding the importance, effectiveness, and quality of the teacher education programs? And what do teachers (graduates) believe about how increasing work experience has changed their opinions of their preservice training? In addition, the following concepts related to instructional activities were studied: critical thinking skills, communication skills, attention to ethics, curriculum and instruction (planning), the role of teacher and teaching knowledge, assessment skills, attention to continuous professional development, subject matter knowledge, knowledge of the learning environment, and the use of educational technology. The researcher also examined the influence of moderator variables, e.g. year of graduation, sex, and age, on the dependent and independent variables. The study used two questionnaires (a structured Likert-scale questionnaire and an open-ended questionnaire).
The population in study 1 comprised all senior students and all 2000-2004 graduates of the class teacher education master's degree programme at the Department of Teacher Education, Faculty of Education, University of Turku. Of the 1020 students and graduates, the researcher was able to find current addresses for 675 subjects, and of the 675 contacted, 439 (66.2 percent) responded to the survey. The population in study 2 comprised all class teachers who graduated from the University of Turku and now work in basic schools (59 schools) in South-West Finland; 257 teachers answered the open-ended web-based questions. SPSS was used to produce standard deviations, analysis of variance, Pearson product-moment correlations (r), t-tests, ANOVA with Bonferroni post-hoc tests, and polynomial contrast tests to analyse linear trends. An alpha level of .05 was used to determine statistical significance. The results showed that a majority of the respondents (graduates and students) rated the overall importance, effectiveness, and quality of the teacher education programs as important, effective, and good. Generally speaking, there were only a few significant differences between the cohorts and groups related to the background variables (gender, age). The different cohorts rated the quality of the programs very similarly, but some differences between cohorts were found in the importance and effectiveness ratings. Graduates of 2001 and 2002 rated the importance of the program significantly higher than 2000 graduates. The effectiveness of the programs was rated significantly higher by 2001 and 2003 graduates than by other groups. In spite of these individual differences between cohorts, there were no linear trends across the year cohorts in any measure. In ratings of the effectiveness of the teacher education programs there was a significant difference between males and females: females rated it higher than males.
There were no significant differences between males' and females' ratings of the importance and quality of the programs. There was only one difference between age groups: older graduates (35 years or older) rated the importance of the teacher training significantly higher than 25-35-year-old graduates. In graduates' ratings there were positive but relatively low correlations among all variables related to the importance, effectiveness, and quality of the teacher education programs. Generally speaking, students' ratings of the importance, effectiveness, and quality of the teacher education program were very positive. There was only one significant difference related to the background variables: females rated the effectiveness of the program higher. The comparison of students' and graduates' perceptions showed no significant differences between graduates and students in the overall ratings, though there were differences in some individual variables. Students gave higher ratings to the importance of "Continuous Professional Development", the effectiveness of "Critical Thinking Skills" and "Using Educational Technology", and the quality of "Advice received from the advisor". Graduates gave higher ratings to the importance of "Knowledge of Learning Environment" and the effectiveness of "Continuous Professional Development". According to the qualitative data of study 2, some graduates said their perceptions of the importance, effectiveness, and quality of the training they received had not changed since their study time. They pointed out that the teacher education programs had provided them with basic theoretical/formal knowledge and some training in practical routines. However, a majority of the teachers seemed to have somewhat critical opinions about the teacher education.
These teachers were not satisfied with the teacher education programs, arguing that the programs failed to meet their practical demands in everyday classroom situations, e.g. coping with students' learning difficulties, multiprofessional communication with parents and other professional groups (psychologists and social workers), and classroom management problems. Participants also called for more practice-oriented knowledge of subject matter, evaluation methods, and teachers' rights and responsibilities. Therefore, 54.1% of participants suggested that teacher education departments should provide more practice-based courses and programs, as well as closer collaboration between regular schools and teacher education departments, in order to fill the gap between theory and practice.
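The group comparisons above (t-tests at α = .05) rest on simple statistics that are easy to make concrete. As a hedged illustration, not the study's actual SPSS computation, the sketch below computes Welch's t statistic for two independent groups with possibly unequal variances:

```python
import math
from statistics import mean, variance

def welch_t(x, y):
    # Welch's t statistic for two independent samples;
    # variance() is the sample (n-1) variance.
    vx, vy = variance(x), variance(y)
    return (mean(x) - mean(y)) / math.sqrt(vx / len(x) + vy / len(y))
```

The resulting statistic is then compared against the t distribution with Welch-Satterthwaite degrees of freedom to obtain the p-value checked against α = .05.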


Business process improvement is a common approach to increasing the effectiveness of an organization. It can be seen as an effort to increase coordination between units. Process improvement has proved to be challenging, and most management consulting firms facilitate organizations in these kinds of initiatives. Cross-functional improvement is also one of the main areas for internal consultants. However, the needs, challenges, and means of cross-functional help have rarely been discussed in the literature. The objective of this thesis is, on the one hand, to present a conceptual and descriptive framework for understanding the challenges of facilitating coordination improvement efforts in cross-functional improvement programs, and, on the other hand, to develop and test feasible solutions for some facilitation situations. The research questions are: 1. Why, and in what kinds of situations, do organizations need help in developing coordination in cross-functional processes? 2. How can a facilitator help organizations improve coordination to develop cross-functional processes? The study consists of two parts: an overview of the dissertation and six research publications. The theoretical background for the study comprises the differentiation that causes challenges in cross-functional settings, the coordination needed to improve processes, change management principles, methods and tools, and consultation practices. Three of the publications introduce tools for helping to develop prerequisites, plan responsibilities, and support learning during a cross-functional program. The three other papers present frameworks for understanding and analysing the improvement situation. The main methodological approaches used in this study are design science research, action research, and case research. The research data has been collected from ten cases representing different kinds of organizations, processes, and development situations.
The data has been collected mainly through observation, semi-structured interviews, and questionnaires. The research contributes to the sparse literature combining coordination theories and process improvement practices. It also provides additional understanding of a holistic point of view in process improvement situations. The most important contribution is the addition to theories of facilitating change in process improvement situations. From a managerial point of view, this study gives advice to managers and consultants in planning and executing cross-functional programs. The main factors increasing the need for facilitation are the challenges of differentiation, the challenges of organizational change in general, and the novelty of the initiatives and improvement practices concerning process development. Organizations need help in creating the prerequisites for change, planning initiatives, easing conflict management and collaboration between groups, and supporting the learning of cross-functional improvement. The main challenges of facilitation are combining the consultant's different roles, keeping ownership of the improvement project with the client, and supporting learning in the client organization.


Computed tomography (CT) is an imaging technique in which interest has grown rapidly since its introduction in the 1970s. Today, it has become an extensively used modality because of its ability to produce accurate diagnostic images. However, even if a direct benefit to patient healthcare is attributed to CT, the dramatic increase in the number of CT examinations performed has raised concerns about the potential negative effects of ionising radiation on the population. Among those negative effects, one of the major remaining risks is the development of cancers associated with exposure to diagnostic X-ray procedures. In order to ensure that the benefit-risk ratio still remains in favour of the patient, it is necessary to make sure that the delivered dose leads to the proper diagnosis without producing unnecessarily high-quality images. This optimisation scheme is already an important concern for adult patients, but it must become an even greater priority when examinations are performed on children or young adults, in particular in follow-up studies which require several CT procedures over the patient's life. Indeed, children and young adults are more sensitive to radiation due to their faster metabolism.
In addition, harmful consequences have a higher probability of occurring because of a younger patient's longer life expectancy. The recent introduction of iterative reconstruction algorithms, designed to substantially reduce dose, is certainly a major achievement in CT evolution, but it has also created difficulties in assessing the quality of the images produced with those algorithms. The goal of the present work was to propose a strategy for investigating the potential of iterative reconstructions to reduce dose without compromising the ability to answer the diagnostic questions. The major difficulty lies in having a clinically relevant way to estimate image quality. To ensure the choice of pertinent image quality criteria, this work was performed in close collaboration with radiologists. The work began by tackling how to characterise image quality in musculoskeletal examinations. We focused, in particular, on the behaviour of image noise and spatial resolution when iterative image reconstruction was used. The analyses of these physical parameters allowed radiologists to adapt their image acquisition and reconstruction protocols while knowing what loss of image quality to expect. This work also dealt with the loss of low-contrast detectability associated with dose reduction, a major concern in abdominal investigations. Knowing that alternatives to classical Fourier-space metrics had to be used to assess image quality, we focused on the use of mathematical model observers. Our experimental parameters determined the type of model to use.
Ideal model observers were applied to characterise image quality when purely objective results about signal detectability were sought, whereas anthropomorphic model observers were used in a more clinical context, when the results had to be compared with those of human observers, taking advantage of their incorporation of human visual system elements. This work confirmed that the use of model observers makes it possible to assess image quality with a task-based approach, which, in turn, establishes a bridge between medical physicists and radiologists. It also demonstrated that statistical iterative reconstructions have the potential to reduce the delivered dose without impairing the quality of the diagnosis. Among the different types of iterative reconstructions, model-based ones offer the greatest potential, since images produced with this approach can still lead to an accurate diagnosis even when acquired at very low dose. This work has clarified the role of medical physicists in CT imaging: standard metrics remain important for assessing a unit's compliance with legal requirements, but model observers are the way forward for optimising imaging protocols.
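The model-observer idea can be made concrete with a small sketch. The code below implements a non-prewhitening matched-filter observer on synthetic white noise, one of the simplest model observers, and estimates the detectability index d'. The signal, noise level, and trial count are illustrative choices, not values from the study.

```python
import math
import random

def npw_dprime(signal, sigma=1.0, n_trials=400, seed=1):
    """Non-prewhitening matched filter: the test statistic is the
    dot product of the known signal template with the noisy image."""
    random.seed(seed)

    def statistic(present):
        image = [(s if present else 0.0) + random.gauss(0.0, sigma)
                 for s in signal]
        return sum(t * g for t, g in zip(signal, image))

    sp = [statistic(True) for _ in range(n_trials)]   # signal-present trials
    sa = [statistic(False) for _ in range(n_trials)]  # signal-absent trials
    m1, m0 = sum(sp) / n_trials, sum(sa) / n_trials
    pooled_var = (sum((v - m1) ** 2 for v in sp)
                  + sum((v - m0) ** 2 for v in sa)) / (2 * n_trials - 2)
    # Detectability index: separation of the two statistic distributions.
    return (m1 - m0) / math.sqrt(pooled_var)
```

For white noise the theoretical value is d' = ||s|| / sigma, so a nine-sample template of amplitude 2 at sigma = 1 should give an estimate near 6; dose reduction in CT corresponds to raising sigma and watching d' fall.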



The symbolic regression problem consists of learning an unknown function from a sample set of experimentally obtained data. Evolutionary methods have proved their efficiency in solving instances of this problem. This project proposes a new evolutionary strategy, via genetic algorithms, based on a new data structure called the Straight Line Program (SLP), which in this case represents symbolic expressions. Starting from a universal SLP, which depends on a set of parameters whose specialisation yields concrete SLPs of the search space, the strategy tries to find the optimal parameters so that the universal SLP represents the function that best approximates the set of sample points. Conceptually, this project consists of genetic training of the universal SLP, using the sample points as the training set, to solve the symbolic regression problem.
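To make the idea concrete, the sketch below runs a toy version of the strategy: instead of a full universal SLP, the "program" is fixed to the expression a*x + b, and a genetic algorithm searches for the parameters (a, b) that minimise the squared error on the sample points. All names and settings are illustrative, not from the project.

```python
import random

def fitness(ind, samples):
    a, b = ind
    # Sum of squared errors of the candidate expression a*x + b.
    return sum((a * x + b - y) ** 2 for x, y in samples)

def evolve(samples, pop_size=30, generations=60, seed=0):
    random.seed(seed)
    pop = [(random.uniform(-5, 5), random.uniform(-5, 5))
           for _ in range(pop_size)]
    best = min(pop, key=lambda ind: fitness(ind, samples))
    for _ in range(generations):
        nxt = [best]  # elitism: the best individual always survives
        while len(nxt) < pop_size:
            # Tournament selection of two parents.
            p1 = min(random.sample(pop, 3), key=lambda i: fitness(i, samples))
            p2 = min(random.sample(pop, 3), key=lambda i: fitness(i, samples))
            # Arithmetic crossover plus Gaussian mutation of each parameter.
            w = random.random()
            child = tuple(w * u + (1 - w) * v + random.gauss(0.0, 0.3)
                          for u, v in zip(p1, p2))
            nxt.append(child)
        pop = nxt
        best = min(pop, key=lambda ind: fitness(ind, samples))
    return best
```

The real strategy evolves the parameters of a universal SLP, so the search space covers many expression shapes rather than a single linear form, but the training loop has the same structure.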


Coating and filler pigments strongly influence the properties of paper. Filler content can exceed 30%, and the pigment content of a coating is about 85-95 weight percent. The physical and chemical properties of pigments differ, and knowledge of these properties is important for optimising the optical and printing properties of paper. The size and shape of pigment particles can be measured with different analysers based on sedimentation, laser diffraction, changes in an electric field, etc. This master's thesis investigated particle properties, in particular by scanning electron microscopy (SEM) and image analysis programs. The research included nine pigments with different particle sizes and shapes. The pigments were analysed with two image analysis programs (INCA Feature and Poikki), a Coulter LS230 (laser diffraction) and a SediGraph 5100 (sedimentation). The results were compared to see how particle shape affects the performance of the analysers. Only the image analysis programs gave parameters describing particle shape. One part of the research was also sample preparation for SEM: in an ideal sample, individual particles are separated and distinct. The analysis methods gave different results, but the results from the image analysis programs corresponded to either sedimentation or laser diffraction, depending on the particle shape. Detailed analysis of particle shape required high magnification in the SEM, but the measured parameters described the shape of the particles very well. Large particles (ecd ~1 µm) could also be used in 3D modelling, which enabled measurement of particle thickness. The scanning electron microscope and image analysis programs proved to be effective and versatile tools for particle analysis. Further development and experience will determine the usability of the method in routine use.
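Two of the quantities behind such image-analysis measurements are easy to state exactly. The sketch below computes the equivalent circular diameter (ecd, as used above) and a common circularity shape factor from a particle's projected area and perimeter; it illustrates the definitions only, not the INCA Feature or Poikki implementations.

```python
import math

def ecd(area):
    # Equivalent circular diameter: diameter of a circle of equal area.
    return 2.0 * math.sqrt(area / math.pi)

def circularity(area, perimeter):
    # 4*pi*A / P^2 -- equals 1 for a perfect circle, < 1 for any
    # elongated or irregular particle outline.
    return 4.0 * math.pi * area / perimeter ** 2
```

A circle of radius 1 gives ecd = 2 and circularity = 1, while a square of side 2 gives circularity π/4 ≈ 0.785, showing how the factor flags non-round particles.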


Neural networks are a set of mathematical methods and computer programs designed to simulate the information processing and knowledge acquisition of the human brain. In recent years their application in chemistry has increased significantly, owing to their suitability for modelling complex systems. The basic principles of two types of neural networks, multi-layer perceptrons and radial basis function networks, are introduced, as well as a pruning approach to architecture optimization. Two analytical applications based on near-infrared spectroscopy are presented: the first for the determination of nitrogen content in wheat leaves using multi-layer perceptron networks, and the second for the determination of Brix in sugar cane juices using radial basis function networks.
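As a hedged sketch of the radial basis function approach mentioned above (not the networks used in the applications), the code below builds an exact-interpolation RBF network: one Gaussian unit per training point, with the output weights obtained by solving the resulting square linear system.

```python
import math

def rbf_features(x, centers, width=1.0):
    # Gaussian basis function responses of the hidden layer.
    return [math.exp(-((x - c) / width) ** 2) for c in centers]

def solve(A, b):
    # Gaussian elimination with partial pivoting for the small square system.
    n = len(A)
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    w = [0.0] * n
    for r in range(n - 1, -1, -1):
        w[r] = (M[r][n] - sum(M[r][c] * w[c] for c in range(r + 1, n))) / M[r][r]
    return w

def fit_rbf(xs, ys, width=1.0):
    # One centre per sample point -> square interpolation system.
    A = [rbf_features(x, xs, width) for x in xs]
    return solve(A, ys)

def predict(w, x, centers, width=1.0):
    # Output layer: weighted sum of the basis responses.
    return sum(wi * phi for wi, phi in zip(w, rbf_features(x, centers, width)))
```

In practical calibration work (as in the Brix application) one would use far fewer centres than samples and fit the weights by least squares, trading exact interpolation for smoothness, but the hidden-layer/linear-output structure is the same.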