954 results for Suppliers selection problem


Relevance:

20.00%

Publisher:

Abstract:

A forward solver based on the Finite Element Method (FEM) is developed for the forward problem of two-dimensional electrical impedance tomography (EIT). The method of weighted residuals with a Galerkin approach is used for the FEM formulation of the EIT forward problem. The algorithm is written in MATLAB 7.0, and the forward problem is studied with a practical biological phantom developed for this purpose. The EIT governing equation is solved numerically to calculate the surface potentials at the phantom boundary for a uniform conductivity. An EIT phantom is developed with an array of 16 electrodes placed on the inner surface of a phantom tank filled with KCl solution. A sinusoidal current is injected through the current electrodes, and the differential potentials across the voltage electrodes are measured. The measured data are compared with the differential potentials calculated for the known current and solution conductivity. By comparing the measured voltages with the calculated data, we attempt to identify the sources of error in order to improve data quality for better image reconstruction.
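The abstract does not write out the formulation; for reference, the standard EIT governing equation with Neumann boundary data, and the Galerkin weak form that leads to the FEM system, are

\[ \nabla \cdot (\sigma \nabla \phi) = 0 \quad \text{in } \Omega, \qquad \sigma \frac{\partial \phi}{\partial n} = J \quad \text{on } \partial\Omega, \]

\[ \int_{\Omega} \sigma \, \nabla \phi \cdot \nabla v \, d\Omega = \oint_{\partial\Omega} J v \, ds \quad \text{for all test functions } v. \]

Expanding \( \phi \approx \sum_j \phi_j N_j \) over the element shape functions and taking \( v = N_i \) (the Galerkin choice) gives the linear system \( K\boldsymbol{\phi} = \mathbf{b} \) with \( K_{ij} = \int_\Omega \sigma \nabla N_i \cdot \nabla N_j \, d\Omega \), which the solver assembles and solves for the boundary potentials.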

Relevance:

20.00%

Publisher:

Abstract:

This study sought to assess the extent to which the entry characteristics of students in a graduate-entry medical programme predict the subsequent development of clinical reasoning ability. Subjects comprised 290 students voluntarily recruited from three successive cohorts of the University of Queensland's MBBS programme. Clinical reasoning was measured once a year over a period of three years using two methods: a set of 10 Clinical Reasoning Problems (CRPs) and the Diagnostic Thinking Inventory (DTI). Data on gender, age at entry into the programme, nature of primary degree, scores on selection criteria (written examination plus interview) and academic performance in the first two years of the programme were recorded for each student, and their association with clinical reasoning skill was analysed using univariate and multivariate analysis. Univariate analysis indicated significant associations of CRP score with gender and primary degree, with a significant but small association between DTI and interview score. Stage of progression through the programme was also an important predictor of performance on both indicators. Subsequent multivariate analysis suggested that female gender is a positive predictor of CRP score independently of the nature of a subject's primary degree and stage of progression through the programme, although these latter two variables are interdependent. Positive predictors of clinical reasoning skill are stage of progression through the MBBS programme, female gender and interview score. Although the nature of a student's primary degree is important in the early years of the programme, the evidence suggests that by graduation, differences in students' clinical reasoning skill due to this factor have been resolved.

Relevance:

20.00%

Publisher:

Abstract:

Increasing numbers of medical schools in Australia and overseas have moved away from didactic teaching methodologies and embraced problem-based learning (PBL) to improve clinical reasoning and communication skills and to encourage self-directed lifelong learning. In January 2005, the first cohort of students entered the new MBBS program at the Griffith University School of Medicine, Gold Coast, to embark upon a fully integrated curriculum using PBL, combining electronic delivery, communication and evaluation systems that incorporate the cognitive principles underpinning the PBL process. This chapter examines the educational philosophies and the design of the e-learning environment underpinning the processes developed to deliver, monitor and evaluate the curriculum. The key initiatives taken to promote student engagement and innovative, distinctive approaches to student learning at Griffith, promoted within the conceptual model for the curriculum, are (a) student engagement, (b) pastoral care, (c) staff engagement, (d) monitoring and (e) curriculum/program review. © 2007 Springer-Verlag Berlin Heidelberg.

Relevance:

20.00%

Publisher:

Abstract:

In this study, we investigate the qualitative and quantitative effects of an R&D subsidy for a clean technology and a Pigouvian tax on a dirty technology on environmental R&D when it is uncertain how long the research will take to complete. The model is formulated as an optimal stopping problem in which the number of successes required to complete the R&D project is finite and learning about the probability of success is incorporated. We show that the optimal R&D subsidy is higher when learning is taken into account than when it is not. We also find that an R&D subsidy performs better than a Pigouvian tax unless suppliers have sufficient incentives to continue cost-reduction efforts after the new technology successfully replaces the old one. Moreover, using a two-project model, we show that a uniform subsidy is better than a selective subsidy.
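The abstract does not state the learning mechanism; a minimal sketch of "learning about the probability of success", under the common Bayesian assumption (not confirmed by the abstract) of a Beta prior \( p \sim \mathrm{Beta}(\alpha, \beta) \), is

\[ \hat{p}_n = \mathbb{E}\left[ p \mid k \text{ successes in } n \text{ trials} \right] = \frac{\alpha + k}{\alpha + \beta + n}. \]

Each failed period lowers \( \hat{p}_n \) and hence the expected value of continuing the project, which is what makes the problem one of optimal stopping: the firm abandons R&D once the continuation value, computed from the current \( \hat{p}_n \) and the number of successes still required, falls below the value of stopping. A subsidy raises the continuation value, and in this setting the paper finds the optimal subsidy is higher once the downward drift of beliefs is accounted for.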

Relevance:

20.00%

Publisher:

Abstract:

The first quarter of the 20th century witnessed a rebirth of cosmology, the study of our Universe, as a field of scientific research with testable theoretical predictions. The amount of available cosmological data grew slowly from a few galaxy redshift measurements, rotation curves and local light element abundances to the first detection of the cosmic microwave background (CMB) in 1965. By the turn of the century the amount of data had exploded, incorporating new, exciting cosmological observables such as lensing, Lyman alpha forests, type Ia supernovae, baryon acoustic oscillations and Sunyaev-Zeldovich regions, to name a few.

The CMB, the ubiquitous afterglow of the Big Bang, carries with it a wealth of cosmological information. Unfortunately, that information, encoded in delicate intensity variations, turned out to be hard to extract from the overall temperature. After the first detection, it took nearly 30 years before the first evidence of fluctuations in the microwave background was presented. At present, high-precision cosmology is solidly based on precise measurements of the CMB anisotropy, making it possible to pinpoint cosmological parameters to one-in-a-hundred precision. This progress has made it possible to build and test models of the Universe that differ in the way the cosmos evolved during some fraction of the first second after the Big Bang.

This thesis is concerned with high-precision CMB observations. It presents three selected topics along a CMB experiment analysis pipeline. Map-making and residual noise estimation are studied using an approach called destriping. The approximate methods studied are invaluable for the large datasets of any modern CMB experiment and will undoubtedly become even more so when the next generation of experiments reaches the operational stage.

We begin with a brief overview of cosmological observations and describe the general relativistic perturbation theory. Next we discuss the map-making problem of a CMB experiment and the characterization of the residual noise present in the maps. Finally, the use of modern cosmological data is presented in the study of an extended cosmological model with correlated isocurvature fluctuations. Currently available data are shown to indicate that future experiments are certainly needed to provide more information on these extra degrees of freedom. Any solid evidence of isocurvature modes would have a considerable impact due to their power in model selection.
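Destriping itself is not spelled out in the abstract. Below is a minimal, self-contained toy sketch of the idea (all names, sizes and noise levels are illustrative, not the thesis's): the time-ordered data are modelled as d = P m + F a + n, where P points each sample at a map pixel, F assigns each sample a constant baseline per scan chunk, and n is white noise; the baselines a are solved by least squares after projecting out whatever a binned map could explain, and the cleaned samples are binned into the map.

```python
import numpy as np

rng = np.random.default_rng(0)
npix, nsamp, nchunk = 32, 4096, 16

pix = rng.integers(0, npix, size=nsamp)                 # pointing P: sample -> pixel
chunk = np.repeat(np.arange(nchunk), nsamp // nchunk)   # F: sample -> baseline chunk
sky = rng.normal(size=npix)                             # toy input sky
baselines = rng.normal(scale=5.0, size=nchunk)          # 1/f-like slow offsets
tod = sky[pix] + baselines[chunk] + 0.1 * rng.normal(size=nsamp)

def bin_map(data):
    """Simple binned map: average of all samples hitting each pixel."""
    hits = np.maximum(np.bincount(pix, minlength=npix), 1)
    return np.bincount(pix, weights=data, minlength=npix) / hits

def apply_Z(data):
    """Z = I - P (P^T P)^{-1} P^T: remove the part a binned map explains."""
    return data - bin_map(data)[pix]

# Normal equations for the baselines: (F^T Z F) a = F^T Z d.
F = np.zeros((nsamp, nchunk))
F[np.arange(nsamp), chunk] = 1.0
ZF = np.column_stack([apply_Z(F[:, j]) for j in range(nchunk)])
A, b = F.T @ ZF, F.T @ apply_Z(tod)

# The system is singular (a constant can move freely between map and
# baselines), so take the minimum-norm least-squares solution.
a_hat = np.linalg.lstsq(A, b, rcond=None)[0]

destriped = bin_map(tod - a_hat[chunk])   # recovers the sky up to a constant
offset = np.mean(destriped - sky)
print("residual rms after destriping:", np.std(destriped - sky - offset))
```

Real pipelines solve the same normal equations iteratively (e.g. conjugate gradients) because F and P are far too large to form explicitly; the residual noise estimation studied in the thesis concerns what is left in the map after this step.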

Relevance:

20.00%

Publisher:

Abstract:

The purpose of this study is to analyse the education, employment, and work-life experiences of visually impaired persons in expert jobs. The empirical data consist of 30 thematic interviews (24 visually impaired persons, 1 family member of a visually impaired person, and 5 persons working with diversity issues), of supplementary articles, and of statistics on the socio-economic status of the visually impaired. The interviewees' experiences of education and employment have been analysed using a qualitative method, and the analysis has been deepened by reflecting it against the recent discussion on the concept of diversity. The author's methodological choice as a disability researcher has been to treat the interviewees as co-researchers rather than objects of research. Accessibility in its different forms is a prerequisite of diversity in the workplace, and this study examines what kind of accessibility is required by visually impaired professionals. Access to working life depends on the attitudes, prejudices and expectations that society has towards a minority group. Social accessibility is connected with internal relationships in the workplace, and achieving social accessibility is a bilateral process. Information technology has revolutionised visually impaired people's possibilities of accessing information and performing expert tasks. An accessible environment, good mobility skills, and transportation services enable visually impaired employees to get to their workplaces and to navigate there with ease. Integration has raised the level of education and widened the selection of career options for the visually impaired. However, even visually impaired people with academic degrees often need employment support services. Visually impaired professionals are mainly employed in the public and third sectors. Achieving diversity in the labour market is a multi-actor process: social support services are needed, as well as courage and readiness from employers to hire people with disabilities. The organisations of the visually impaired play an important role in shaping attitudes and providing peer support. Visually impaired employees need good professional skills, blindness skills, and social courage, and they need to be comfortable with their disability. In the workplace, diversity may be actualised as diverse ways of working: the work is done using technical aids or other means of compensating for the lack of eyesight. When an employee must find compensatory solutions for disability-related limitations at work, this also develops his or her problem-solving abilities. Key words: visually impaired, diversity, accessibility, working life

Relevance:

20.00%

Publisher:

Abstract:

Over the past two decades, the selection, optimization, and compensation (SOC) model has been applied in the work context to investigate antecedents and outcomes of employees' use of action regulation strategies. We systematically review, meta-analyze, and critically discuss the literature on SOC strategy use at work and outline directions for future research and practice. The systematic review illustrates the breadth of constructs that have been studied in relation to SOC strategy use, and that SOC strategy use can mediate and moderate relationships of person and contextual antecedents with work outcomes. Results of the meta-analysis show that SOC strategy use is positively related to age (rc = .04), job autonomy (rc = .17), self-reported job performance (rc = .23), non-self-reported job performance (rc = .21), job satisfaction (rc = .25), and job engagement (rc = .38), whereas SOC strategy use is not significantly related to job tenure, job demands, and job strain. Overall, our findings underline the importance of the SOC model for the work context, and they also suggest that its measurement and reporting standards need to be improved to become a reliable guide for future research and organizational practice.

Relevance:

20.00%

Publisher:

Abstract:

The most difficult operation in flood inundation mapping using optical flood images is to separate fully inundated areas from ‘wet’ areas where trees and houses are partly covered by water. This can be described as a typical instance of the mixed-pixel problem. A number of automatic information-extraction image classification algorithms have been developed over the years for flood mapping using optical remote sensing images. Most classification algorithms assign each pixel to the single class label with the greatest likelihood. However, these hard classification methods often fail to generate reliable flood inundation maps because of the presence of mixed pixels in the images. To solve the mixed-pixel problem, advanced image processing techniques are adopted, and linear spectral unmixing is one of the most popular soft classification techniques used for mixed-pixel analysis. The performance of linear spectral unmixing depends on two important issues: the method of selecting endmembers and the method of modelling the endmembers for unmixing. This paper presents an improved adaptive selection of the endmember subset for each pixel in spectral unmixing for reliable flood mapping. Using a fixed set of endmembers to unmix all pixels in an entire image may overestimate the endmember spectra residing in a mixed pixel and hence reduce the performance of spectral unmixing. In comparison, applying an adaptively estimated subset of endmembers for each pixel can decrease the residual error in the unmixing results and provide reliable output. It is also shown that the proposed method improves the accuracy of conventional linear unmixing methods and is easy to apply. Three different linear spectral unmixing methods were applied to test the improvement in unmixing results. Experiments were conducted on three different sets of Landsat-5 TM images of three different flood events in Australia to examine the method under different flooding conditions, and satisfactory flood mapping outcomes were achieved.
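The abstract gives neither the unmixing equations nor the adaptive selection rule. The sketch below shows a standard fully constrained linear unmixing step (non-negative fractions summing to one, enforced via a weighted augmented system) plus one simple, hypothetical per-pixel subset heuristic; it illustrates the general idea only and is not the paper's method.

```python
import numpy as np
from itertools import combinations
from scipy.optimize import nnls

def unmix(pixel, E, weight=1e3):
    """FCLS-style unmixing: min ||E f - pixel|| s.t. f >= 0, sum(f) = 1.
    The sum-to-one constraint is enforced softly by appending a heavily
    weighted row of ones. E has shape (n_bands, n_endmembers)."""
    A = np.vstack([E, weight * np.ones(E.shape[1])])
    b = np.append(pixel, weight)
    f, _ = nnls(A, b)
    return f

def adaptive_unmix(pixel, E, tol=1e-3):
    """Hypothetical per-pixel endmember selection: return the smallest
    subset whose reconstruction error matches the full set within tol.
    (The paper's actual selection rule is not given in the abstract.)"""
    n = E.shape[1]
    full_err = np.linalg.norm(E @ unmix(pixel, E) - pixel)
    for k in range(1, n + 1):
        for idx in combinations(range(n), k):
            sub = E[:, list(idx)]
            f = unmix(pixel, sub)
            if np.linalg.norm(sub @ f - pixel) <= full_err + tol:
                return idx, f
    return tuple(range(n)), unmix(pixel, E)

# Toy usage: 6 bands, 4 endmembers; the pixel is 70% class 0, 30% class 1,
# so the heuristic should pick the subset (0, 1) rather than all four.
rng = np.random.default_rng(1)
E = rng.uniform(0, 1, size=(6, 4))
pixel = 0.7 * E[:, 0] + 0.3 * E[:, 1]
print(adaptive_unmix(pixel, E))
```

Restricting each pixel to a small subset is what prevents the over-estimation the abstract describes: with all endmembers available, nnls will happily spread small spurious fractions across classes that are not actually present in the pixel.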

Relevance:

20.00%

Publisher:

Abstract:

This paper describes a concept for a collision avoidance system for ships based on model predictive control. A finite set of alternative control behaviors is generated by varying two parameters: offsets to the guidance course angle commanded to the autopilot and changes to the propulsion command, ranging from nominal speed to full reverse. Using simulated predictions of the trajectories of the obstacles and the ship, compliance with the Convention on the International Regulations for Preventing Collisions at Sea (COLREGs) and the collision hazard associated with each alternative control behavior are evaluated on a finite prediction horizon, and the optimal control behavior is selected. Robustness to sensing error, predicted obstacle behavior, and environmental conditions can be ensured by evaluating multiple scenarios for each control behavior. The method is conceptually and computationally simple yet quite versatile, as it can account for the dynamics of the ship, the dynamics of the steering and propulsion system, forces due to wind and ocean current, and any number of obstacles. Simulations show that the method is effective and can manage complex scenarios with multiple dynamic obstacles and uncertainty associated with sensors and predictions.
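A minimal sketch of the behaviour-enumeration idea follows; all names, grids and weights are illustrative, not the paper's. A small grid of course offsets and propulsion settings is simulated forward, each candidate is scored for collision hazard plus a mild deviation penalty, and the cheapest candidate is selected. A real implementation would replace the straight-line prediction with the ship's steering and propulsion dynamics and would veto COLREGs-non-compliant candidates.

```python
import itertools
import numpy as np

COURSE_OFFSETS = np.deg2rad([-90, -60, -30, -15, 0, 15, 30, 60, 90])
PROPULSION = [1.0, 0.5, 0.0, -1.0]      # nominal, half, stop, full reverse

def predict(pos, heading, speed, offset, thrust, steps, dt=1.0):
    """Straight-line kinematic prediction (stand-in for the ship model)."""
    vel = thrust * speed * np.array([np.cos(heading + offset),
                                     np.sin(heading + offset)])
    return pos + vel * dt * np.arange(1, steps + 1)[:, None]

def hazard(traj, obstacle_trajs, d_safe=200.0):
    """Quadratic penalty for coming closer than d_safe to any obstacle."""
    c = 0.0
    for obs in obstacle_trajs:           # obs: (steps, 2) predicted track
        d = np.linalg.norm(traj - obs, axis=1)
        c += np.sum(np.maximum(0.0, d_safe - d) ** 2)
    return c

def select_behavior(pos, heading, speed, obstacle_trajs, steps=120):
    """Evaluate every (offset, thrust) pair and return the cheapest one.
    COLREGs compliance checks would additionally veto candidates here."""
    best, best_cost = None, np.inf
    for off, thr in itertools.product(COURSE_OFFSETS, PROPULSION):
        traj = predict(pos, heading, speed, off, thr, steps)
        cost = hazard(traj, obstacle_trajs)
        cost += 50.0 * abs(off) + 10.0 * (1.0 - thr)  # prefer nominal sailing
        if cost < best_cost:
            best, best_cost = (off, thr), cost
    return best

# Toy usage: own ship heading east at 5 m/s, one obstacle approaching head-on.
obs = predict(np.array([600.0, 0.0]), np.pi, 5.0, 0.0, 1.0, 120)
print(select_behavior(np.array([0.0, 0.0]), 0.0, 5.0, [obs]))
```

Because only a small finite set of behaviours is scored, the scheme stays cheap enough to re-run at every control step, which is what makes the paper's scenario-based robustness evaluation (multiple predicted obstacle tracks per candidate) affordable.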

Relevance:

20.00%

Publisher:

Abstract:

The most difficult operation in flood inundation mapping using optical flood images is to map the ‘wet’ areas where trees and houses are partly covered by water. This can be referred to as a typical instance of the mixed-pixel problem. A number of automatic information-extraction image classification algorithms have been developed over the years for flood mapping using optical remote sensing images, most of which assign each pixel a single class label. However, they often fail to generate reliable flood inundation maps because of the presence of mixed pixels in the images. To solve this problem, spectral unmixing methods have been developed. In this thesis, the two most important issues in spectral unmixing are investigated: the selection of endmembers and the modelling of the primary classes for unmixing. We conduct comparative studies of three typical spectral unmixing algorithms: Partial Constrained Linear Spectral Unmixing, Multiple Endmember Selection Mixture Analysis, and spectral unmixing using the Extended Support Vector Machine method. They are analysed and assessed through error analysis in flood mapping using MODIS, Landsat and WorldView-2 images. The conventional root mean square error assessment is applied to obtain errors for the estimated fractions of each primary class. Moreover, a newly developed Fuzzy Error Matrix is used to obtain a clear picture of error distributions at the pixel level. This thesis shows that the Extended Support Vector Machine method is able to provide a more reliable estimation of fractional abundances and allows the use of a complete set of training samples to model a defined pure class. Furthermore, it can be applied to the analysis of both pure and mixed pixels to provide integrated hard-soft classification results. Our research also identifies and explores a serious drawback of endmember selection in current spectral unmixing methods, which apply a fixed set of endmember classes or pure classes to the mixture analysis of every pixel in an entire image. Since it is not accurate to assume that every pixel in an image contains all endmember classes, these methods usually cause an over-estimation of the fractional abundances in a particular pixel. In this thesis, an adaptive subset of endmembers is derived for every pixel using the proposed methods to form an endmember index matrix. The experimental results show that using pixel-dependent endmembers in unmixing significantly improves performance.
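The abstract refers to conventional root mean square error assessment of the estimated fractions; the usual per-class form of this measure, given here for reference, is

\[ \mathrm{RMSE}_c = \sqrt{ \frac{1}{N} \sum_{i=1}^{N} \left( \hat{f}_{i,c} - f_{i,c} \right)^2 }, \]

where \( \hat{f}_{i,c} \) and \( f_{i,c} \) are the estimated and reference fractions of class \( c \) in pixel \( i \), and \( N \) is the number of assessed pixels. Being a per-class aggregate, it says nothing about where the errors occur, which is the gap the thesis's pixel-level Fuzzy Error Matrix is designed to fill.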

Relevance:

20.00%

Publisher:

Abstract:

This dissertation considers the problem of trust in the context of food consumption. The research perspectives refer to institutional conditions for consumer trust, personal practices of food consumption, and the strategies consumers employ for controlling the safety of their food. The main concern of the study is to investigate consumer trust as an adequate response to food risks, i.e. a strategy helping the consumer to make safe choices in an uncertain food situation. This "risky" perspective serves as a frame of reference for understanding and explaining trust relations. The original aim of the study was to reveal the meanings applied to the concepts of trust, safety and risk from the perspective of market choices, the assessment of food risks and the ways of handling them. Supplementary research tasks included descriptions of the institutional conditions for consumer trust, including descriptions of the food market, and the presentation of food consumption patterns in St. Petersburg. The main empirical material is based on qualitative interviews with consumers and on interviews and group discussions with professional experts (market actors and representatives of inspection bodies and consumer organizations). Secondary material is used for describing the institutional conditions for consumer trust and the market situation. The results suggest that the idea of consumer trust is associated with the reputation of suppliers, the stable quality and taste of their products, and reliable food information. Being a subjectively constructed state connected to the act of acceptance, consumer trust results in positive buying decisions and stable preferences in the food market. The consumers' strategies aimed at safe food choices rely on repeated interactions with reliable market actors, which free them from constant deliberation in the marketplace. Trust in food is highly mediated by trust in the institutions involved in the food system. The analysis reveals a clear pattern of disbelief in the efficiency of institutional food control. The study interprets this as a reflection of the "total distrust" that appears to be a dominant mood in many contexts of modern Russia. However, the interviewees emphasize the state's decisive role in suppressing risks in the food market. The findings are also discussed with reference to consumers' possibilities of personal control over food risks. Three main responses to a risky food situation are identified: the reflexive approach, the traditional approach, and the fatalistic approach.

Relevance:

20.00%

Publisher:

Abstract:

Guo and Nixon proposed a feature selection method based on maximizing I(x; Y), the multidimensional mutual information between the feature vector x and the class variable Y. Because computing I(x; Y) can be difficult in practice, Guo and Nixon proposed an approximation of I(x; Y) as the criterion for feature selection. We show that Guo and Nixon's criterion originates from approximating the joint probability distributions in I(x; Y) by second-order product distributions. We remark on the limitations of this approximation and discuss computationally attractive alternatives for computing I(x; Y).
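For reference, the multidimensional mutual information being maximized is the standard quantity

\[ I(\mathbf{x}; Y) = \sum_{\mathbf{x}} \sum_{y} p(\mathbf{x}, y) \log \frac{p(\mathbf{x}, y)}{p(\mathbf{x})\, p(y)}, \]

whose direct computation requires the joint distribution \( p(\mathbf{x}, y) \) over the full feature vector, a quantity that grows exponentially with the number of features. That joint is precisely the part Guo and Nixon approximate by second-order (pairwise) product distributions, which is why the criterion is tractable but only as accurate as the pairwise factorization allows.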

Relevance:

20.00%

Publisher:

Abstract:

It is well known that the numerical accuracy of a series solution to a boundary-value problem by the direct method depends on the technique of approximate satisfaction of the boundary conditions and on the stage of truncation of the series. On the other hand, it does not appear to be generally recognized that, when the boundary conditions can be described in alternative equivalent forms, the convergence of the solution is significantly affected by the actual form in which they are stated. The importance of the last aspect is studied for three different techniques of computing the deflections of simply supported regular polygonal plates under uniform pressure. It is also shown that it is sometimes possible to modify the technique of analysis to make the accuracy independent of the description of the boundary conditions.
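A concrete example of such alternative equivalent forms, standard in plate theory though not taken from the paper itself: on a straight simply supported edge of a plate with deflection w, the boundary conditions are zero deflection and zero bending moment,

\[ w = 0, \qquad M_n = -D \left( \frac{\partial^2 w}{\partial n^2} + \nu \, \frac{\partial^2 w}{\partial t^2} \right) = 0, \]

where n and t denote the normal and tangential directions, D the flexural rigidity and \( \nu \) Poisson's ratio. Since w = 0 along the entire edge forces \( \partial^2 w / \partial t^2 = 0 \) there, the moment condition can equivalently be stated as \( \partial^2 w / \partial n^2 = 0 \) or as \( \nabla^2 w = 0 \). Exactly satisfied, the three forms are interchangeable; satisfied only approximately by a truncated series, they are not, which is how the stated form of the boundary conditions comes to affect the convergence of the solution.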