Abstract:
This Ph.D. dissertation studies the work motivation of employees involved in the delivery of public services. The question of work motivation in public services is not new, but it has become central for governments now facing unprecedented public debt. The objective of this research is twofold. First, we examine whether the work motivation of public service employees is a continuum (intrinsic and extrinsic motivations cannot coexist) or a bi-dimensional construct (intrinsic and extrinsic motivations coexist simultaneously). The public administration literature has focused on the concept of public service motivation and has treated motivation as uni-dimensional (Perry and Hondeghem 2008); however, no study has yet addressed both types of motivation, intrinsic and extrinsic, at the same time. In Part I, this dissertation proposes a theoretical assessment and an empirical test of a global work motivational structure, using a self-constructed Swiss dataset of employees from three public services: the education sector, the security sector, and public administrative services. Our findings suggest that work motivation in public services is not uni-dimensional but bi-dimensional: intrinsic and extrinsic motivations coexist simultaneously and can be positively correlated (Amabile et al. 1994). Intrinsic motivation proves as important as extrinsic motivation; thus, the assumption that public service employees are less attracted by extrinsic rewards is not confirmed for this sample. Another important finding concerns the public service motivation concept, which, as theoretically predicted, represents the major motivational dimension of employees delivering public services. Second, the theory of public service motivation assumes that public service employees engage in activities that go beyond their self-interest, yet it never uses this construct as a determinant of their pro-social behavior. At the same time, several studies (Gregg et al. 2011; Georgellis et al. 2011) provide evidence of the pro-social behavior of public service employees, but they do not identify which type of motivation underlies this behavior; they only assume it is intrinsically motivated. We analyze the pro-social behavior of public service employees and use public service motivation as a determinant of that behavior. We add other determinants highlighted by the theory of pro-social behavior (Bénabou and Tirole 2006), by Le Grand (2003), and by fit theories (Besley and Ghatak 2005). We test these determinants in Part II and identify, for each sector of activity, their positive or negative impact on the pro-social behavior of Swiss employees. Contrary to expectations, we find for this sample that both intrinsic and extrinsic factors have a positive impact on pro-social behavior; no crowding-out effect is identified. We confirm Le Grand's (2003) hypothesis about the positive impact of opportunity cost on pro-social behavior. Our results suggest a mix of action-oriented and output-oriented altruism among public service employees. These results are relevant for designing incentive schemes for employees in the delivery of public services.
Abstract:
Knowledge of the soil water retention curve (SWRC) is essential for understanding and modeling hydraulic processes in the soil. However, direct determination of the SWRC is time consuming and costly. In addition, it requires a large number of samples, due to the high spatial and temporal variability of soil hydraulic properties. An alternative is the use of models, called pedotransfer functions (PTFs), which estimate the SWRC from easy-to-measure properties. The aim of this paper was to test the accuracy of 16 point or parametric PTFs reported in the literature on different soils from the south and southeast of the State of Pará, Brazil. The PTFs tested were proposed by Pidgeon (1972), Lal (1979), Aina & Periaswamy (1985), Arruda et al. (1987), Dijkerman (1988), Vereecken et al. (1989), Batjes (1996), van den Berg et al. (1997), Tomasella et al. (2000), Hodnett & Tomasella (2002), Oliveira et al. (2002), and Barros (2010). We used a database that includes soil texture (sand, silt, and clay), bulk density, soil organic carbon, soil pH, cation exchange capacity, and the SWRC. Most of the PTFs tested did not show good performance in estimating the SWRC. The parametric PTFs, however, performed better than the point PTFs in assessing the SWRC in the tested region. Among the parametric PTFs, those proposed by Tomasella et al. (2000) achieved the best accuracy in estimating the empirical parameters of the van Genuchten (1980) model, especially when tested in the top soil layer.
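For reference, the empirical parameters estimated by the parametric PTFs are those of the van Genuchten (1980) retention model, shown below in its usual form (a standard statement of the model, not text reproduced from the abstract):

```latex
\theta(h) \;=\; \theta_r + \frac{\theta_s - \theta_r}{\left[1 + (\alpha\,|h|)^{n}\right]^{m}},
\qquad m = 1 - \frac{1}{n}
```

where θ(h) is the volumetric water content at matric potential h, θr and θs are the residual and saturated water contents, and α and n are empirical shape parameters.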
Abstract:
Invasive fungal infections are frequent and severe complications in leukaemic patients with prolonged neutropaenia. Empirical antifungal therapy has become the standard of care in patients with persistent fever despite treatment with broad-spectrum antibiotics. For decades, amphotericin B deoxycholate was the sole option for empirical antifungal therapy; recently, several new antifungal agents have become available. The choice of the most appropriate drug should be guided by efficacy and safety criteria. The recommendations from the First European Conference on Infections in Leukaemia (ECIL-1) on empirical antifungal therapy in neutropaenic cancer patients with persistent fever were developed by an expert panel after an assessment of clinical practices in Europe and an evidence-based review of the literature. Many antifungal regimens can now be recommended for empirical therapy in neutropaenic cancer patients. However, persistent fever lacks specificity as a trigger for initiating therapy. The development of empirical and pre-emptive strategies using new clinical parameters, laboratory markers, and imaging techniques for early diagnosis of invasive mycoses is needed.
Abstract:
Debris flows are among the most dangerous processes in mountainous areas due to their rapid rate of movement and long runout zone. Sudden and rather unexpected impacts not only damage buildings and infrastructure but also threaten human lives. Medium- to regional-scale susceptibility analyses allow the identification of the most endangered areas and suggest where further detailed studies have to be carried out. Since data availability for larger regions is usually the key limiting factor, empirical models with low data requirements are suitable for first overviews. In this study a susceptibility analysis was carried out for the Barcelonnette Basin, situated in the southern French Alps. By means of a methodology based on empirical rules for source identification and on the empirical angle-of-reach concept for the 2-D runout computation, a worst-case scenario was first modelled. In a second step, scenarios for high-, medium- and low-frequency events were developed. A comparison with the footprints of a few mapped events indicates reasonable results but suggests a high dependency on the quality of the digital elevation model. This fact emphasises the need for careful interpretation of the results, while remaining conscious of the inherent assumptions of the model used and of the quality of the input data.
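As a minimal sketch of the empirical angle-of-reach concept used here for runout estimation (the function name and the example numbers are illustrative assumptions, not values from the study):

```python
import math

def max_runout_length(drop_height_m: float, reach_angle_deg: float) -> float:
    """Horizontal travel distance implied by the empirical angle of reach.

    The angle of reach relates the elevation drop H between the source and the
    distal end of the deposit to the horizontal runout L via tan(angle) = H / L,
    so L = H / tan(angle).
    """
    return drop_height_m / math.tan(math.radians(reach_angle_deg))

# Example: a source 400 m above the valley floor with an assumed 11 degree reach angle
print(round(max_runout_length(400.0, 11.0)))  # ~2058 m of horizontal runout
```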
Abstract:
The prediction of rockfall travel distance below a rock cliff is an indispensable activity in rockfall susceptibility, hazard and risk assessment. Although the size of the detached rock mass may differ considerably at each specific rock cliff, small rockfalls (<100 m³) are the most frequent process. Empirical models may provide suitable information for predicting the travel distance of small rockfalls over an extensive area at a medium scale (1:100 000–1:25 000). "Solà d'Andorra la Vella" is a rocky slope located close to the town of Andorra la Vella, where the government has been documenting rockfalls since 1999. This documentation consists of mapping the release point and the individual fallen blocks immediately after each event. The documentation of historical rockfalls by morphological analysis, eyewitness accounts and historical images serves to increase the available information. In total, data from twenty small rockfalls have been gathered, amounting to about one hundred individual fallen rock blocks. The data acquired have been used to check the reliability of the main empirical models widely adopted (the reach and shadow angle models) and to analyse the influence of the parameters affecting the travel distance (rockfall size, height of fall along the rock cliff and volume of the individual fallen rock block). For predicting travel distances on medium-scale maps, a method based on the "reach probability" concept has been proposed. The accuracy of the results has been tested against the line enveloping the farthest fallen boulders, which represents the maximum travel distance of past rockfalls. The paper concludes with a discussion of the application of both empirical models to other study areas.
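One way to read the "reach probability" concept is as an empirical exceedance frequency of the documented block travel distances; the sketch below follows that reading (the function, variable names and example distances are hypothetical, not the study's data):

```python
import numpy as np

def reach_probability(observed_distances_m, query_distances_m):
    """Fraction of documented fallen blocks that travelled at least each query distance.

    observed_distances_m : travel distances of the individually mapped blocks
    query_distances_m    : distances from the cliff at which the probability is evaluated
    """
    obs = np.asarray(observed_distances_m, dtype=float)
    return np.array([(obs >= d).mean() for d in query_distances_m])

# Example with made-up block runouts (metres) evaluated at three distances
blocks = [12, 18, 25, 31, 40, 55, 62, 80, 95, 120]
print(reach_probability(blocks, [30, 60, 100]))  # -> [0.7 0.4 0.1]
```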
Abstract:
Five years after the 2005 Pakistan earthquake that triggered multiple mass movements, landslides continue to pose a threat to the population of Azad Kashmir, especially during heavy monsoon rains. The thousands of landslides triggered by the magnitude-7.6 earthquake in 2005 were not just due to a natural phenomenon but were largely induced by human activities, namely road building, grazing, and deforestation. The damage caused by the landslides in the study area (381 km²) is estimated at 3.6 times Azad Kashmir's annual public works budget for 2005 of US$ 1 million. In addition to human suffering, this cost constitutes a significant economic setback to the region that could have been reduced through improved land use and risk management. This article describes interdisciplinary research conducted 18 months after the earthquake to provide a more systemic approach to understanding the risks posed by landslides, including the physical, environmental, and human contexts. The goal of this research is twofold: first, to present empirical data on the social, geological, and environmental contexts in which widespread landslides occurred following the 2005 earthquake; and second, to describe straightforward methods that can be used for integrated landslide risk assessments in data-poor environments. The article analyzes the limitations of the methodologies and the challenges of conducting interdisciplinary research that integrates both social and physical data. This research concludes that reducing landslide risk is ultimately a management issue, rooted in land use decisions and governance.
Abstract:
Asphalt pavements suffer various failures due to insufficient quality within their design lives. The American Association of State Highway and Transportation Officials (AASHTO) Mechanistic-Empirical Pavement Design Guide (MEPDG) has been proposed to improve pavement quality through quantitative performance prediction. Evaluation of the actual performance (quality) of pavements requires in situ nondestructive testing (NDT) techniques that can accurately measure the most critical, objective, and sensitive properties of pavement systems. The purpose of this study is to assess existing as well as promising new NDT technologies for quality control/quality assurance (QC/QA) of asphalt mixtures. Specifically, this study examined field measurements of density via the PaveTracker electromagnetic gage, shear-wave velocity via surface-wave testing methods, and dynamic stiffness via the Humboldt GeoGauge for five representative paving projects covering a range of mixes and traffic loads. The in situ tests were compared against laboratory measurements of core density and dynamic modulus. The in situ PaveTracker density had a low correlation with laboratory density and was not sensitive to variations in temperature or asphalt mix type. The in situ shear-wave velocity measured by surface-wave methods was most sensitive to variations in temperature and asphalt mix type. The in situ density and in situ shear-wave velocity were combined to calculate an in situ dynamic modulus, which is a performance-based quality measurement. The in situ GeoGauge stiffness measured on hot asphalt mixtures several hours after paving had a high correlation with the in situ dynamic modulus and the laboratory density, whereas the stiffness measurement of asphalt mixtures cooled with dry ice or at ambient temperature one or more days after paving had a very low correlation with the other measurements. To transform the in situ moduli from surface-wave testing into quantitative quality measurements, a QC/QA procedure was developed to first correct the in situ moduli measured at different field temperatures to the moduli at a common reference temperature based on master curves from laboratory dynamic modulus tests. The corrected in situ moduli can then be compared against the design moduli for an assessment of the actual pavement performance. A preliminary study of microelectromechanical systems (MEMS)-based sensors for QC/QA and health monitoring of asphalt pavements was also performed.
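The step that combines in situ density and shear-wave velocity into a dynamic modulus can be sketched with the standard isotropic elastodynamic relations; this is a simplified illustration only (the Poisson's ratio and example values are assumptions, and the study's master-curve temperature correction is not shown):

```python
def dynamic_modulus_pa(density_kg_m3: float, shear_wave_velocity_m_s: float,
                       poissons_ratio: float = 0.35) -> float:
    """Small-strain Young's modulus from in situ density and shear-wave velocity.

    Shear modulus:   G = rho * Vs**2
    Young's modulus: E = 2 * G * (1 + nu)   (isotropic linear elasticity)
    The Poisson's ratio default is an assumed value, not one reported in the study.
    """
    shear_modulus = density_kg_m3 * shear_wave_velocity_m_s ** 2
    return 2.0 * shear_modulus * (1.0 + poissons_ratio)

# Example: a 2300 kg/m^3 mix with Vs = 1500 m/s gives E of roughly 14 GPa
print(dynamic_modulus_pa(2300.0, 1500.0) / 1e9)
```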
Abstract:
Efforts are being made by clinicians and researchers to accurately delineate phenotypic traits of individuals at enhanced risk of schizophrenia. This issue is important for a better understanding of the etiopathogenic mechanisms of the disease and for building programs of primary prevention. We suggest that disturbances of subjective experience, although difficult to operationalize, are an important, but until now neglected, core component of schizophrenia spectrum disorders. We advocate the development of valid and reliable instruments to allow the assessment of basic symptoms and disturbances of Self-experience. Delineation of vulnerability to schizophrenia cannot rely solely on neuropsychological and neurophysiological data, as prevention programs will be performed mainly by clinicians.
Abstract:
1. Costs of reproduction lie at the core of basic ecological and evolutionary theories, and their existence is commonly invoked to explain adaptive processes. Despite their sheer importance, empirical evidence for the existence and quantification of costs of reproduction in tree species comes mostly from correlational studies, while more comprehensive approaches remain missing. Manipulative experiments are a preferred approach to studying costs of reproduction, as they allow controlling for otherwise inherent confounding factors such as size or genetic background. 2. Here, we conducted a manipulative experiment in a Pinus halepensis common garden, removing developing cones from a group of trees and comparing growth and reproduction after treatment with a control group. We also estimated phenotypic and genetic correlations between reproductive and vegetative traits. 3. Manipulated trees grew slightly more than control trees just after treatment, with only a transient, marginally non-significant difference. By contrast, larger differences were observed for the number of female cones initiated 1 year after treatment, with 70% more cones in the manipulated group. Phenotypic and genetic correlations showed that smaller trees invested a higher proportion of their resources in reproduction than larger trees, which could be interpreted as indirect evidence for costs of reproduction. 4. Synthesis. This research showed a high impact of current reproduction on reproductive potential, even when the effect on vegetative growth was not significant. This has strong implications for how we understand adaptive strategies in forest trees and should encourage further interest in their still poorly known reproductive life-history traits.
Abstract:
Meeting design is one of the most critical prerequisites for the success of facilitated meetings, but how to achieve this success is not yet fully understood. This study presents a descriptive model of the design of technology-supported meetings, based on literature findings about the key factors contributing to the success of collaborative meetings and linking these factors to the meeting design steps by exploring how facilitators consider the factors in their design process in practice. The empirical part includes a multiple-case study conducted among 12 facilitators. The case concentrates on the GSS laboratory at LUT, which has been working on facilitation and GSS for the last fifteen years. The study also includes ‘control’ cases from two comparable institutions. The results of this study highlight both the variances and the commonalities among facilitators in how they design collaboration processes. The design thinking of facilitators at all levels of experience is found to be largely consistent, so the key design factors, as well as their roles across the design process, can be outlined. Session goals, group composition, supporting technology, motivational aspects, physical constraints, and correct design practices were found to be the key factors in design thinking. These factors are further categorized into three types (controllable, constraining, and guiding design factors), because the study findings indicate that the factor type affects a factor's importance in design. Furthermore, the order in which these factors are considered in the design process is outlined.
Abstract:
Fatal and permanently disabling accidents form only one per cent of all occupational accidents, but in many branches of industry they account for more than half of the accident costs. Furthermore, the human suffering of the victim and his family is greater in severe accidents than in slight ones. For both human and economic reasons, severe accident risks should be identified before injuries occur. It is for this purpose that different safety analysis methods have been developed. This study presents two new possible approaches to the problem. The first is the hypothesis that it is possible to estimate the potential severity of accidents independently of their actual severity. The second is the hypothesis that when workers are also asked to report near accidents, they are particularly prone to report potentially severe near accidents on the basis of their own subjective risk assessment. A field study was carried out in a steel factory, and the results supported both hypotheses. The reliability and validity of post-incident estimates of an accident's potential severity were reasonable. About 10 % of accidents were estimated to be potentially critical; they could have led to death or very severe permanent disability. Reported near accidents were significantly more severe: about 60 % of them were estimated to be critical. Furthermore, the validity of workers' subjective risk assessment, manifested in the near accident reports, proved to be reasonable. The new methods studied require further development and testing. They could be used both routinely in workplaces and in research for identifying and prioritizing accident risks.
Abstract:
Helicobacter pylori (H. pylori) is a gram-negative bacterium that represents a considerable global burden and is related to many gastrointestinal diseases (peptic ulcer, gastric MALT lymphoma and gastric cancer). Currently, standard triple therapy is used less because clarithromycin resistance is increasing. Patients therefore have to receive several lines of treatment, with the consequent adverse events and the possibility that treatment will be interrupted. The main objective is thus to determine whether performing a culture and antibiogram to guide targeted treatment causes fewer adverse events, with the same eradication rate, than empirical treatment to eradicate H. pylori. The secondary objective is to determine the prevalence of clarithromycin resistance in the province of Girona. This is a multicentre clinical trial without blinding; patients are selected by non-probabilistic sampling, with a total sample of 868 patients randomized into two equal groups of 434 patients each. The study will last 2 years. The endpoints will be the adverse events and the eradication rate in each group of patients; resistance to clarithromycin will also be evaluated.
Abstract:
The objective of this case study is to provide a Finnish solution provider company with an objective, in-depth analysis of its project-based business and especially of its project estimation accuracy. A project and customer profitability analysis is conducted as a complementary addition to describe the profitability of the Case Company’s core division. The theoretical framework is constructed on project profitability and customer profitability analysis. Project profitability is approached starting from managing projects, continuing to the project pricing process, and concluding with project success. The empirical part of this study describes the Case Company’s project portfolio and, by means of quantitative analysis, shows how the characteristics of a project affect its profitability. The findings indicate that installation methods and technical specifications, when scrutinized, make a real difference to the estimated and actual profitability of the project portfolio. The implications for profitability are gathered into a proposed risk assessment tool.
Abstract:
The skill of programming is a key asset for every computer science student. Many studies have shown that it is a hard skill to learn and that the outcomes of programming courses have often been substandard. Thus, a range of methods and tools have been developed to assist students’ learning processes. One of the biggest fields in computer science education is the use of visualizations as a learning aid, and many visualization-based tools have been developed to aid the learning process during the last few decades. The studies conducted in this thesis focus on two different visualization-based tools, TRAKLA2 and ViLLE. This thesis includes results from multiple empirical studies about the effects that the introduction and usage of these tools have on students’ opinions and performance, and about the implications from a teacher’s point of view. The results from the studies in this thesis show that students preferred to do web-based exercises and felt that those exercises contributed to their learning. The usage of the tool motivated students to work harder during their course, which was reflected in overall course performance and drop-out statistics. We have also shown that visualization-based tools can be used to enhance the learning process, and one of the key factors is a higher and more active level of engagement (see the Engagement Taxonomy by Naps et al., 2002). Automatic grading accompanied by immediate feedback helps students to overcome obstacles during the learning process and to grasp the key element of the learning task. These kinds of tools can help us cope with the fact that many programming courses are overcrowded and have limited teaching resources. They allow us to tackle this problem by utilizing automatic assessment in exercises that are most suitable to be done on the web (such as tracing and simulation), since this supports students’ independent learning regardless of time and place. In summary, we can use our courses’ resources more efficiently to increase the quality of the learning experience of the students and the teaching experience of the teacher, and even to increase the performance of the students. This thesis also yields methodological results that contribute to insight into how empirical evaluations of new tools or techniques should be conducted. When we evaluate a new tool, especially one accompanied by visualization, we need to give a proper introduction to it and to the graphical notation used by the tool. The standard procedure should also include capturing the screen with audio to confirm that the participants of the experiment are doing what they are supposed to do. By taking such measures when studying the learning impact of visualization support, we can avoid drawing false conclusions from our experiments. As computer science educators, we face two important challenges. Firstly, we need to start delivering the message, in our own institutions and all over the world, about new, scientifically proven innovations in teaching such as TRAKLA2 and ViLLE. Secondly, we have relevant experience of conducting teaching-related experiments, and thus we can support our colleagues in learning the essential know-how of research-based improvement of their teaching. This change can turn academic teaching into publications, and by utilizing this approach we can significantly increase the adoption of new tools and techniques and increase overall knowledge of best practices.
In the future, we need to join forces and tackle these universal and common problems together by creating multi-national and multi-institutional research projects. We need to create a community and a platform in which we can share these best practices and, at the same time, conduct multi-national research projects easily.
Abstract:
Credit risk assessment is an integral part of banking. Credit risk means that the return will not materialise if the customer fails to fulfil its obligations. A key component of banking is therefore setting acceptance criteria for granting loans. The theoretical part of the study focuses on the key components, identified in the literature, of banks’ credit assessment methods when extending credit to large corporations. The main component is the Basel II Accord, which sets regulatory requirements for banks’ credit risk assessment methods. The empirical part comprises, as its primary source, an analysis of major Nordic banks’ annual reports and risk management reports. As a secondary source, complementary interviews were carried out with senior credit risk assessment personnel. The findings indicate that all major Nordic banks use a combination of quantitative and qualitative information in their credit risk assessment models when extending credit to large corporations. The relative weight of qualitative information depends on the selected approach to the credit rating, i.e. point-in-time or through-the-cycle.