959 results for empirical testing
Abstract:
"There is nothing more difficult to plan, more doubtful of success, nor more dangerous to manage than the creation of a new system. For the initiator has the enmity of all who would profit by the preservation of the old system, and merely lukewarm defenders in those who should gain by the new one." N. Machiavelli (1513)
The purpose of this paper is twofold. First, we want to challenge the notion of "human capital" as "education, training and work experience" and suggest that it is the "quality of the workforce" that matters, here defined as the set of characteristics that allow workers to function in a specific institutional and historical context. Our main conclusion is that the quality of the workforce is affected by the institutional environment where the workers live, and that it can therefore vary across countries and institutional contexts. Second, we want to show the empirical relevance of this last point by testing the extent to which the quality of institutions (here proxied by the governance indicators of Kaufmann et al. (2007)) can affect the quality of the workforce (proxied by the percentage of the working-age population registered in a lifelong learning program). Our empirical analysis is conducted on a dataset of 11 European countries observed over the period 1996-2006. The results indicate that countries with better governance indicators are also endowed with a more qualified workforce. © 2011 American Journal of Economics and Sociology, Inc.
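The abstract does not detail the estimation strategy; the sketch below is only a generic illustration of the kind of country-year panel regression such a test could use. The file name, the column names (lifelong_learning, governance, country, year) and the fixed-effects specification are assumptions, not the authors' code.

    # Hypothetical pooled OLS with country and year fixed effects for an
    # 11-country panel (1996-2006); standard errors clustered by country.
    import pandas as pd
    import statsmodels.formula.api as smf

    df = pd.read_csv("panel.csv")  # assumed layout: one row per country-year
    model = smf.ols("lifelong_learning ~ governance + C(country) + C(year)", data=df)
    result = model.fit(cov_type="cluster", cov_kwds={"groups": df["country"]})
    print(result.summary())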
Abstract:
Our research focused on testing various characteristics of pairwise comparison (PC) matrices in controlled experiments. About 270 students were involved in the test exercises, and the final pool contained 450 matrices. Our team conducted experiments with matrices of different sizes obtained from different types of MADM problems. The matrix elements were also generated in different questioning orders. The cases were divided into 18 subgroups according to the key factors to be analyzed. The testing environment made it possible to analyze the dynamics of inconsistency as the number of elements increased in a given case. Various types of inconsistency indices were applied. The consistency of the decision maker's behavior was also analyzed in the case of incomplete matrices, using indicators to measure the deviation from the final ranking of alternatives and from the final score vector.
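Of the many inconsistency indices such experiments can apply, the oldest and most common is Saaty's consistency index; the short sketch below is a generic illustration of how it is computed for one matrix, not the study's own code.

    # Saaty's consistency index CI = (lambda_max - n) / (n - 1) for a positive
    # reciprocal pairwise comparison matrix; CI = 0 means full consistency.
    import numpy as np

    def consistency_index(A):
        n = A.shape[0]
        lambda_max = np.max(np.linalg.eigvals(A).real)  # principal eigenvalue
        return (lambda_max - n) / (n - 1)

    A = np.array([[1.0, 3.0, 5.0],
                  [1/3, 1.0, 2.0],
                  [1/5, 1/2, 1.0]])
    print(consistency_index(A))  # small positive value: mildly inconsistent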
Abstract:
Consumer satisfaction is an essential condition of competitiveness and profitable operation, and one of its defining elements is the relationship between perceived and expected quality. Quality expectations have also been formulated for the internet, one of today's defining channels, which is why defining online service quality, and with it measuring online customer satisfaction, has gained a significant role. The aim of this study is to provide a literature review on the topic, to examine the E-S-QUAL and E-RecS-QUAL scales known from the literature for measuring online customer satisfaction, to test their validity under Hungarian conditions, and, by making the modifications that prove necessary, to create a scale usable in Hungary. As the basis for measuring online customer satisfaction, the study surveys theories of how consumers perceive and evaluate online service quality, and then presents the various measurement methods, giving a prominent role to the E-S-QUAL and E-RecS-QUAL scales, which are among the most widely applied methods. The review centres on websites where purchases can be made, and I carried out the research among the customers of one of the major Hungarian online bookshops. ______ Over the last decade the business-to-consumer online market has been growing very fast. In the marketing literature many studies have focused on understanding and measuring e-service quality (e-sq) and online-customer satisfaction. The aim of the study is to summarize these concepts, analyse the relationship between e-sq and customers' loyalty, which increases the competitiveness of companies, and to create a valid and reliable scale for the Hungarian market for measuring online-customer satisfaction. The basis of the empirical study is the E-S-QUAL scale and its companion, the E-RecS-QUAL, widely used multiple-item scales measuring e-sq along seven dimensions: efficiency, system availability, fulfilment, privacy, responsiveness, compensation, and contact. The study focuses on the websites customers use to shop online.
Abstract:
This dissertation examines the consequences of Electronic Data Interchange (EDI) use on interorganizational relations (IR) in the retail industry. EDI is a type of interorganizational information system that facilitates the exchange of business documents in structured, machine-processable form. The research model links EDI use and three IR dimensions: structural, behavioral, and outcome. Based on relevant literature from organizational theory and marketing channels, fourteen hypotheses were proposed for the relationships among EDI use and the three IR dimensions. Data were collected through self-administered questionnaires from key informants in 97 retail companies (19% response rate). The hypotheses were tested using multiple regression analysis. The analysis supports the following hypotheses: (a) EDI use is positively related to information intensity and formalization, (b) formalization is positively related to cooperation, (c) information intensity is positively related to cooperation, (d) conflict is negatively related to performance and satisfaction, (e) cooperation is positively related to performance, and (f) performance is positively related to satisfaction. The results support the general premise of the model that the relationship between EDI use and satisfaction among channel members has to be viewed within an interorganizational context. Research on EDI is still in a nascent stage. By identifying and testing relevant interorganizational variables, this study offers insights for practitioners managing boundary-spanning activities in organizations using or planning to use EDI. Further, the thesis provides avenues for future research aimed at understanding the consequences of this interorganizational information technology.
Abstract:
Knowledge-based radiation treatment is an emerging concept in radiotherapy. It mainly refers to techniques that can guide or automate treatment planning in the clinic by learning from prior knowledge. Different models have been developed to realize it, one of which was proposed by Yuan et al. at Duke for lung IMRT planning. This model can automatically determine both beam configuration and optimization objectives with non-coplanar beams, based on patient-specific anatomical information. Although plans automatically generated by this model demonstrate equivalent or better dosimetric quality compared to clinically approved plans, its validity and generality are limited by the empirical assignment of a coefficient called the angle spread constraint, defined in the beam efficiency index used for beam ranking. To eliminate these limitations, a systematic study of this coefficient is needed to acquire evidence for its optimal value.
To achieve this purpose, eleven lung cancer patients with complex tumor shapes, whose clinically approved plans used non-coplanar beams, were retrospectively studied within the framework of the automatic lung IMRT treatment algorithm. The primary and boost plans used in three patients were treated as different cases because of their different target sizes and shapes. A total of 14 lung cases were thus re-planned using the knowledge-based automatic lung IMRT planning algorithm, varying the angle spread constraint from 0 to 1 in increments of 0.2. A modified beam angle efficiency index was adopted to navigate the beam selection. Great effort was made to keep the quality of the plans associated with every angle spread constraint as high as possible. Important dosimetric parameters for the PTV and OARs, quantitatively reflecting plan quality, were extracted from the DVHs and analyzed as a function of the angle spread constraint for each case. These parameters were compared between clinical plans and model-based plans using two-sample Student's t-tests, and a regression analysis was performed on a composite index, built on the percentage errors between the dosimetric parameters in the model-based plans and those in the clinical plans, as a function of the angle spread constraint.
Results show that model-based plans generally have equivalent or better quality than clinically approved plans, both qualitatively and quantitatively. All dosimetric parameters except those for the lungs are statistically better in, or comparable between, the automatically generated plans and the clinical plans. On average, reductions of more than 15% in the conformity index and homogeneity index for the PTV and in V40 and V60 for the heart are observed, alongside increases of 8% and 3% in V5 and V20 for the lungs, respectively. The intra-plan comparison among model-based plans demonstrates that plan quality does not change much for angle spread constraints larger than 0.4. Further examination of the variation of the composite index as a function of the angle spread constraint shows that 0.6 is the optimal value, yielding statistically the best achievable plans.
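The abstract does not reproduce the exact definition of the composite index; one plausible form, stated here purely as an illustration (the thesis fixes the actual weighting), is the mean absolute percentage error over the N monitored dosimetric parameters:

\[
\mathrm{CI}(c) \;=\; \frac{1}{N} \sum_{i=1}^{N} \left| \frac{P_i^{\mathrm{model}}(c) - P_i^{\mathrm{clinical}}}{P_i^{\mathrm{clinical}}} \right| \times 100\%,
\]

where P_i ranges over the PTV and OAR parameters listed above and c is the angle spread constraint.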
Testing a gravity-based accessibility instrument to engage stakeholders into integrated LUT planning
Abstract:
The paper starts from the concern that, while there is a large body of literature focusing on the theoretical definitions and measurements of accessibility, the extent to which such measures are used in planning practice is less clear. Previous reviews of accessibility instruments have in fact identified a gap between the clear theoretical assumptions and the infrequent application of accessibility instruments in spatial and transport planning. In this paper we present the results of a structured workshop involving private and public stakeholders to test the usability of gravity-based accessibility measures (GraBaM) for assessing integrated land-use and transport policies. The research is part of the COST Action TU1002 “Accessibility Instruments for Planning Practice”, during which different accessibility instruments were tested on different case studies. Here we report on the empirical case study of Rome.
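Gravity-based measures of the kind the workshop tested share a standard general form; the notation below is the textbook formulation, not necessarily the exact calibration used in GraBaM:

\[
A_i \;=\; \sum_j O_j \, f(c_{ij}), \qquad \text{e.g. } f(c_{ij}) = e^{-\beta c_{ij}},
\]

where A_i is the accessibility of zone i, O_j the opportunities located in zone j, c_ij the generalized travel cost between the zones, and f a distance-decay (impedance) function with calibrated parameter beta.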
Abstract:
In establishing the reliability of performance-related design methods for concrete, which are relevant for resistance against chloride-induced corrosion, long-term experience of local materials and practices and detailed knowledge of the ambient and local micro-climate are critical. Furthermore, in the development of analytical models for performance-based design, calibration against test data representative of actual conditions in practice is required. To this end, the current study presents results from full-scale concrete pier-stems under long-term exposure to a marine environment, with the work focussing on XS2 (below mid-tide level), in which the concrete is regarded as fully saturated, and XS3 (tidal, splash and spray), in which the concrete is in an unsaturated condition. These exposures represent the zones where concrete structures are most susceptible to ionic ingress and deterioration. Chloride profiles and chloride transport behaviour are studied using both an empirical model (erfc function) and a physical model (ClinConc). The time dependency of the surface chloride concentration (Cs) and the apparent diffusivity (Da) was established for the empirical model, whereas in the ClinConc model (originally based on saturated concrete) two new environmental factors were introduced for the XS3 environmental exposure zone. Although XS3 is considered a single environmental exposure zone according to BS EN 206-1:2013, the work has highlighted that even within this zone significant changes in chloride ingress are evident. This study aims to update the parameters of both models for predicting the long-term transport behaviour of concrete subjected to environmental exposure classes XS2 and XS3.
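The empirical model referred to is the standard error-function solution of Fick's second law; assuming negligible initial chloride content, and with the study's time-dependent surface concentration and apparent diffusivity entering as Cs(t) and Da(t), it reads

\[
C(x,t) \;=\; C_s(t)\,\operatorname{erfc}\!\left( \frac{x}{2\sqrt{D_a(t)\,t}} \right),
\]

where C(x,t) is the chloride content at depth x after exposure time t; the exact fitting details are in the paper.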
Abstract:
The shift from decentralized to centralized A-level examinations (Abitur) was implemented in the German school system as a measure of Educational Governance in the last decade. This reform was mainly introduced with the intention of providing higher comparability of school examinations and student achievement, as well as increasing fairness in school examinations. It is not yet known whether these ambitious aims and functions of the new centralized examination format have been achieved, and whether fairer assessment can be guaranteed in terms of providing all students with the same opportunity to pass the examinations by allocating fair tests to different student subpopulations, e.g., students of different backgrounds or gender. The research presented in this article deals with these questions and focuses on gender differences. It investigates the gender-specific fairness of the test items in centralized Abitur examinations as high school exit examinations in Germany. The data are drawn from Abitur examinations in English (as a foreign language). Differential item functioning (DIF) analysis reveals that at least some parts of the examinations indicate gender inequality. (DIPF/Orig.)
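The article's exact DIF procedure is not specified in the abstract; a common screen for gender DIF on a single dichotomous item is the Mantel-Haenszel test over ability strata, sketched below with invented counts (statsmodels provides the stratified-table machinery).

    # Mantel-Haenszel DIF screen for one item: stratify examinees by total
    # score, build a 2x2 table per stratum (gender x item correct/incorrect),
    # then test whether the common odds ratio differs from 1.
    import numpy as np
    from statsmodels.stats.contingency_tables import StratifiedTable

    tables = [np.array([[30, 10], [22, 18]]),   # stratum 1: rows = male/female
              np.array([[45, 5], [40, 12]])]    # stratum 2: cols = right/wrong
    st = StratifiedTable(tables)
    print(st.oddsratio_pooled)          # far from 1 suggests DIF
    print(st.test_null_odds().pvalue)   # MH chi-square test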
Abstract:
Doctorate in Economics
Abstract:
One of the current challenges in model-driven engineering is enabling effective collaborative modelling. Two common approaches are either storing the models in a central repository, or keeping them under a traditional file-based version control system and building a centralized index for model-wide queries. Either way, special attention must be paid to the nature of these repositories and indexes as networked services: they should remain responsive even with an increasing number of concurrent clients. This paper presents an empirical study on the impact of certain key decisions on the scalability of concurrent model queries, using an Eclipse Connected Data Objects model repository and a Hawk model index. The study evaluates the impact of the network protocol, the API design and the internal caching mechanisms, and analyzes the reasons for their varying performance.
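The study's own benchmarking harness is not described in the abstract; the snippet below is only a generic sketch of how one might measure query latency against a networked model index as client concurrency grows. The endpoint URL is a placeholder, not the CDO or Hawk API.

    # Generic concurrency probe: fire N simultaneous queries at a networked
    # model-query endpoint and report the mean round-trip latency.
    import time
    import urllib.request
    from concurrent.futures import ThreadPoolExecutor

    URL = "http://localhost:8080/query"  # placeholder endpoint

    def timed_query(url):
        start = time.perf_counter()
        urllib.request.urlopen(url).read()
        return time.perf_counter() - start

    for clients in (1, 2, 4, 8, 16, 32):
        with ThreadPoolExecutor(max_workers=clients) as pool:
            latencies = list(pool.map(timed_query, [URL] * (clients * 10)))
        print(clients, "clients:", sum(latencies) / len(latencies), "s")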
An empirical investigation of the impact of global energy transition on Nigerian oil and gas exports
Abstract:
18-month embargo on the thesis; see the appendix for copyright materials.
Abstract:
Frame. Assessing the difficulty of source texts and parts thereof is important in CTIS, whether for research comparability, for didactic purposes or for setting price differences in the market. In order to measure it empirically, Campbell & Hale (1999) and Campbell (2000) developed the Choice Network Analysis (CNA) framework. Basically, the CNA's main hypothesis is that the more translation options (a group of) translators have to render a given source-text stretch, the higher the difficulty of that stretch will be. We will call this the CNA hypothesis. In a nutshell, this research project puts the CNA hypothesis to the test and studies whether it does actually measure difficulty. Data collection. Two groups of participants (n=29) of different profiles and from two universities in different countries had three translation tasks keylogged with Inputlog and filled in pre- and post-translation questionnaires. Participants translated from English (L2) into their L1s (Spanish or Italian) and worked, first in class and then at home, using their own computers, on texts ca. 800–1000 words long. Each text was translated in approximately equal halves in two 1-hour sessions, in three consecutive weeks. Only the parts translated at home were considered in the study. Results. A very different picture emerged from the data than the one the CNA hypothesis would predict: there was no prevalence of disfluent task segments when there were many translation options, nor a prevalence of fluent task segments associated with fewer translation options. Indeed, there was no correlation between the number of translation options (many or few) and behavioral fluency. Additionally, there was no correlation between pauses and either behavioral fluency or typing speed. The theoretical flaws discussed and the empirical evidence lead to the conclusion that the CNA framework does not and cannot measure text and translation difficulty.
Abstract:
The aim of this study was to estimate barite mortar attenuation curves using X-ray spectra weighted by a workload distribution. A semi-empirical model was used to evaluate the transmission properties of this material. Since the ambient dose equivalent, H*(10), is the radiation quantity adopted by the IAEA for dose assessment, the variation of H*(10) as a function of barite mortar thickness was calculated using primary experimental spectra. A CdTe detector was used to measure these spectra. The resulting spectra were adopted for estimating the optimal thickness of the protective barrier needed for shielding an area in an X-ray imaging facility.
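As a rough illustration of the spectrum-weighted attenuation idea (not the paper's semi-empirical model, and ignoring scatter and build-up), each spectral bin can be attenuated exponentially and the transmitted spectrum collapsed into H*(10) via fluence-to-dose conversion coefficients; mu(E) and h(E) below are assumed tabulated inputs.

    # Narrow-beam sketch: H*(10) behind a barite mortar slab of thickness d,
    # from a fluence spectrum phi, linear attenuation coefficients mu (1/cm)
    # and fluence-to-H*(10) conversion coefficients h, all on one energy grid.
    import numpy as np

    def hstar10(phi, mu, h, d_cm):
        return np.sum(phi * np.exp(-mu * d_cm) * h)

    # transmission curve: dose behind thickness d relative to no barrier
    # T(d) = hstar10(phi, mu, h, d) / hstar10(phi, mu, h, 0.0)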
Abstract:
Primary X-ray spectra were measured in the range of 80–150 kV in order to validate a computer program based on a semi-empirical model. The ratio between the characteristic and total air kerma was used to compare computed results and experimental data. The results show that the experimental spectra have a higher first HVL and mean energy than the calculated ones. The ratios between the characteristic and total air kerma for the calculated spectra are in good agreement with the experimental results for all filtrations used.
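For reference, the two spectrum summaries being compared, mean energy and first HVL, can be computed from a binned spectrum as sketched below; the kerma weighting (fluence x energy x mass energy-absorption coefficient) and the attenuator data are assumed inputs, not the validated program's internals.

    # Mean energy and first HVL of a binned spectrum phi(E). mu_en_rho are
    # air mass energy-absorption coefficients; mu_att are linear attenuation
    # coefficients of the added attenuator (e.g. aluminium), both on grid E.
    import numpy as np

    def mean_energy(E, phi):
        return np.sum(E * phi) / np.sum(phi)

    def air_kerma(E, phi, mu_en_rho):
        return np.sum(phi * E * mu_en_rho)

    def first_hvl(E, phi, mu_en_rho, mu_att, thicknesses):
        k0 = air_kerma(E, phi, mu_en_rho)
        for d in thicknesses:  # increasing attenuator thicknesses
            if air_kerma(E, phi * np.exp(-mu_att * d), mu_en_rho) <= k0 / 2:
                return d  # first thickness that halves the air kerma
        return None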
Abstract:
Size distributions in woody plant populations have been used to assess their regeneration status, under the assumption that size structures with reverse-J shapes represent stable populations. We present an empirical approach to this issue using five woody species from the Cerrado. Considering count data for all plants of these five species over a 12-year period, we analyzed size distributions by: a) plotting frequency distributions and their adjustment to the negative exponential curve, and b) calculating the Gini coefficient. To look for a relationship between size structure and future trends, we considered the size structures from the first census year. We analyzed changes in numbers over time and performed a simple population viability analysis, which gives the mean population growth rate, its variance, and the probability of extinction in a given time period. Frequency distributions and the Gini coefficient were not able to predict future trends in population numbers. We recommend that managers not use measures of size structure as a basis for management decisions without applying more appropriate demographic studies.
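As a pointer to the second summary statistic used, the sketch below computes the Gini coefficient of a set of plant sizes from the standard rank formula; the sample values are invented.

    # Gini coefficient of a size distribution: 0 = all individuals equal,
    # values near 1 = sizes concentrated in a few large individuals.
    import numpy as np

    def gini(sizes):
        x = np.sort(np.asarray(sizes, dtype=float))
        n = x.size
        ranks = np.arange(1, n + 1)
        return 2 * np.sum(ranks * x) / (n * np.sum(x)) - (n + 1) / n

    print(gini([1, 1, 2, 3, 5, 8, 13]))  # invented stem diameters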