796 results for Empirical Algorithm Analysis
Abstract:
Being at risk is a growing problem in the U.S. because of disturbing societal trends such as unemployment, divorce, substance abuse, child abuse and neglect, and the new threat of terrorist violence. Resilience characterizes individuals who rebound from or adapt to such adversities, and academic resilience distinguishes at-risk students who succeed in school despite hardships.

The purpose of this research was to perform a meta-analysis to examine the power of resilience and to suggest ways educators might improve academic resilience, which was operationalized by satisfactory test scores and grades. To find all studies relevant to academic resilience in at-risk kindergarten through 12th-grade students, extensive electronic and hard-copy searches were conducted, resulting in a database of 421 articles. Two hundred eighty-seven of these were rejected quickly because they were not empirical research. Upon further examination, another 106 were rejected for not meeting study protocol criteria. Ultimately, 28 studies were coded for study-level descriptors and effect size variables.

Protective factors for resilience were found to originate in physical, psychological, and behavioral domains at the proximal/intraindividual, transitional/intrafamilial, or distal/extrafamilial level. Effect sizes (ESs) were weighted, and the means for each level or category were interpreted against commonly accepted benchmarks. Mean effect sizes for the proximal (M = .27) and transitional (M = .15) levels were small but significant. The mean effect size for the distal level was not significant. This supported the hypotheses that the proximal level was the source of most protective factors for academic resilience in at-risk students, followed by the transitional level. The distal effect size warranted further research, particularly in light of the small number of studies (n = 11) contributing effect sizes to that category. A homogeneity test indicated that a search for moderators, i.e., study variables affecting outcomes, was justified. "Category" was the largest moderator. Graphs of weighted mean effect sizes in the physical, psychological, and behavioral domains were plotted for each level to better illustrate the findings of the meta-analysis. Suggestions were made for combining resilience development with aspects of positive psychology to promote resilience in the schools.
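As context for the calculations the abstract refers to, the sketch below illustrates the standard fixed-effect steps: inverse-variance weighting of effect sizes and the Q homogeneity test whose rejection motivates a moderator search. The effect sizes and variances are hypothetical placeholders, not values from the 28 coded studies.

```python
# Minimal sketch of fixed-effect meta-analysis: inverse-variance weighted
# mean effect size and the Q homogeneity statistic. All numbers are made up
# for illustration.
import numpy as np
from scipy.stats import chi2

es = np.array([0.31, 0.22, 0.40, 0.15, 0.27])         # per-study effect sizes (hypothetical)
var = np.array([0.010, 0.015, 0.020, 0.012, 0.018])   # their sampling variances (hypothetical)

w = 1.0 / var                          # inverse-variance weights
es_bar = np.sum(w * es) / np.sum(w)    # weighted mean effect size
se_bar = np.sqrt(1.0 / np.sum(w))      # standard error of the weighted mean

# Q ~ chi-square with k-1 df under the null of a single common effect;
# a significant Q justifies searching for moderators.
q = np.sum(w * (es - es_bar) ** 2)
p_q = chi2.sf(q, df=len(es) - 1)

print(f"weighted mean ES = {es_bar:.3f} (SE {se_bar:.3f}), Q = {q:.2f}, p = {p_q:.3f}")
```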
Do immigrant outflows lead to native inflows? An empirical analysis of the migratory responses to US
Abstract:
Ongoing debates within the professional and academic communities have raised a number of questions specific to the international audit market. This dissertation consists of three related essays that address such issues. First, I examine whether the propensity to switch between auditors of different sizes (i.e., Big 4 versus non-Big 4) changes as adoption of International Financial Reporting Standards (IFRS) becomes a more common phenomenon, arguing that smaller auditors have an opportunity to invest in the skills and training needed to enter this market. Findings suggest that clients are relatively less (more) likely to switch to (away from) a Big 4 auditor if the client's adoption of IFRS occurs in more recent years.

In the second essay, I draw on these inferences and test whether the change in audit fees in the year of IFRS adoption changes over time. As the market becomes less concentrated, larger auditors become less able to demand a premium for their services. Consistent with my arguments, results suggest that the change in audit service fees declines over time, although this effect seems concentrated among the Big 4. I also find that this effect is partially attributable to a differential effect of the auditors' experience in pricing audit services related to IFRS, based on the period in which adoption occurs. The results of these two essays offer important implications for policy debates on the costs and benefits of IFRS adoption.

In the third essay, I differentiate Big 4 auditors into three classifications (Parent firms, Brand Name affiliates, and Local affiliates) and test for differences in audit fee premiums (relative to non-Big 4 auditors) and audit quality. Results suggest that there is significant heterogeneity between the three classifications on both of these characteristics, which is an important consideration for future research. Overall, this dissertation provides additional insights into a variety of aspects of the global audit market.
Abstract:
After a series of major storms over the last 20 years, the financing of U.S. natural disaster insurance has undergone substantial disruptions, leaving many federal and state-backed programs covering residential property damage severely underfunded. To regain actuarial soundness, policy makers have proposed a shift to a system that reflects risk-based pricing for property insurance. We examine survey responses from 1,394 single-family homeowners in the state of Florida regarding support for several natural disaster mitigation policy reforms. Utilizing a partial proportional odds model, we test for effects of location, risk perception, and socio-economic and housing characteristics on support for policy reforms. Our findings suggest that residents across the state, not just risk-prone homeowners, support the current subsidized model. We also examine several other policy questions from the survey to verify our initial results. Finally, the implications of our findings are discussed to provide input to policymakers.
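To my knowledge, common Python libraries do not ship a partial proportional odds estimator directly, so the hedged sketch below illustrates the underlying idea with separate cumulative binary logits, one per cutpoint of an ordinal support scale. Covariates whose coefficients differ noticeably across cutpoints are the ones a partial proportional odds model would free from the equal-slopes constraint. The covariate names, outcome coding, and data are hypothetical, not the Florida survey.

```python
# Cumulative-logit view behind a partial proportional odds model: fit one
# binary logit per cutpoint and compare coefficients across cutpoints.
# All variables and data below are synthetic illustrations.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 500
df = pd.DataFrame({
    "coastal": rng.integers(0, 2, n),          # 1 = risk-prone coastal location (hypothetical)
    "risk_perception": rng.normal(0, 1, n),    # standardized risk-perception score (hypothetical)
    "income": rng.normal(0, 1, n),             # standardized household income (hypothetical)
})
# Ordinal support for risk-based pricing reform: 0 = oppose, 1 = neutral, 2 = support
latent = 0.4 * df["coastal"] - 0.3 * df["risk_perception"] + rng.logistic(size=n)
support = pd.cut(latent, bins=[-np.inf, -0.5, 0.5, np.inf], labels=False)

X = sm.add_constant(df)
for j in (0, 1):                               # cutpoints: support > 0, support > 1
    fit = sm.Logit((support > j).astype(int), X).fit(disp=False)
    print(f"P(support > {j}):")
    print(fit.params.round(3))
```

Coefficients that stay roughly constant across the two logits are consistent with the proportional odds assumption; the partial model relaxes it only where it fails.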
Abstract:
In longitudinal data analysis, our primary interest is in the regression parameters for the marginal expectations of the longitudinal responses; the longitudinal correlation parameters are of secondary interest. The joint likelihood function for longitudinal data is challenging, particularly for correlated discrete outcome data. Marginal modeling approaches such as generalized estimating equations (GEEs) have received much attention in the context of longitudinal regression. These methods are based on estimates of the first two moments of the data and a working correlation structure, with confidence regions and hypothesis tests based on asymptotic normality. The methods are sensitive to misspecification of the variance function and the working correlation structure; because of such misspecifications, the estimates can be inefficient and inconsistent, and inference may give incorrect results. To overcome this problem, we propose an empirical likelihood (EL) procedure based on a set of estimating equations for the parameter of interest and discuss its characteristics and asymptotic properties. We also provide an algorithm, based on EL principles, for estimating the regression parameters and constructing a confidence region for the parameter of interest. We extend our approach to variable selection for high-dimensional longitudinal data with many covariates. In this situation it is necessary to identify a submodel that adequately represents the data, since including redundant variables may impair the model's accuracy and efficiency for inference. We propose a penalized empirical likelihood (PEL) variable selection procedure based on GEEs, in which variable selection and coefficient estimation are carried out simultaneously. We discuss its characteristics and asymptotic properties and present an algorithm for optimizing the PEL. Simulation studies show that when the model assumptions are correct, our method performs as well as existing methods, and when the model is misspecified, it has clear advantages. We have applied the method to two case examples.
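A minimal sketch of the empirical likelihood machinery the abstract builds on, specialized to a single estimating equation g(x, mu) = x - mu for a scalar mean: profile out the Lagrange multiplier and use the fact that -2 log R(mu) is asymptotically chi-square(1) to test or bound mu without a working variance model. The data and tested value are hypothetical; the GEE-based EL and its penalized variant use vector-valued estimating equations and a penalty term not shown here.

```python
# Empirical likelihood ratio for a scalar mean via one estimating equation.
# Synthetic data; this is a sketch of the EL idea, not the thesis's algorithm.
import numpy as np
from scipy.optimize import brentq
from scipy.stats import chi2

def el_log_ratio(x, mu):
    """-2 log empirical likelihood ratio for H0: E[X] = mu."""
    g = x - mu                                   # estimating function g(x_i, mu)
    if g.min() >= 0 or g.max() <= 0:             # mu lies outside the convex hull of the data
        return np.inf
    # Solve sum_i g_i / (1 + lam * g_i) = 0 for the Lagrange multiplier lam,
    # keeping the implied weights p_i = 1 / (n * (1 + lam * g_i)) positive.
    eps = 1e-8
    lo, hi = -1.0 / g.max() + eps, -1.0 / g.min() - eps
    lam = brentq(lambda l: np.sum(g / (1.0 + l * g)), lo, hi)
    return 2.0 * np.sum(np.log1p(lam * g))

rng = np.random.default_rng(1)
x = rng.poisson(3.0, size=200).astype(float)     # hypothetical count responses
stat = el_log_ratio(x, mu=3.0)
print(f"-2 log R = {stat:.3f}, p = {chi2.sf(stat, df=1):.3f}")
```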
Abstract:
Knowledge-based radiation treatment is an emerging concept in radiotherapy. It mainly refers to techniques that can guide or automate treatment planning in the clinic by learning from prior knowledge. Different models have been developed to realize it, one of which was proposed by Yuan et al. at Duke for lung IMRT planning. This model can automatically determine both the beam configuration and the optimization objectives with non-coplanar beams, based on patient-specific anatomical information. Although plans automatically generated by this model demonstrate equivalent or better dosimetric quality compared to clinically approved plans, its validity and generality are limited by the empirical assignment of a coefficient called the angle spread constraint, defined in the beam efficiency index used for beam ranking. To eliminate these limitations, a systematic study of this coefficient is needed to acquire evidence for its optimal value.

To achieve this purpose, eleven lung cancer patients with complex tumor shapes, whose clinically approved plans used non-coplanar beams, were retrospectively studied within the framework of the automatic lung IMRT treatment planning algorithm. The primary and boost plans used in three patients were treated as different cases because of their different target sizes and shapes. A total of 14 lung cases were thus re-planned using the knowledge-based automatic lung IMRT planning algorithm, varying the angle spread constraint from 0 to 1 in increments of 0.2. A modified beam angle efficiency index was adopted to navigate the beam selection. Great effort was made to ensure that the plan quality associated with every angle spread constraint value was as good as possible. Important dosimetric parameters for the PTV and OARs, quantitatively reflecting plan quality, were extracted from the DVHs and analyzed as a function of the angle spread constraint for each case. These parameters were compared between clinical plans and model-based plans using two-sample Student's t-tests, and a regression analysis was performed on a composite index built from the percentage errors between dosimetric parameters in the model-based plans and those in the clinical plans, as a function of the angle spread constraint.

Results show that model-based plans generally have equivalent or better quality than clinically approved plans, both qualitatively and quantitatively. All dosimetric parameters except those for the lungs are statistically better in the automatically generated plans than, or comparable to, those in the clinical plans. On average, reductions of more than 15% in the conformity and homogeneity indices for the PTV and in V40 and V60 for the heart are observed, together with increases of 8% and 3% in V5 and V20 for the lungs, respectively. The intra-plan comparison among model-based plans demonstrates that plan quality does not change much for angle spread constraints larger than 0.4. Further examination of the variation curve of the composite index as a function of the angle spread constraint shows that 0.6 is the optimal value, yielding statistically the best achievable plans.
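The sketch below illustrates, under stated assumptions, the shape of the comparison described: for each angle spread constraint value, model-based plans are compared with clinical plans via a two-sample t-test on a dosimetric parameter, and a composite index is formed from percentage errors. All numbers are synthetic placeholders, not results from the 14 re-planned cases.

```python
# Synthetic walk-through of per-constraint plan comparison: two-sample t-test
# on a dosimetric parameter plus a simple percentage-error composite index.
import numpy as np
from scipy.stats import ttest_ind

rng = np.random.default_rng(2)
n_cases = 14
clinical_ci = rng.normal(0.80, 0.05, n_cases)            # clinical-plan conformity index (synthetic)

for asc in np.arange(0.0, 1.01, 0.2):                    # angle spread constraint from 0 to 1, step 0.2
    model_ci = rng.normal(0.83, 0.05, n_cases)            # model-based plan CI at this constraint (synthetic)
    t, p = ttest_ind(model_ci, clinical_ci)               # two-sample t-test, model vs. clinical
    pct_err = 100.0 * (model_ci - clinical_ci) / clinical_ci
    composite = np.mean(pct_err)                           # one possible composite index over parameters/cases
    print(f"ASC={asc:.1f}: t={t:+.2f}, p={p:.3f}, mean % error={composite:+.1f}")
```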
Abstract:
We discuss the interactions among the various phases of network research design in the context of our current work using mixed methods and social network analysis (SNA) to study networks and rural economic development. We claim that there are very intricate interdependencies among the various phases of network research design, from theory and the formulation of research questions through to modes of analysis and interpretation. Through examples drawn from our work, we illustrate how choices about sampling and data collection methods are influenced by these interdependencies.
Abstract:
Many studies have shown the considerable potential for the application of remote-sensing-based methods for deriving estimates of lake water quality. However, the reliable application of these methods across time and space is complicated by the diversity of lake types, sensor configuration, and the multitude of different algorithms proposed. This study tested one operational and 46 empirical algorithms sourced from the peer-reviewed literature that have individually shown potential for estimating lake water quality properties in the form of chlorophyll-a (algal biomass) and Secchi disc depth (SDD) (water transparency) in independent studies. Nearly half (19) of the algorithms were unsuitable for use with the remote-sensing data available for this study. The remaining 28 were assessed using the Terra/Aqua satellite archive to identify the best performing algorithms in terms of accuracy and transferability within the period 2001–2004 in four test lakes, namely Vänern, Vättern, Geneva, and Balaton. These lakes represent the broad continuum of large European lake types, varying in terms of eco-region (latitude/longitude and altitude), morphology, mixing regime, and trophic status. All algorithms were tested for each lake separately and combined to assess the degree of their applicability in ecologically different sites. None of the algorithms assessed in this study exhibited promise when all four lakes were combined into a single data set and most algorithms performed poorly even for specific lake types. A chlorophyll-a retrieval algorithm originally developed for eutrophic lakes showed the most promising results (R2 = 0.59) in oligotrophic lakes. Two SDD retrieval algorithms, one originally developed for turbid lakes and the other for lakes with various characteristics, exhibited promising results in relatively less turbid lakes (R2 = 0.62 and 0.76, respectively). The results presented here highlight the complexity associated with remotely sensed lake water quality estimates and the high degree of uncertainty due to various limitations, including the lake water optical properties and the choice of methods.
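As a concrete illustration of how such empirical algorithms are calibrated and scored, the sketch below fits a simple red/NIR band-ratio chlorophyll-a algorithm (a common functional form for productive lakes) and reports R². The reflectances and in situ concentrations are synthetic placeholders, not data from Vänern, Vättern, Geneva, or Balaton, and the functional form is only one of many tested in the study.

```python
# Calibrate a band-ratio chlorophyll-a algorithm against in situ data and
# score it with R^2. All values below are synthetic.
import numpy as np

rng = np.random.default_rng(3)
n = 60
red = rng.uniform(0.01, 0.05, n)                          # red-band reflectance (synthetic)
nir = red * rng.uniform(0.6, 1.4, n)                      # near-infrared reflectance (synthetic)
chl_insitu = 10.0 * (nir / red) + rng.normal(0, 2, n)     # in situ chlorophyll-a, mg m^-3 (synthetic)

ratio = nir / red
slope, intercept = np.polyfit(ratio, chl_insitu, 1)       # calibrate the empirical algorithm
chl_pred = slope * ratio + intercept

ss_res = np.sum((chl_insitu - chl_pred) ** 2)
ss_tot = np.sum((chl_insitu - chl_insitu.mean()) ** 2)
r2 = 1.0 - ss_res / ss_tot
print(f"band-ratio algorithm: chl = {slope:.2f} * (NIR/red) + {intercept:.2f}, R^2 = {r2:.2f}")
```

Transferability can then be checked by applying the coefficients calibrated on one lake to the others, which is where most algorithms in the study broke down.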
Abstract:
The role of Constitutional Courts in deeply divided societies is complicated by the danger that the salient societal cleavages may influence judicial decision-making and, consequently, undermine judicial independence and impartiality. With reference to the decisions of the Constitutional Court of Bosnia-Herzegovina, this article investigates the influence of ethno-nationalism on judicial behaviour and the extent to which variation in judicial tenure amplifies or dampens that influence. Based on a statistical analysis of an original dataset of the Court’s decisions, we find that the judges do in fact divide predictably along ethno-national lines, at least in certain types of cases, and that these divisions cannot be reduced to a residual loyalty to their appointing political parties. Contrary to some theoretical expectations, however, we find that long-term tenure does little to dampen the influence of ethno-nationalism on judicial behaviour. Moreover, our findings suggest that the longer a judge serves on the Court the more ethno-national affiliation seems to influence her decision-making. We conclude by considering how alternative arrangements for the selection and tenure of judges might help to ameliorate this problem.
Abstract:
Franchising is an important form of organizational control. Possible benefits of franchising include its ability to reduce agency costs that increase with costly monitoring and to provide incentives for the use of local information by onsite managers. However, these benefits may come at a cost, as franchisees may reduce quality by choosing to free ride. While many studies have investigated the reasons for franchising, few have documented the impacts of franchising on unit-level operating performance. Using time-series data from a number of lodging properties that were converted from company control to franchisee control, this study documents the performance impacts of franchising. The analysis reveals that conversion results in a modest decline in financial performance and an immediate sharp decline in quality.
Abstract:
Performance improvements subsequent to the implementation of a pay-for-performance plan can result because more productive employees self-select into the firm (selection effect) and/or because employees allocate effort to become more effective (effort effect). We analyze individual performance data for 3,776 sales employees of a retail firm to evaluate these alternative sources of continuing performance improvement. The incentive plan helps the firm attract and retain more productive sales employees, and motivates these employees to further improve their productivity. In contrast, the less productive sales employees’ performance declines before they leave the firm.
Abstract:
The empirical validity of the claim that overhead costs are driven not by production volume but by transactions resulting from production complexity is examined using data from 32 manufacturing plants in the electronics, machinery, and automobile components industries. Transactions are measured using the number of engineering change orders, the number of purchasing and production planning personnel, shop-floor area per part, and the number of quality control and improvement personnel. Results indicate a strong positive relation between manufacturing overhead costs and both manufacturing transactions and production volume. Most of the variation in overhead costs, however, is explained by measures of manufacturing transactions, not volume.
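A hedged sketch of the comparison implied above: regress plant-level overhead on production volume alone and on transaction measures, and compare the explained variation. The variable names and data are synthetic placeholders, not the 32-plant sample.

```python
# Compare R^2 of a volume-only overhead regression with a transaction-driver
# regression. Synthetic data for illustration only.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(4)
n = 32
df = pd.DataFrame({
    "volume": rng.lognormal(10, 0.5, n),              # production volume (synthetic)
    "eng_changes": rng.poisson(40, n),                # engineering change orders (synthetic)
    "planning_staff": rng.poisson(15, n),             # purchasing/production planning personnel (synthetic)
})
df["overhead"] = (0.02 * df["volume"] + 5e3 * df["eng_changes"]
                  + 1e4 * df["planning_staff"] + rng.normal(0, 5e4, n))

m_volume = sm.OLS(df["overhead"], sm.add_constant(df[["volume"]])).fit()
m_trans = sm.OLS(df["overhead"], sm.add_constant(df[["eng_changes", "planning_staff"]])).fit()
print(f"R^2, volume only:          {m_volume.rsquared:.2f}")
print(f"R^2, transaction drivers:  {m_trans.rsquared:.2f}")
```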
Abstract:
In this thesis, proactive marketing is suggested to be a broader concept than existing research assumes. Although the concept has been mentioned in the context of competitive advantage in previous research, it has not been comprehensively described. This thesis shows that proactive marketing is more than investing in a company's marketing communications. Proactive marketing is described as a three-phase process comprising different customer value identification, creation, and delivery activities. The purpose of proactive marketing is essentially to anticipate and pursue market opportunities that bring value to the company's stakeholders. Ultimately, proactive marketing aims at acting first on the market, shaping the markets, and thus reaching competitive advantage. The proactive marketing process is supported by the structures of an organization. Suitable structures for proactive marketing are identified in the thesis based on existing research and through an empirical analysis. Moreover, proactive marketing is related to two management theories: the dynamic capabilities framework and the empowerment of employees. A dynamic environment requires companies that pursue proactive marketing to change continuously. Dynamic capabilities are considered tools of management that enable companies to create suitable conditions for this constant change. Empowerment of employees is a management practice that creates proactive behaviors in individuals. The empirical analysis is conducted in an online company operating in the rapidly changing marketplace of the Internet. Through the empirical analysis, the thesis identifies in practice how proactiveness manifests in the marketing process of a company, how organizational structures facilitate proactive marketing, and how proactive marketing is managed. The theoretical contribution of this thesis consists of defining the proactive marketing concept comprehensively and providing suggestions for further research related to proactive marketing.