76 results for weighting


Relevance: 10.00%

Abstract:

Background. Cause-of-death statistics are an essential component of health information. Despite improvements, underregistration and misclassification of causes make it difficult to interpret the official death statistics. Objective. To estimate consistent cause-specific death rates for the year 2000 and to identify the leading causes of death and premature mortality in the provinces. Methods. The total number of deaths and population size were estimated using the Actuarial Society of South Africa (ASSA2000) AIDS and demographic model. Cause-of-death profiles based on Statistics South Africa's 15% sample, adjusted for misclassification of deaths due to ill-defined causes and AIDS deaths due to indicator conditions, were applied to the total deaths by age and sex. Age-standardised rates and years of life lost were calculated using age weighting and discounting. Results. Life expectancy in KwaZulu-Natal and Mpumalanga is about 10 years lower than that in the Western Cape, the province with the lowest mortality rate. HIV/AIDS is the leading cause of premature mortality in all provinces. Mortality due to pre-transitional causes, such as diarrhoea, is more pronounced in the poorer and more rural provinces. In contrast, non-communicable disease mortality is similar across all provinces, although the cause profiles differ. Injury mortality rates are particularly high in provinces with large metropolitan areas and in Mpumalanga. Conclusion. The quadruple burden experienced in all provinces requires a broad range of interventions, including improved access to health care; ensuring that basic needs such as those related to water and sanitation are met; disease and injury prevention; and promotion of a healthy lifestyle. High death rates as a result of HIV/AIDS highlight the urgent need to accelerate the implementation of the treatment and prevention plan.
In addition, there is an urgent need to improve the cause-of-death data system to provide reliable cause-of-death statistics at health district level.
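The age-weighted, discounted years-of-life-lost calculation mentioned in the abstract is usually the standard global burden of disease formulation; a minimal sketch under that assumption, using the conventional GBD constants (discount rate r = 0.03, age-weighting parameters β = 0.04 and C = 0.1658, full age weighting K = 1), none of which the abstract spells out:

```python
import math

def yll(deaths, age, life_exp, r=0.03, beta=0.04, C=0.1658, K=1.0):
    """Years of life lost with discounting and GBD-style age weighting.
    Assumed parameterisation: the abstract only states that age
    weighting and discounting were applied.
    age: age at death; life_exp: standard life expectancy at that age."""
    rb = r + beta
    # age-weighted, discounted component (the K part of the formula)
    aw = (C * math.exp(r * age) / rb**2) * (
        math.exp(-rb * (life_exp + age)) * (-rb * (life_exp + age) - 1)
        - math.exp(-rb * age) * (-rb * age - 1)
    )
    # non-age-weighted, discounted component (the 1-K part; r > 0 assumed)
    nw = (1 - math.exp(-r * life_exp)) / r
    return deaths * (K * aw + (1 - K) * nw)
```

With K = 0 this reduces to simple continuous discounting of the remaining life expectancy; with K = 1 it down-weights deaths at very young and very old ages relative to young adulthood.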

Relevance: 10.00%

Abstract:

Assessment has widely been described as being ‘at the centre of the student experience’. It would be difficult to conceive of the modern teaching university without it. Assessment is accepted as one of the most important tools that an educator can deploy to influence both what and how students learn. Evidence suggests that how students allocate time and effort to tasks and to developing an understanding of the syllabus is affected by the method of assessment utilised and the weighting it is given. This is particularly significant in law schools where law students may be more preoccupied with achieving high grades in all courses than their counterparts from other disciplines. However, well-designed assessment can be seen as more than this. It can be a vehicle for encouraging students to learn and engage more broadly than with the minimums required to complete the assessment activity. In that sense assessment need not merely ‘drive’ learning, but can instead act as a catalyst for further learning beyond what a student had anticipated. In this article we reconsider the potential roles and benefits in legal education of a form of interactive classroom learning we term assessable class participation (‘ACP’), both as part of a pedagogy grounded in assessment and learning theory, and as a platform for developing broader autonomous approaches to learning amongst students. We also consider some of the barriers students can face in ACP and the ways in which teacher approaches to ACP can positively affect the socio-emotional climates in classrooms and thus reduce those barriers. We argue that the way in which a teacher facilitates ACP is critical to the ability to develop positive emotional and learning outcomes for law students, and for teachers themselves.

Relevance: 10.00%

Abstract:

In providing simultaneous information on expression profiles for thousands of genes, microarray technologies have, in recent years, been widely used to investigate mechanisms of gene expression. Clustering and classification of such data can, indeed, highlight patterns and provide insight into biological processes. A common approach is to consider the genes and samples of microarray datasets as nodes in a bipartite graph, with edges weighted, e.g., based on the expression levels. In this paper, using a previously-evaluated weighting scheme, we focus on search algorithms and evaluate, in the context of biclustering, several variations of Genetic Algorithms. We also introduce a new heuristic, “Propagate”, which consists of recursively evaluating neighbour solutions with one more or one fewer active condition. The results obtained on three well-known datasets show that, for a given weighting scheme, optimal or near-optimal solutions can be identified.
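The “Propagate” heuristic is described only at a high level; a minimal sketch under an assumed objective (mean edge weight of the selected sub-matrix), where both the scoring rule and the greedy recursion order are illustrative, not the paper's implementation:

```python
import numpy as np

def score(W, rows, cols):
    # assumed objective: mean edge weight of the selected sub-matrix
    if not rows or not cols:
        return 0.0
    return W[np.ix_(sorted(rows), sorted(cols))].mean()

def propagate(W, rows, cols):
    """Recursively evaluate neighbour solutions with one more or one
    fewer active condition, following the first improving move."""
    best = score(W, rows, cols)
    for c in range(W.shape[1]):
        cand = set(cols) ^ {c}          # toggle condition c in/out
        if cand and score(W, rows, cand) > best:
            return propagate(W, rows, cand)
    return rows, set(cols), best
```

On a toy expression matrix with a clear bicluster, the recursion drops the weakly weighted condition and keeps the dense block.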

Relevance: 10.00%

Abstract:

This thesis in software engineering presents a novel automated framework to identify similar operations utilized by multiple algorithms for solving related computing problems. It provides a new, effective solution for multi-application algorithm analysis, employing fundamentally lightweight static analysis techniques compared with state-of-the-art approaches. Significant performance improvements are achieved across the target algorithms by enhancing the efficiency of the identified similar operations, targeting discrete application domains.

Relevance: 10.00%

Abstract:

Because brain structure and function are affected in neurological and psychiatric disorders, it is important to disentangle the sources of variation in these phenotypes. Over the past 15 years, twin studies have found evidence for both genetic and environmental influences on neuroimaging phenotypes, but considerable variation across studies makes it difficult to draw clear conclusions about the relative magnitude of these influences. Here we performed the first meta-analysis of structural MRI data from 48 studies on >1,250 twin pairs, and diffusion tensor imaging data from 10 studies on 444 twin pairs. The proportion of total variance accounted for by genes (A), shared environment (C), and unshared environment (E) was calculated by averaging the A, C, and E estimates across studies from independent twin cohorts and weighting by sample size. The results indicated that additive genetic estimates were significantly different from zero for all meta-analyzed phenotypes, with the exception of fractional anisotropy (FA) of the callosal splenium, and cortical thickness (CT) of the uncus, left parahippocampal gyrus, and insula. For many phenotypes there was also a significant influence of C. We now have good estimates of heritability for many regional and lobar CT measures, in addition to the global volumes. Confidence intervals are wide and the numbers of individuals small for many of the other phenotypes. In conclusion, while our meta-analysis shows that imaging measures are strongly influenced by genes, and that novel phenotypes such as CT measures, FA measures, and brain activation measures look especially promising, replication across independent samples and demographic groups is necessary.
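The pooling rule described above, averaging A, C, and E estimates across independent cohorts with sample-size weights, reduces to a weighted mean; a minimal sketch (the tuple layout is illustrative):

```python
def weighted_ace(estimates):
    """estimates: list of (A, C, E, n_pairs) tuples, one per
    independent twin cohort. Returns the sample-size-weighted
    mean A, C, and E, as described in the abstract."""
    total = sum(n for *_, n in estimates)
    A = sum(a * n for a, _, _, n in estimates) / total
    C = sum(c * n for _, c, _, n in estimates) / total
    E = sum(e * n for _, _, e, n in estimates) / total
    return A, C, E
```

A larger cohort thus pulls the pooled estimate toward its own A/C/E decomposition in proportion to its number of twin pairs.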

Relevance: 10.00%

Abstract:

In 2013 the OECD released its 15-point Action Plan to deal with base erosion and profit shifting (BEPS). In that plan it was recognised that BEPS has a significant effect on developing countries, because the lack of tax revenue can lead to a critical underfunding of public investment that would help promote economic growth. To this end, the BEPS project is aimed at ensuring an inclusive approach that takes into account not only the views of the G20 and OECD countries but also the perspective of developing nations. With this focus in mind, and in the context of developing nations, the purpose of this article is to consider a possible solution to the profit shifting which occurs under the current transfer pricing regime, that solution being unitary taxation with formulary apportionment. It does so using the finance sector as a specific case for application. Multinational financial institutions (MNFIs) play a significant role in financing the activities of their clients in developing nations. Consistent with the ‘follow-the-client’ phenomenon which explains financial institution expansion, these entities are increasingly profiting from activities associated with this growing market. Further, MNFIs are not only persistent users of tax havens but also have, more than other industries, opportunities to reduce tax through transfer pricing measures. This article establishes a case for an industry-specific adoption of unitary taxation with formulary apportionment as a viable alternative to the current regime. It argues that such a model would benefit not only developed nations but also developing nations, which are currently suffering the effects of BEPS. In doing so, it considers the practicalities of such an implementation by examining both definitional issues and a possible formula for MNFIs.
This article argues that, while there would be implementation difficulties to overcome, the current domestic models of formulary apportionment provide important guidance as to how the unitary business and business activities of MNFIs should be defined as well as factors that should be included in an allocation formula, along with the appropriate weighting. While it would be difficult for developing nations to adopt such a regime, it is argued that it would be no more difficult than addressing issues they face with the current transfer pricing regime. As such, this article concludes that unitary taxation with formulary apportionment is a viable industry specific alternative for MNFIs which would assist developing nations and aid independent fiscal soundness.
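A formulary apportionment of the kind discussed can be sketched as a weighted sum of factor shares. The equal-weight three-factor split below (sales, payroll, assets) is purely illustrative; the article's point is precisely that the factors and weights appropriate for MNFIs need industry-specific definition:

```python
def apportion(group_profit, factors, weights):
    """Allocate a unitary group's profit across jurisdictions.
    factors: {jurisdiction: {"sales": s, "payroll": p, "assets": a}}
    weights: {"sales": ws, "payroll": wp, "assets": wa}, summing to 1.
    Factor names and weights are illustrative assumptions."""
    totals = {k: sum(f[k] for f in factors.values()) for k in weights}
    allocation = {}
    for j, f in factors.items():
        # jurisdiction's share = weighted mean of its factor shares
        share = sum(w * f[k] / totals[k] for k, w in weights.items())
        allocation[j] = group_profit * share
    return allocation
```

Because the weights sum to one and each factor share sums to one across jurisdictions, the allocated amounts always add back to the group profit, which is the feature that removes the incentive to shift profit between the group's own entities.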

Relevance: 10.00%

Abstract:

Multinational financial institutions (MNFIs) play a significant role in financing the activities of their clients in developing nations. Consistent with the ‘follow-the-customer’ phenomenon which explains financial institution expansion, these entities are increasingly profiting from activities associated with this growing market. However, MNFIs are not only persistent users of tax havens but also have, more than other industries, the opportunity to reduce tax through transfer pricing measures. This paper establishes a case for an industry-specific adoption of unitary taxation with formulary apportionment as a viable alternative to the current regime. In doing so, it considers the practicalities of implementing this by examining both definitional issues and possible formulas for MNFIs. This paper argues that, while there would be implementation difficulties to overcome, the current domestic models of formulary apportionment provide important guidance as to how the unitary business and business activities of MNFIs should be defined, as well as the factors that should be included in an allocation formula, and the appropriate weighting. This paper concludes that unitary taxation with formulary apportionment is a viable industry-specific alternative for MNFIs.

Relevance: 10.00%

Abstract:

The community is the basic unit of urban development, and appropriate assessment tools are needed for communities to evaluate and facilitate decision making concerning sustainable community development and to reduce the detrimental effects of urban community actions on the environment. Existing research into sustainable community rating tools focuses primarily on those that are internationally recognized, describing their advantages and future challenges. However, the differences between rating tools due to different regional conditions, situations and characteristics have yet to be addressed. To address this, this paper examines three sustainable community rating tools in Australia, namely Green Star-Communities PILOT, EnviroDevelopment and the VicUrban Sustainability Charter (Master Planned Community Assessment Tool). To identify their similarities, differences and advantages, these are compared in terms of sustainability coverage, prerequisites, adaptation to locality, scoring and weighting, participation, presentation of results, and application process. The results provide the stakeholders of sustainable community development projects with a better understanding of the available rating tools in Australia and assist with evaluation and decision making.

Relevance: 10.00%

Abstract:

Aim: Determining how ecological processes vary across space is a major focus in ecology. Current methods that investigate such effects remain constrained by important limiting assumptions. Here we provide an extension to geographically weighted regression in which local regression and spatial weighting are used in combination. This method can be used to investigate non-stationarity and spatial-scale effects using any regression technique that can accommodate uneven weighting of observations, including machine learning. Innovation: We extend the use of spatial weights to generalized linear models and boosted regression trees by using simulated data for which the results are known, and compare these local approaches with existing alternatives such as geographically weighted regression (GWR). The spatial weighting procedure (1) explained up to 80% of the deviance in simulated species richness, (2) optimized the normal distribution of model residuals when applied to generalized linear models versus GWR, and (3) detected nonlinear relationships and interactions between response variables and their predictors when applied to boosted regression trees. Predictor ranking changed with spatial scale, highlighting the scales at which different species–environment relationships need to be considered. Main conclusions: GWR is useful for investigating spatially varying species–environment relationships. However, the use of local weights implemented in alternative modelling techniques can help detect nonlinear relationships and high-order interactions that were previously unassessed. Therefore, this method not only informs us how location and scale influence our perception of patterns and processes, it also offers a way to deal with different ecological interpretations that can emerge as different areas of spatial influence are considered during model fitting.
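The local-weighting idea, fitting a regression at each focal location with observations down-weighted by distance, can be sketched with a Gaussian kernel and weighted least squares. The kernel and bandwidth here are assumptions for illustration; GWR implementations typically offer several kernels and data-driven bandwidth selection:

```python
import numpy as np

def local_fit(X, y, coords, focal, bandwidth):
    """Fit a locally weighted linear regression at `focal` using
    Gaussian distance weights, via weighted least squares.
    X: 1-D predictor; coords: (n, 2) observation locations."""
    d = np.linalg.norm(coords - focal, axis=1)     # distances to focal point
    w = np.exp(-0.5 * (d / bandwidth) ** 2)        # Gaussian kernel weights
    Xd = np.column_stack([np.ones(len(X)), X])     # add intercept column
    W = np.diag(w)
    # weighted normal equations: (X'WX) beta = X'Wy
    return np.linalg.solve(Xd.T @ W @ Xd, Xd.T @ W @ y)
```

Sliding `focal` across the study area yields one coefficient vector per location; varying `bandwidth` is what exposes the spatial-scale effects discussed above, and the same weight vector `w` can be passed to any learner that accepts per-observation weights, such as boosted regression trees.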

Relevance: 10.00%

Abstract:

A new dearomatized porphyrinoid, 5,10-diiminoporphodimethene (5,10-DIPD), has been prepared by palladium-catalyzed hydrazination of 5,10-dibromo-15,20-bis(3,5-di-tert-butylphenyl)porphyrin and its nickel(II) complex, by using ethyl and 4-methoxybenzyl carbazates. The oxidative dearomatization of the porphyrin ring occurs in high yield. Further oxidation with 2,3-dichloro-5,6-dicyanobenzoquinone forms the corresponding 5,10-bis(azocarboxylates), thereby restoring the porphyrin aromaticity. The UV/visible spectra of the Ni(II) DIPDs exhibit remarkable redshifts of the lowest-energy bands to 780 nm, and differential pulse voltammetry reveals a contracted electrochemical HOMO–LUMO gap of 1.44 V. Density functional theory (DFT) was used to calculate the optimized geometries and frontier molecular orbitals of model 5,10-DIPD Ni7c and 5,10-bis(azocarboxylate) Ni8c. The conformations of the carbamate groups and the configurations of the CNZ unit were considered in conjunction with the NOESY spectra, to generate the global minimum geometry and two other structures with slightly higher energies. In the absence of solution data regarding conformations, ten possible local minimum conformations were considered for Ni8c. Partition of the porphyrin macrocycle into tri- and monopyrrole fragments in Ni7c and the inclusion of terminal conjugating functional groups generate unique frontier molecular orbital distributions and a HOMO–LUMO transition with a strong element of charge transfer from the monopyrrole ring. Time-dependent DFT calculations were performed for the three lowest-energy structures of Ni7c and Ni8c, and weighting according to their energies allowed the prediction of the electronic spectra. The calculations reproduce the lower-energy regions of the spectra and the overall forms of the spectra with high accuracy, but agreement is not as good in the Soret region below 450 nm.
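"Weighting according to their energies" is most naturally read as Boltzmann weighting of the conformer populations before combining the per-conformer TD-DFT spectra; a minimal sketch under that assumption (the abstract does not state the exact scheme or temperature):

```python
import math

def boltzmann_weights(energies_kcal, T=298.15):
    """Relative conformer energies (kcal/mol) -> Boltzmann weights.
    Assumed scheme for combining per-conformer computed spectra
    into one predicted spectrum; T = 298.15 K is an assumption."""
    R = 1.987204e-3                 # gas constant, kcal/(mol*K)
    e0 = min(energies_kcal)         # shift so the lowest energy is zero
    factors = [math.exp(-(e - e0) / (R * T)) for e in energies_kcal]
    Z = sum(factors)                # partition-function normalisation
    return [f / Z for f in factors]
```

The predicted spectrum is then the weight-averaged sum of each structure's computed transitions, so conformers only slightly above the global minimum still contribute visibly.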

Relevance: 10.00%

Abstract:

Assessing airport service performance requires understanding of a complete set of passenger experiences, covering all activities from departure to arrival. Weight-based indicator models allow passengers to express their priority for certain evaluation criteria (airport domains) and service attributes over others. The application of multilevel regression analysis in questionnaire design is expected to overcome the limitations of traditional questionnaires, which apply all indicators with equal weight. The development of a Taxonomy of Passenger Activities (TOPA), which captures all passenger processing and discretionary activities, has provided a novel perspective on understanding passenger experience in the various airport domains. Based on further literature review of service attributes at airport passenger terminals, this paper presents a questionnaire design that employs a weighting method for all activities from the time passengers enter an airport domain at the departure terminal until they leave the arrival terminal (i.e. seven airport domains for departure, four airport domains during transit, and seven airport domains for arrival). The multilevel regression analysis is aimed not only at identifying the ranking of each evaluation criterion from most to least important but also at explaining the relationship between the service attributes in each airport domain and overall service performance.

Relevance: 10.00%

Abstract:

We consider rank-based regression models for clustered data analysis. A weighted Wilcoxon rank method is proposed to take account of within-cluster correlations and varying cluster sizes. The asymptotic normality of the resulting estimators is established. A method to estimate the covariance of the estimators is also given, which can bypass estimation of the density function. Simulation studies are carried out to compare the different estimators under a number of scenarios covering the correlation structure, presence/absence of outliers, and different correlation values. The proposed methods appear to perform well; in particular, the one incorporating the correlation in the weighting achieves the highest efficiency and robustness against misspecification of the correlation structure and outliers. A real example is provided for illustration.

Relevance: 10.00%

Abstract:

Adaptations of weighted rank regression to the accelerated failure time model for censored survival data have been successful in yielding asymptotically normal estimates and flexible weighting schemes that increase statistical efficiency. However, only for one simple weighting scheme, the Gehan or Wilcoxon weights, are the estimating equations guaranteed to be monotone in the parameter components, and even in this case they are step functions, requiring the equivalent of linear programming for computation. The lack of smoothness makes standard error or covariance matrix estimation even more difficult. An induced smoothing technique has overcome these difficulties in various problems involving monotone but pure-jump estimating equations, including conventional rank regression. The present paper applies induced smoothing to Gehan-Wilcoxon weighted rank regression for the accelerated failure time model, in the more difficult case of survival time data subject to censoring, where the inapplicability of permutation arguments necessitates a new method of estimating the null variance of the estimating functions. Smooth, monotone parameter estimation and rapid, reliable standard error or covariance matrix estimation are obtained.

Relevance: 10.00%

Abstract:

Troxel, Lipsitz, and Brennan (1997, Biometrics 53, 857-869) considered parameter estimation from survey data with nonignorable nonresponse and proposed weighted estimating equations to remove the biases in the complete-case analysis that ignores missing observations. This paper suggests two alternative modifications for unbiased estimation of regression parameters when a binary outcome is potentially observed at successive time points. The weighting approach of Robins, Rotnitzky, and Zhao (1995, Journal of the American Statistical Association 90, 106-121) is also modified to obtain unbiased estimating functions. The suggested estimating functions are unbiased only when the missingness probability is correctly specified, and misspecification of the missingness model will result in biases in the estimates. Simulation studies are carried out to assess the performance of different methods when the covariate is binary or normal. For the simulation models used, the relative efficiency of the two new methods to the weighting methods is about 3.0 for the slope parameter and about 2.0 for the intercept parameter when the covariate is continuous and the missingness probability is correctly specified. All methods produce substantial biases in the estimates when the missingness model is misspecified or underspecified. Analysis of data from a medical survey illustrates the use and possible differences of these estimating functions.
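The weighting approach being modified is inverse-probability weighting: each observed outcome is up-weighted by the reciprocal of its modelled observation probability, so the complete cases stand in for the full sample. A minimal one-timepoint sketch of the principle, which is far simpler than the paper's estimating equations for repeated binary outcomes:

```python
def ipw_mean(y, observed, p_obs):
    """Inverse-probability-weighted estimate of E[Y].
    y: outcomes (use any placeholder where unobserved);
    observed: 1 if y[i] was observed, else 0;
    p_obs: modelled probability of observing y[i].
    Unbiased only when p_obs is correctly specified, which is
    exactly the sensitivity the abstract reports."""
    total = sum(o * yi / p for yi, o, p in zip(y, observed, p_obs))
    return total / len(y)
```

When the missingness model is misspecified the weights no longer cancel the selection effect, producing the biases reported in the simulations above.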

Relevance: 10.00%

Abstract:

In school environments, children are constantly exposed to mixtures of airborne substances, derived from a variety of sources, both in the classroom and in the school surroundings. It is important to evaluate the hazardous properties of these mixtures, in order to conduct risk assessments of their impact on children's health. Within this context, through the application of a Maximum Cumulative Ratio approach, this study aimed to explore whether health risks due to indoor air mixtures are driven by a single substance or are due to cumulative exposure to various substances. This methodology requires knowledge of the concentration of substances in the air mixture, together with a health-related weighting factor (i.e. reference concentration or lowest concentration of interest), which is necessary to calculate the Hazard Index. Maximum Cumulative Ratio and Hazard Index values were then used to categorise the mixtures into four groups, based on their hazard potential and, therefore, the appropriate risk management strategies. Air samples were collected from classrooms in 25 primary schools in Brisbane, Australia, and analysis was conducted based on the measured concentration of these substances in about 300 air samples. The results showed that in 92% of the schools, indoor air mixtures belonged to the ‘low concern’ group and therefore did not require any further assessment. In the remaining schools, toxicity was mainly governed by a single substance, with a very small number of schools having a multiple-substance mix which required a combined risk assessment. The proposed approach enables the identification of such schools and thus aids in the efficient health risk management of pollution emissions and air quality in the school environment.
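The quantities described above have standard definitions: the Hazard Quotient of each substance is its concentration divided by its health-related weighting factor, the Hazard Index is the sum of the quotients, and the Maximum Cumulative Ratio is the index divided by the largest single quotient. A minimal sketch (variable names are illustrative):

```python
def mcr_hi(concentrations, ref_values):
    """Hazard Index and Maximum Cumulative Ratio for a mixture.
    HQ_i = C_i / RfC_i ; HI = sum(HQ_i) ; MCR = HI / max(HQ_i).
    ref_values: reference concentration (or lowest concentration
    of interest) for each substance."""
    hq = [c / r for c, r in zip(concentrations, ref_values)]
    hi = sum(hq)
    mcr = hi / max(hq)
    return hi, mcr
```

An MCR near 1 means one substance drives the risk, while larger values indicate genuinely cumulative exposure; combined with HI, this supports the four-group categorisation used in the study.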