881 results for War, Cost of


Relevance:

90.00%

Publisher:

Abstract:

Objectives: In China, “serious road traffic crashes” (SRTCs) are those in which there are 10-30 fatalities, 50-100 serious injuries or a total cost of 50-100 million RMB ($US8-16m), and “particularly serious road traffic crashes” (PSRTCs) are those which are more severe or costly. Due to the large number of fatalities and injuries, as well as the negative public reaction they elicit, SRTCs and PSRTCs have become a major concern in China in recent years. The aim of this study is to identify the main factors contributing to these road traffic crashes and to propose preventive measures to reduce their number. Methods: 49 contributing factors to the SRTCs and PSRTCs that occurred from 2007 to 2013 were collected from the database “In-depth Investigation and Analysis System for Major Road Traffic Crashes” (IIASMRTC) and were analyzed through the integrated use of principal component analysis and hierarchical clustering to determine the primary and secondary groups of contributing factors. Results: Speeding and overloading of passengers were the primary contributing factors, featuring in up to 66.3% and 32.6% of accidents respectively. Two secondary contributing factors were road-related: lack of or nonstandard roadside safety infrastructure, and slippery roads due to rain, snow or ice. Conclusions: The current approach to SRTCs and PSRTCs focuses on the attribution of responsibility and the enforcement of regulations considered relevant to particular SRTCs and PSRTCs. It would be more effective to investigate the contributing factors and characteristics of SRTCs and PSRTCs as a whole, to provide adequate information for safety interventions in regions where SRTCs and PSRTCs are more common. In addition to mandating a driver training program and publicising the hazards associated with traffic violations, speed cameras, speed signs, road markings and vehicle-mounted GPS are suggested to reduce speeding by passenger vehicles, while more frequent checks by traffic police and passenger station staff, together with improved transportation management that increases the income of contractors and drivers, are feasible measures to prevent passenger overloading. Other promising measures include regular inspection of roadside safety infrastructure and improving skid resistance on dangerous road sections in mountainous areas.
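The grouping step described in the Methods can be reproduced on any crash-by-factor incidence matrix. The sketch below combines principal component analysis with hierarchical clustering to separate groups of contributing factors and ranks them by how often they occur; the factor names, the random incidence matrix and the two-cluster cut are illustrative assumptions, not the IIASMRTC data.

```python
import numpy as np
from sklearn.decomposition import PCA
from scipy.cluster.hierarchy import linkage, fcluster

# Hypothetical crash-by-factor incidence matrix: rows are crashes, columns
# indicate whether a contributing factor was recorded (1) or not (0).
rng = np.random.default_rng(0)
factors = ["speeding", "passenger_overloading", "roadside_infrastructure", "slippery_road"]
X = rng.integers(0, 2, size=(200, len(factors)))  # placeholder for real crash records

# Reduce the factor space with PCA: one point per factor in component space.
pca = PCA(n_components=2)
scores = pca.fit_transform(X.T)

# Hierarchically cluster the factors on their component scores.
Z = linkage(scores, method="ward")
groups = fcluster(Z, t=2, criterion="maxclust")

# Rank factors by how often they appear, to separate primary from secondary groups.
prevalence = X.mean(axis=0)
for name, grp, prev in sorted(zip(factors, groups, prevalence), key=lambda r: -r[2]):
    print(f"{name:25s} cluster={grp} share_of_crashes={prev:.1%}")
```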

Relevance:

90.00%

Publisher:

Abstract:

We present a generic method/model for multi-objective design optimization of laminated composite components, based on the vector evaluated particle swarm optimization (VEPSO) algorithm. VEPSO is a novel, co-evolutionary multi-objective variant of the popular particle swarm optimization (PSO) algorithm. In the current work a modified version of the VEPSO algorithm for discrete variables has been developed and implemented successfully for the multi-objective design optimization of composites. The problem is formulated with the multiple objectives of minimizing the weight and the total cost of the composite component to achieve a specified strength. The primary optimization variables are the number of layers, their stacking sequence (the orientation of the layers) and the thickness of each layer. Classical lamination theory is utilized to determine the stresses in the component, and the design is evaluated based on three failure criteria: the failure mechanism based failure criterion, the maximum stress failure criterion and the Tsai-Wu failure criterion. The optimization method is validated for a number of different loading configurations: uniaxial, biaxial and bending loads. The design optimization has been carried out both for variable stacking sequences and for fixed standard stacking schemes, and a comparative study of the different design configurations evolved is presented. (C) 2007 Elsevier Ltd. All rights reserved.
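To illustrate the co-evolutionary idea behind VEPSO, rather than the paper's exact formulation, the sketch below runs two discrete-variable PSO swarms, one per objective (laminate weight and material cost), and lets each swarm be steered by the best particle of the other swarm. The ply thicknesses, cost and density constants and the minimum-thickness strength check are invented placeholders.

```python
import numpy as np

rng = np.random.default_rng(1)

# Discrete design vector: [extra ply count, ply thickness index, orientation index]
LEVELS = np.array([16, 5, 4])                          # assumed discrete levels per variable
THICKNESS = np.array([0.1, 0.125, 0.15, 0.2, 0.25])    # mm, illustrative
COST_PER_MM = 4.0                                      # assumed cost per mm of laminate
DENSITY = 1.6e-3                                       # g/mm^3, illustrative

def decode(x):
    n_plies = int(x[0]) + 4                            # at least 4 plies
    return n_plies, THICKNESS[int(x[1])]

def weight(x):                                         # objective 1: mass per unit area
    n, t = decode(x)
    return DENSITY * n * t

def cost(x):                                           # objective 2: cost per unit area
    n, t = decode(x)
    return COST_PER_MM * n * t

def feasible(x):                                       # placeholder strength constraint
    n, t = decode(x)
    return n * t >= 1.5                                # assumed minimum total thickness

def pso_step(pos, vel, pbest, guide, obj):
    w, c1, c2 = 0.7, 1.5, 1.5
    vel = w * vel + c1 * rng.random(pos.shape) * (pbest - pos) \
                  + c2 * rng.random(pos.shape) * (guide - pos)
    pos = np.clip(np.rint(pos + vel), 0, LEVELS - 1)   # discrete update by rounding
    better = np.array([obj(p) < obj(b) if feasible(p) else False
                       for p, b in zip(pos, pbest)])
    pbest[better] = pos[better]
    return pos, vel, pbest

swarms = [rng.integers(0, LEVELS, size=(20, 3)).astype(float) for _ in range(2)]
vels = [np.zeros((20, 3)) for _ in range(2)]
pbests = [s.copy() for s in swarms]
objs = [weight, cost]

for _ in range(100):
    # In VEPSO each swarm is guided by the best particle of the *other* swarm.
    guides = [min(pbests[1], key=cost), min(pbests[0], key=weight)]
    for i in range(2):
        swarms[i], vels[i], pbests[i] = pso_step(swarms[i], vels[i], pbests[i], guides[i], objs[i])

print("best-for-weight design:", decode(min(pbests[0], key=weight)))
print("best-for-cost design:  ", decode(min(pbests[1], key=cost)))
```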

Relevance:

90.00%

Publisher:

Abstract:

Despite the potential harm to patients (and others) and the financial cost of providing futile treatment at the end of life, this practice occurs. This article reports on empirical research undertaken in Queensland that explores doctors' perceptions of the law that governs futile treatment at the end of life and the role it plays in medical practice. The findings reveal that doctors have poor knowledge of their legal obligations and powers when making decisions about withholding or withdrawing futile treatment at the end of life; that their attitudes towards the law were largely negative; and that the law affected their clinical practice and had caused, or would cause, them to provide futile treatment.

Relevance:

90.00%

Publisher:

Abstract:

Social work in health care has been established for more than 100 years and is one of the largest areas of practice for social workers. Over time, demographic change and growth in the aging population, increased longevity, an explosion in rates of chronic illness and the rapidly increasing cost of health care have created serious challenges for acute hospitals and health social workers. This article reviews the Australian health care system and its policies, with particular emphasis on the public hospital system. It then examines current hospital social work roles, including the continuing role in discharge planning and expanding responsibility for emerging client problems such as patient complexity, legal issues and carer issues. The article concludes with a discussion of evolving issues and challenges facing health social work to ensure that social work remains relevant within this practice context.

Relevance:

90.00%

Publisher:

Abstract:

The study seeks to find out whether the real burden of personal taxation has increased or decreased. In order to determine this, we investigate how the same real income has been taxed in different years. Whenever the taxes on the same real income are higher in a given year than in the base year, the real tax burden has increased; if they are lower, the real tax burden has decreased. The study thus seeks to estimate how changes in the tax regulations affect the real tax burden. It should be kept in mind that the progression in the central government income tax schedule means that a real change in income will bring about a change in the tax ratio, and that inflation will also increase the real tax burden if the tax schedules are kept nominally unchanged. In the calculations of the study the real income is assumed to remain constant, so that we obtain an unbiased measure of the effects of governmental actions in real terms. The main factors influencing the amount of income tax an individual must pay are as follows:
- gross income (income subject to central and local government taxes);
- deductions from gross income and from taxes calculated according to the tax schedules;
- the central government income tax schedule (progressive income taxation);
- the rates for local taxes and for social security payments (proportional taxation).
We investigate how much a certain group of taxpayers would have paid in taxes according to the actual tax regulations prevailing in different years if their income had been kept constant in real terms. Other factors affecting tax liability are kept strictly unchanged (as constants). The resulting taxes, expressed in fixed prices, are then compared to the taxes levied in the base year (hypothetical taxation). The question we are addressing is thus how much tax a certain group of taxpayers with the same socioeconomic characteristics would have paid on the same real income under the actual tax regulations prevailing in different years. This has been suggested as the main way to measure real changes in taxation, although there are several alternative measures with essentially the same aim. Next, an aggregate indicator of changes in income tax rates is constructed, designed to show how much the taxation of income has increased or decreased, on average, from one year to the next. The main question is how the aggregation over all income levels should be performed. In order to determine the average real changes in the tax scales, the difference functions (differences between the actual and hypothetical taxation functions) were aggregated using taxable income as weights. Besides the difference functions, the relative changes in real taxes can be used as indicators of change. In this case the ratio between the taxes computed according to the new and the old situation indicates whether taxation has become heavier or lighter. The relative changes in tax scales can be described in a way similar to that used in describing the cost of living, that is, by means of price indices. For example, we can use Laspeyres' price index formula to compute the ratio between taxes determined by the new tax scales and the old tax scales. The formula answers the question: how much more or less will be paid in taxes according to the new tax scales than according to the old ones, when the real income situation corresponds to the old situation? In real terms the central government tax burden experienced a steady decline from its high post-war level until the mid-1950s. The real tax burden then drifted upwards until the mid-1970s; the real level of taxation in 1975 was twice that of 1961. In the 1980s there was a stable phase owing to the inflation corrections of the tax schedules. In 1989 the tax schedule was cut drastically, and from the mid-1990s changes in the tax schedules have decreased the real tax burden significantly. Local tax rates have risen continuously, from 10 percent in 1948 to nearly 19 percent in 2008. Deductions have lowered the real tax burden, especially in recent years. The aggregate figures indicate how the tax ratio for the same real income has changed over the years according to the prevailing tax regulations. We call the tax ratio calculated in this manner the real income tax ratio; a change in the real income tax ratio depicts an increase or decrease in the real tax burden. The real income tax ratio declined after the war for some years, nearly doubled between the beginning of the 1960s and the mid-1970s, and has fallen by about 35% since the mid-1990s.
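The aggregation described above can be stated compactly: with T_new(y) denoting taxes under a later year's schedule and T_old(y) taxes under the base-year schedule, both evaluated at the same real incomes, the Laspeyres-type index is the ratio of the two tax totals, and a value above one means the real tax burden has increased. The sketch below uses invented progressive schedules and incomes purely to show the computation.

```python
# Minimal sketch of the hypothetical-taxation comparison: the same real incomes are
# run through the base-year schedule and a later year's schedule (both in base-year
# prices), and the Laspeyres-type ratio of total taxes is reported.
# The schedules and incomes below are invented for illustration only.

def tax(income, schedule):
    """Progressive tax from a schedule given as (bracket lower bound, marginal rate) pairs."""
    due = 0.0
    for bound, rate in sorted(schedule, reverse=True):
        if income > bound:
            due += (income - bound) * rate
            income = bound
    return due

base_year = [(0, 0.10), (20_000, 0.25), (50_000, 0.40)]   # hypothetical base-year schedule
later_year = [(0, 0.12), (22_000, 0.27), (55_000, 0.42)]  # hypothetical later schedule, base-year prices

real_incomes = [15_000, 30_000, 45_000, 80_000]           # same real incomes in both years

actual = sum(tax(y, later_year) for y in real_incomes)
hypothetical = sum(tax(y, base_year) for y in real_incomes)
print(f"Laspeyres-type tax index: {actual / hypothetical:.3f}")  # > 1: real tax burden increased
```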

Relevance:

90.00%

Publisher:

Abstract:

This paper reports a measurement of the cross section for the pair production of top quarks in ppbar collisions at sqrt(s) = 1.96 TeV at the Fermilab Tevatron. The data were collected with the CDF II detector in a set of runs with a total integrated luminosity of 1.1 fb^{-1}. The cross section is measured in the dilepton channel, the subset of ttbar events in which both top quarks decay through t -> Wb -> l nu b, where l = e, mu, or tau. The lepton pair is reconstructed as one identified electron or muon and one isolated track. The use of an isolated track to identify the second lepton increases the ttbar acceptance, particularly for the case in which one W decays as W -> tau nu. The purity of the sample may be further improved, at the cost of a reduction in the number of signal events, by requiring an identified b-jet. We present the results of measurements performed with and without the requirement of an identified b-jet; the former is the first published CDF result for which a b-jet requirement is added to the dilepton selection. In the CDF data there are 129 pretag lepton + track candidate events, of which 69 are tagged. Using the tagging information, the sample is divided into tagged and untagged sub-samples, and a combined cross section is calculated by maximizing a likelihood. The result is sigma_{ttbar} = 9.6 +/- 1.2 (stat.) +0.6/-0.5 (sys.) +/- 0.6 (lum.) pb, assuming a branching ratio of BR(W -> l nu) = 10.8% and a top mass of m_t = 175 GeV/c^2.
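The combined extraction can be illustrated with a toy two-subsample likelihood: each subsample contributes a Poisson term whose expectation is the cross section times acceptance times integrated luminosity plus background, and the cross section is obtained by maximising the product (here, by minimising the negative log-likelihood). The event counts echo the 69 tagged of 129 pretag events quoted above, but the acceptances and backgrounds are invented placeholders, not the CDF inputs.

```python
import numpy as np
from scipy.optimize import minimize_scalar

lumi = 1.1e3  # integrated luminosity in pb^-1

# Toy inputs per sub-sample: observed events, signal acceptance x efficiency, expected background.
# Acceptances and backgrounds are illustrative only.
subsamples = [
    {"n_obs": 69, "acc": 0.004, "bkg": 12.0},   # tagged
    {"n_obs": 60, "acc": 0.005, "bkg": 25.0},   # untagged (129 pretag minus 69 tagged)
]

def nll(sigma):
    """Negative Poisson log-likelihood of the cross section, summed over sub-samples."""
    total = 0.0
    for s in subsamples:
        mu = sigma * s["acc"] * lumi + s["bkg"]   # expected number of events
        total -= s["n_obs"] * np.log(mu) - mu     # log-likelihood up to a constant
    return total

fit = minimize_scalar(nll, bounds=(0.1, 50.0), method="bounded")
print(f"fitted cross section: {fit.x:.1f} pb")
```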

Relevance:

90.00%

Publisher:

Abstract:

ERP system implementations have evolved so rapidly that they now represent a must-have within industries; ERP systems are viewed as a cost of doing business. Yet the research that adopts a resource-based view of the business value of ERP systems concludes that companies may gain competitive advantage when they successfully manage their ERP projects, carefully reengineer the organization, and use the system in line with organizational strategies. This thesis contributes to the literature on ERP business value by examining the key drivers of ERP business value in organizations. The first research paper investigates how ERP systems with different degrees of system functionality correlate with the development of business performance after the completion of the ERP projects. The companies with better perceived system functionality obtained efficiency benefits in the first two years of post-implementation; however, in the third year there was no significant difference in efficiency benefits between successfully and less successfully managed ERP projects. The second research paper examines what business process changes occur in companies implementing ERP for different motivations and how these changes affect business performance. The findings show that companies reported process changes mainly in terms of workflow changes. In addition, companies with a business-led motivation focused more on the average cost of each increase in the input unit, while companies with a technology-led motivation focused more on the benefits coming from the fit of the system with the organizational processes. The third research paper considers the role of alignment between ERP and business strategies in the realization of business value from ERP use. The findings show that strategic alignment and business process changes are significantly correlated with the perceived benefits of ERP at three levels: internal efficiency, customers and financial. Overall, by combining quantitative and qualitative research methods, this thesis puts forward a model that illustrates how successfully managed ERP projects, aligned with the business strategy, have automating and informating effects on processes that ultimately improve customer service and reduce companies' costs.

Relevance:

90.00%

Publisher:

Abstract:

The purpose of this study was to deepen our knowledge of the combined use of estramustine and radiotherapy in the treatment of prostate cancer. Prostate cancer is a common disease, with a high variability between subjects in its malignant potential. In many cases, the disease is an incidental finding with little or no clinical significance. In other cases, however, prostate cancer may be an aggressive malignant disease, which, if the initial treatment fails, lacks an effective cure and may lead to severe symptoms, metastasis, and death despite all treatment. In many cases, the methods of treatment available at the moment provide cure or significant regression of symptoms, but often at the cost of considerable side effects. Estramustine, a cytostatic drug used for treating advanced cancer of the prostate, has been shown to inhibit prostate cancer progression and also to increase the sensitivity of cancer cells to radiotherapy. The goals of this study were, first, to find out whether it is possible to use either estramustine or an antibody against estramustine binding protein as carrier molecules for bringing therapeutic radioisotopes into prostate cancer cells, and, secondly, to gain more understanding of the mechanisms behind the known radiosensitising effect of estramustine. Estramustine and estramustine binding protein antibody were labelled with iodine-125 to study the biodistribution of these substances in mice. In the first experiment, both of the substances accumulated in the prostate, but radioiodinated estramustine also showed affinity to the liver and the lungs. Since the radiolabelled antibody was found out to accumulate more selectively to the prostate, we studied its biodistribution in nude mice with DU-145 human prostate cancer implants. In this experiment, the prostate and the tumour accumulated more radioactivity than other organs, but we concluded that the difference in the dose of radiation compared to other organs was not sufficient for the radioiodinated antibody to be advocated as a carrier molecule for treating prostate cancer. Mice with similar DU-145 prostate cancer implants were then treated with estramustine and external beam irradiation, with and without neoadjuvant estramustine treatment. The tumours responded to the treatment as expected, showing the radiation potentiating effect of estramustine. In the third experiment, this effect was found without an increase in the amount of apoptosis in the tumour cells, despite previous suggestions to the contrary. In the fourth experiment, we gave a similar treatment to the mice with DU-145 tumours. A reduction in proliferation was found in the groups treated with radiotherapy, and an increased amount of tumour hypoxia and tumour necrosis in the group treated with both neoadjuvant estramustine and radiation. This finding is contradictory to the suggestion that the radiation sensitising effect of estramustine could be attributed to its angiogenic activity.

Relevance:

90.00%

Publisher:

Abstract:

In this paper, we present a generic method/model for multi-objective design optimization of laminated composite components, based on the Vector Evaluated Artificial Bee Colony (VEABC) algorithm. VEABC is a parallel, vector evaluated, swarm intelligence multi-objective variant of the Artificial Bee Colony (ABC) algorithm. In the current work a modified version of the VEABC algorithm for discrete variables has been developed and implemented successfully for the multi-objective design optimization of composites. The problem is formulated with the multiple objectives of minimizing the weight and the total cost of the composite component to achieve a specified strength. The primary optimization variables are the number of layers, their stacking sequence (the orientation of the layers) and the thickness of each layer. Classical lamination theory is utilized to determine the stresses in the component, and the design is evaluated based on three failure criteria: the failure mechanism based failure criterion, the maximum stress failure criterion and the Tsai-Wu failure criterion. The optimization method is validated for a number of different loading configurations: uniaxial, biaxial and bending loads. The design optimization has been carried out both for variable stacking sequences and for fixed standard stacking schemes, and a comparative study of the different design configurations evolved is presented. Finally, the performance is evaluated in comparison with other nature-inspired techniques, including Particle Swarm Optimization (PSO), Artificial Immune System (AIS) and the Genetic Algorithm (GA). The performance of ABC is on a par with that of PSO, AIS and GA for all loading configurations. (C) 2009 Elsevier B.V. All rights reserved.
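Independently of the search algorithm, the design-evaluation step named above (classical lamination theory plus the Tsai-Wu criterion) can be sketched for in-plane loading as follows. The ply elastic constants, strengths, stacking sequence and load vector are illustrative assumptions; bending and the other two failure criteria are omitted for brevity.

```python
import numpy as np

# Illustrative unidirectional ply properties and strengths (MPa); not the paper's values.
E1, E2, G12, nu12 = 140e3, 10e3, 5e3, 0.3
Xt, Xc, Yt, Yc, S = 1500.0, 1200.0, 50.0, 200.0, 70.0

def t_sigma(theta):
    """Stress transformation matrix from laminate (x, y) axes to ply (1, 2) axes."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c*c, s*s, 2*c*s],
                     [s*s, c*c, -2*c*s],
                     [-c*s, c*s, c*c - s*s]])

def q_bar(theta):
    """Transformed reduced stiffness of a ply rotated by theta (classical lamination theory)."""
    nu21 = nu12 * E2 / E1
    d = 1 - nu12 * nu21
    Q = np.array([[E1 / d, nu12 * E2 / d, 0.0],
                  [nu12 * E2 / d, E2 / d, 0.0],
                  [0.0, 0.0, G12]])
    Tinv = np.linalg.inv(t_sigma(theta))
    return Tinv @ Q @ Tinv.T

def tsai_wu(s1, s2, t12):
    """Tsai-Wu failure index; values below 1 mean the ply is predicted not to fail."""
    F1, F2 = 1/Xt - 1/Xc, 1/Yt - 1/Yc
    F11, F22, F66 = 1/(Xt*Xc), 1/(Yt*Yc), 1/S**2
    F12 = -0.5 * np.sqrt(F11 * F22)
    return F1*s1 + F2*s2 + F11*s1**2 + F22*s2**2 + F66*t12**2 + 2*F12*s1*s2

def evaluate(layup, t_ply, N):
    """Per-ply Tsai-Wu index for a laminate under in-plane loads N (N/mm)."""
    A = sum(q_bar(th) * t_ply for th in layup)     # in-plane stiffness matrix
    eps = np.linalg.solve(A, N)                    # mid-plane strains
    return [tsai_wu(*(t_sigma(th) @ (q_bar(th) @ eps))) for th in layup]

layup = np.deg2rad([0, 45, -45, 90])               # assumed stacking sequence
print(evaluate(layup, t_ply=0.125, N=np.array([200.0, 50.0, 0.0])))
```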

Relevance:

90.00%

Publisher:

Abstract:

Stroke is a major cause of death and disability, incurs significant costs to healthcare systems, and inflicts a severe burden on society as a whole. Stroke care in Finland was described in several population-based studies between 1967 and 1998, but not since. In the PERFECT Stroke study presented here, a system for monitoring the Performance, Effectiveness, and Costs of Treatment episodes in Stroke was developed in Finland. Existing nationwide administrative registries were linked at the individual patient level using personal identification numbers to depict whole episodes of care, from acute stroke, through rehabilitation, until the patients went home, were admitted to permanent institutional care, or died. For comparisons over time and between providers, patient case-mix was adjusted for. The PERFECT Stroke database includes 104 899 first-ever stroke patients over the years 1999 to 2008, of whom 79% had ischemic stroke (IS), 14% intracerebral hemorrhage (ICH), and 7% subarachnoid hemorrhage (SAH). An 18% decrease in the age- and sex-adjusted incidence of stroke was observed over the study period, or 1.8% improvement annually. The all-cause 1-year case-fatality rate improved from 28.6% to 24.6%, or 0.5% annually. The expected median lifetime after stroke increased by 2 years for IS patients, to 7 years and 7 months, and by 1 year for ICH patients, to 4 years and 5 months. No change could be seen in median SAH patient survival, which remained over 10 years. Stroke prevalence was 82 000, or 1.5% of the total population of Finland, in 2008. Modern stroke center care was shown to be associated with a decrease in both death and the risk of institutional care for stroke patients. The number needed to treat to prevent these poor outcomes at one year from stroke was 32 (95% confidence interval 26 to 42). Despite improvements over the study period, more than a third of Finnish stroke patients did not have access to stroke center care. The mean first-year healthcare cost of a stroke patient was about €20 000, and among survivors about €10 000 annually thereafter. Only part of this cost was incurred by stroke, as the same patients cost about €5 000 over the year prior to stroke. Total lifetime costs after first-ever stroke were about €85 000. A total of €1.1 billion, 7% of all healthcare expenditure, is spent on the treatment of stroke patients annually. Despite a rapidly aging population, the number of new stroke patients is decreasing, and patients are more likely to survive. This is explained in part by stroke center care, which is effective and should be made available to all stroke patients. In a suitable setting with high-quality administrative registries and a common identifier, it is possible to avoid the huge workload and associated costs of setting up a conventional stroke registry, and still acquire a fairly comprehensive dataset on stroke care and outcomes.
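The number needed to treat quoted above follows from the absolute risk reduction between the two care settings: NNT = 1 / (risk without stroke center care - risk with stroke center care). The one-year risks in the sketch are assumed values chosen only to yield an NNT near 32, not figures reported by the study.

```python
# Illustrative absolute-risk-reduction arithmetic behind a number needed to treat (NNT).
# The two one-year risks below are assumed for demonstration, not study results.
risk_other_care = 0.400     # death or institutional care at one year, non-stroke-center care
risk_stroke_center = 0.369  # the same outcome with stroke center care

arr = risk_other_care - risk_stroke_center   # absolute risk reduction
nnt = 1 / arr
print(f"ARR = {arr:.3f}, NNT = {nnt:.0f}")   # about 32 with these assumed risks
```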

Relevance:

90.00%

Publisher:

Abstract:

A linear optimization model was used to calculate seven wood procurement scenarios for the years 1990, 2000 and 2010. Productivity and cost functions for seven cutting methods, five terrain transport methods, three long-distance transport methods and various work supervision and scaling methods were calculated from available work study reports. All methods are based on the Nordic cut-to-length system. Finland was divided into three parts to describe harvesting conditions. Twenty imaginary wood processing points and their wood procurement areas were created for these areas. The procurement systems, which consist of the harvesting conditions and work productivity functions, were described as a simulation model. In the LP model the wood procurement system has to fulfil the volume and wood assortment requirements of the processing points while minimizing procurement cost. The model consists of 862 variables and 560 restrictions. The results show that it is economical to increase the share of mechanised work in harvesting. Cost increment alternatives have only a small effect on the profitability of manual work. The areas of later thinnings and of seed tree and shelterwood cuttings increase at the expense of first thinnings. In mechanised work one method, a 10-tonne single-grip harvester with a forwarder, is gaining an advantage over the other methods. Forwarder working hours are decreasing, in contrast to those of the harvester. There is only little need to increase the number of harvesters and trucks, or their drivers, from today's level. Quite large fluctuations in the level of procurement and cost can be handled with a constant number of machines, by varying the number of seasonal workers and by running the machines in two shifts. This is possible if some environmental problems of large-scale summertime harvesting can be solved.
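A toy version of such a cost-minimising procurement model can be written as a linear program: the decision variables are the volumes delivered from each supply region to each processing point, the objective is total procurement cost, and the constraints are regional supply and mill demand. The regions, unit costs and volumes below are invented; the actual model has 862 variables and 560 restrictions.

```python
import numpy as np
from scipy.optimize import linprog

# Toy instance: 3 supply regions x 2 mills. Unit procurement costs (per m^3) combine
# cutting, terrain transport and long-distance transport; all figures are invented.
cost = np.array([[18.0, 24.0],
                 [21.0, 19.0],
                 [26.0, 17.0]])
supply = np.array([40_000, 55_000, 30_000])   # m^3 available per region
demand = np.array([50_000, 60_000])           # m^3 required per mill

n_regions, n_mills = cost.shape
c = cost.ravel()                               # one variable per (region, mill) pair

# Supply constraints: volume shipped out of each region cannot exceed its supply.
A_ub = np.zeros((n_regions, n_regions * n_mills))
for r in range(n_regions):
    A_ub[r, r * n_mills:(r + 1) * n_mills] = 1.0

# Demand constraints: each mill must receive exactly its required volume.
A_eq = np.zeros((n_mills, n_regions * n_mills))
for m in range(n_mills):
    A_eq[m, m::n_mills] = 1.0

res = linprog(c, A_ub=A_ub, b_ub=supply, A_eq=A_eq, b_eq=demand, bounds=(0, None))
print("minimum procurement cost:", round(res.fun))
print("shipment plan (m^3):\n", res.x.reshape(n_regions, n_mills).round())
```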

Relevance:

90.00%

Publisher:

Abstract:

In this paper, the design basis of the conventional Khadi and Village Industries Commission biogas plants has been elucidated. It has been shown that minimisation of the cost of the gas holder alone leads to the narrow and deep digesters of conventional plants. If instead, the total capital cost of the gas holder plus digester is minimised, the optimisation leads to wide and shallow digesters, which are less expensive. To test this alternative, two prototype plants have been designed, constructed and operated. These plants are not only 25–40% cheaper, but their performance is actually slightly better than the conventional plants.
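The geometric argument can be illustrated numerically: for a fixed digester volume, compare the diameter that minimises the gas holder cost alone with the diameter that minimises the gas holder plus digester cost. The cost expressions below are crude stand-ins (gas holder cost proportional to the cross-sectional area, digester cost proportional to the wall and floor area) and are assumptions, not the KVIC design formulas.

```python
import numpy as np

# Fixed digester volume and illustrative unit costs (currency units per m^2).
V = 10.0          # m^3, assumed digester volume
c_holder = 120.0  # gas holder cost per m^2 of cross-section
c_wall = 35.0     # masonry cost per m^2 of digester wall and floor

diameters = np.linspace(0.5, 5.0, 500)
depth = V / (np.pi * diameters**2 / 4)               # depth that keeps the volume fixed

holder_cost = c_holder * np.pi * diameters**2 / 4    # grows with diameter
digester_cost = c_wall * (np.pi * diameters * depth + np.pi * diameters**2 / 4)

d_holder_only = diameters[np.argmin(holder_cost)]
d_total = diameters[np.argmin(holder_cost + digester_cost)]
print(f"minimising gas holder cost alone -> diameter {d_holder_only:.2f} m (narrow, deep)")
print(f"minimising total capital cost    -> diameter {d_total:.2f} m (wider, shallower)")
```

With these assumed unit costs the gas-holder-only objective is minimised at the narrowest (deepest) digester in the search range, while the total-cost objective is minimised at a wider, shallower geometry, which is the qualitative point made above.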

Relevance:

90.00%

Publisher:

Abstract:

A new algorithm based on a signal subspace approach is proposed for localizing a sound source in shallow water. In the first instance we assume an ideal channel with plane parallel boundaries and known reflection properties. The sound source is assumed to emit a broadband stationary stochastic signal. The algorithm takes into account the spatial distribution of all images and the reflection characteristics of the sea bottom. It is shown that both the range and the depth of a source can be measured accurately with the help of a vertical array of sensors. For good results the number of sensors should be greater than the number of significant images; localization is possible even with a smaller array, but at the cost of higher side lobes. Next, we allow the channel to be stochastically perturbed, which results in random phase errors in the reflection coefficients. The most significant effect of the phase errors is to introduce into the spectral matrix an extra term which may be regarded as signal-generated coloured noise. It is shown through computer simulations that the signal peak height is reduced considerably as a consequence of the random phase errors.
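A narrowband, single-image toy version of the subspace idea is sketched below: snapshots from a vertical array are used to form the spectral matrix, its noise subspace is extracted by eigendecomposition, and a range-depth grid is scanned with replica vectors built from the direct path and one surface-reflected image. The geometry, frequency, noise level and the single-image channel are illustrative assumptions; the paper treats a broadband source and the full set of surface and bottom images.

```python
import numpy as np

c, f = 1500.0, 200.0                  # sound speed (m/s) and frequency (Hz), illustrative
k = 2 * np.pi * f / c
sensors_z = np.arange(10, 101, 10.0)  # vertical array depths (m)
rng = np.random.default_rng(3)

def replica(r, z):
    """Normalized pressure at the array for a source at range r, depth z (direct + surface image)."""
    d_direct = np.sqrt(r**2 + (sensors_z - z) ** 2)
    d_image = np.sqrt(r**2 + (sensors_z + z) ** 2)   # mirror image above the pressure-release surface
    p = np.exp(1j * k * d_direct) / d_direct - np.exp(1j * k * d_image) / d_image
    return p / np.linalg.norm(p)

true_r, true_z = 800.0, 40.0
a = replica(true_r, true_z)

# Spectral (covariance) matrix from noisy snapshots of a stochastic source.
snapshots = 200
X = np.outer(a, np.ones(snapshots)) * rng.normal(size=snapshots)
X = X + 0.1 * (rng.normal(size=X.shape) + 1j * rng.normal(size=X.shape))
R = X @ X.conj().T / snapshots

# Noise subspace: eigenvectors beyond the single signal eigenvector.
w, V = np.linalg.eigh(R)
En = V[:, :-1]                        # eigh returns eigenvalues in ascending order

ranges = np.arange(500, 1101, 10.0)
depths = np.arange(10, 91, 2.0)
P = np.zeros((depths.size, ranges.size))
for i, z in enumerate(depths):
    for j, r in enumerate(ranges):
        v = replica(r, z)
        P[i, j] = 1.0 / np.linalg.norm(En.conj().T @ v) ** 2   # subspace (MUSIC-type) spectrum

i, j = np.unravel_index(P.argmax(), P.shape)
print(f"estimated depth {depths[i]:.0f} m, range {ranges[j]:.0f} m (true: {true_z:.0f} m, {true_r:.0f} m)")
```

Whether the grid peak coincides with the true position depends on the assumed geometry and signal-to-noise ratio; the sketch only shows the mechanics of projecting replica vectors onto the noise subspace.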

Relevance:

90.00%

Publisher:

Abstract:

This Master’s thesis is qualitative research based on interviews with 15 Chinese immigrants to Finland, aiming to provide a sociological perspective on the migration experience through the eyes of Chinese immigrants in the Finnish social welfare context. The research focuses on four crucial aspects of life in the settlement process: housing, employment, access to health care, and child care. Inspired by Allardt’s theoretical framework ‘Having, Loving and Being’, social relationships and individual satisfaction are examined as the Chinese interviewees deal with these four life aspects. Finland was not perceived as an attractive migration destination by most Chinese interviewees at the beginning. However, with longer residence in Finland, the Finnish social welfare system gradually became a crucial appealing factor in their permanent settlement in Finland. Meanwhile, the social responsibility of caring for their elderly parents in China, strong feelings of isolation in Finland, and insufficient integration into Finnish society were influential factors in decisions to return to China. Social relationships with personal friends, migration brokers, schools, employers and family relatives had a great influence on the four life aspects of Chinese immigrants in Finland. The relationship with the Finnish social welfare sector is supportive of Chinese immigrants, but Chinese immigrants do not rely heavily on Finnish social protection. Housing conditions improved greatly over time, while upward mobility in the Finnish labour market was not significant among Chinese immigrants. All Chinese immigrants were satisfied with their current housing at the time of the interviews, while most of them felt alienated in the Finnish labour market, which seriously hindered their integration into Finnish society. In general, Chinese immigrants were satisfied with the low cost of accessing Finnish public health care services, affordable Finnish child day care services, and the financial subsidies for children from the Finnish social welfare sector. This research also suggests that employment is the central basis of well-being. Support from the Finnish social welfare sector can improve satisfaction levels among immigrants, especially when it mitigates the effects of low-paid employment. My empirical study of Chinese immigrants in Finland also shows that Having (needs for material resources), Loving (needs for social relations) and Being (needs for social integration) are all involved in the four concrete aspects (housing, employment, access to health care and child care).

Relevance:

90.00%

Publisher:

Abstract:

For a population made up of individuals capable of both sexual and asexual modes of reproduction, conditions for the spread of a transposable element are explored using a one-locus, two-haplotype model. The analysis is then extended to include the possibility that the transposable element can modulate the probability of sexual reproduction, thus casting Hickey’s (1982, Genetics 101: 519–531) suggestion in a population genetics framework. The model explicitly includes the cost of sexual reproduction, the fitness disadvantage associated with the transposable element, the probability of transposition, and the predisposition for sexual reproduction in the presence and absence of the transposable element. The model predicts several kinds of outcome, including initial-frequency dependence and stable polymorphism. More importantly, it is seen that for a wide range of parameter values the transposable element can go to fixation, and it is therefore able to convert the population from a predominantly asexual to a predominantly sexual mode of reproduction. Viewed in conjunction with recent results implicating short stretches of apparently non-coding DNA in sex determination (McCoubrey et al. 1988, Science 242: 1146–1151), the model hints at the important role this mechanism could have played in the evolution of sexuality.
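A toy deterministic recursion conveys the flavour of such a model: carriers of the element pay a fitness cost, sexual reproduction carries its own cost, the element raises its carrier's probability of reproducing sexually, and transposition biases transmission in mixed matings. The specific recursion and all parameter values below are assumptions for illustration, not the equations of the paper.

```python
# Toy recursion for the spread of a transposable element in a partly sexual population.
# p is the frequency of element carriers; all parameters are illustrative.
s = 0.02          # fitness disadvantage of carrying the element
c = 0.10          # cost of sexual reproduction
beta_plus = 0.8   # probability that a carrier reproduces sexually (element promotes sex)
beta_minus = 0.2  # probability that a non-carrier reproduces sexually
u = 0.9           # transposition probability into an element-free gamete in mixed matings

def step(p):
    # Selection against carriers.
    p = p * (1 - s) / (p * (1 - s) + (1 - p))
    # Asexual offspring keep the parental type; sexual offspring pay the cost of sex,
    # and mixed (carrier x non-carrier) matings yield carrier offspring with
    # probability 1/2 + u/2 because of transposition.
    asex_carrier = p * (1 - beta_plus)
    asex_free = (1 - p) * (1 - beta_minus)
    sex_pool_carrier = p * beta_plus * (1 - c)
    sex_pool_free = (1 - p) * beta_minus * (1 - c)
    sex_total = sex_pool_carrier + sex_pool_free
    if sex_total > 0:
        q = sex_pool_carrier / sex_total     # carrier frequency among sexual parents
        carrier_from_sex = sex_total * (q * q + 2 * q * (1 - q) * (0.5 + u / 2))
    else:
        carrier_from_sex = 0.0
    total = asex_carrier + asex_free + sex_total
    return (asex_carrier + carrier_from_sex) / total

p = 0.01
for generation in range(500):
    p = step(p)
print(f"carrier frequency after 500 generations: {p:.3f}")
```

With these assumed parameters the transmission bias outweighs the fitness and sex costs, so the element rises from a low initial frequency, which mirrors the fixation outcome described above; other parameter choices would give loss or polymorphism instead.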