967 results for time management


Relevance: 30.00%

Abstract:

Technological development of fast multi-sectional helical computed tomography (CT) scanners has enabled the use of CT perfusion (CTp) and CT angiography (CTA) in the evaluation of acute ischemic stroke. This study focuses on new multidetector CT techniques, namely whole-brain and first-pass CT perfusion plus CTA of the carotid arteries. Whole-brain CTp data are acquired during slow infusion of contrast material to achieve a constant contrast concentration in the cerebral vasculature. From these data, quantitative maps of perfused cerebral blood volume (pCBV) are constructed. The probability curve of cerebral infarction as a function of normalized pCBV was determined in patients with acute ischemic stroke. Normalized pCBV, expressed as a percentage of contralateral normal brain pCBV, was determined in the infarction core and in regions just inside and outside the boundary between infarcted and noninfarcted brain. Corresponding probabilities of infarction were 0.99, 0.96, and 0.11; R² was 0.73; and differences in perfusion between the core and the inner and outer bands were highly significant. Thus a probability-of-infarction curve can help predict the likelihood of infarction as a function of percentage normalized pCBV. First-pass CT perfusion is based on continuous cine imaging over a selected brain area during a bolus injection of contrast. During its first passage, contrast material compartmentalizes in the intravascular space, resulting in transient tissue enhancement. Functional maps of cerebral blood flow (CBF), cerebral blood volume (CBV), and mean transit time (MTT) are then constructed. We compared the effects of three different iodine concentrations (300, 350, or 400 mg/mL) on peak enhancement of normal brain tissue, arteries, and veins, stratified by region-of-interest (ROI) location, in 102 patients within 3 hours of stroke onset.
Monotonically increasing peak opacification was evident at all ROI locations, suggesting that CTp evaluation of patients with acute stroke is best performed with the highest available concentration of contrast agent. In another study, we investigated whether lesion volumes on CBV, CBF, and MTT maps within 3 hours of stroke onset predict final infarct volume, and whether all these parameters are needed for triage to intravenous recombinant tissue plasminogen activator (IV-rtPA). The effect of IV-rtPA on the affected brain was also investigated by measuring salvaged tissue volume in patients receiving IV-rtPA and in controls. CBV lesion volume did not necessarily represent dead tissue. MTT lesion volume alone can serve to identify the upper size limit of the abnormally perfused brain, and patients receiving IV-rtPA salvaged more brain tissue than did controls. Carotid CTA was compared with carotid digital subtraction angiography (DSA) in the grading of stenosis in patients with stroke symptoms. In CTA, the grade of stenosis was determined by means of axial source and maximum intensity projection (MIP) images as well as a semiautomatic vessel analysis. CTA provides an adequate, less invasive alternative to conventional DSA, although it tends to underestimate clinically relevant grades of stenosis.
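The probability-of-infarction curve described above can be sketched as a logistic function of normalized pCBV. The shape parameters below are invented for illustration and are not the study's fitted values:

```python
import math

def infarction_probability(pcbv_pct, v0=70.0, k=0.25):
    """Logistic probability-of-infarction curve as a function of normalized
    pCBV (% of contralateral normal brain). v0 (midpoint) and k (steepness)
    are illustrative shape parameters, not the study's fit."""
    return 1.0 / (1.0 + math.exp(k * (pcbv_pct - v0)))

# Low normalized pCBV (infarct core) gives a probability near 1;
# near-normal pCBV gives a probability near 0.
for v in (40, 60, 90):
    print(f"pCBV {v:3d}% -> P(infarct) = {infarction_probability(v):.2f}")
```

In practice such a curve would be fitted to observed (pCBV, infarct) pairs; the point is only that the reported probabilities (0.99 in the core, 0.11 outside the boundary) are consistent with a monotonically decreasing sigmoid.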

Relevance: 30.00%

Abstract:

Background: Otitis media (OM) is one of the most common childhood diseases. Approximately every third child suffers from recurrent acute otitis media (RAOM), and 5% of all children have persistent middle ear effusion for months during their childhood. Despite numerous studies on the prevention and treatment of OM during the past decades, its management remains challenging and controversial. In this study, the effect of adenoidectomy on the risk for OM, the potential risk factors influencing the development of OM, and the frequency of asthma among otitis-prone children were investigated. Subjects and methods: One prospective randomized trial and two retrospective studies were conducted. In the prospective trial, 217 children with RAOM or chronic otitis media with effusion (COME) were randomized to have tympanostomy with or without adenoidectomy. The age of the children at recruitment was between 1 and 4 years. RAOM was defined as having at least 3 episodes of AOM during the last 6 months or at least 5 episodes of AOM during the last 12 months. COME was defined as having persistent middle ear effusion for 2-3 months. The children were followed up for one year. In the first retrospective study, the frequency of childhood infections and allergy was evaluated by a questionnaire among 819 individuals. In the second retrospective study, data on asthma diagnoses were analysed from the hospital discharge records of 1616 children who underwent adenoidectomy or had probing of the nasolacrimal duct. Results: In the prospective randomized study, adenoidectomy had no beneficial effect on the prevention of subsequent episodes of AOM. Parental smoking was found to be a significant risk factor for OM even after the insertion of tympanostomy tubes. The frequencies of exposure to tobacco smoke and day-care attendance at the time of randomization were similar among children with RAOM and COME.
However, the frequencies of allergy to animal dust and pollen and of parental asthma were lower among children with COME than among those with RAOM. The questionnaire survey and the hospital discharge data revealed that children who had frequent episodes of OM had an increased risk for asthma. Conclusions: The first surgical intervention to treat an otitis-prone child younger than 4 years should not include adenoidectomy. Interventions to stop parental smoking could significantly reduce the risk for childhood RAOM. Whether an otitis-prone child develops COME or RAOM seems to be influenced by genetic predisposition more strongly than by environmental risk factors. Children who suffer from repeated upper respiratory tract infections, such as OM, may be at increased risk of developing asthma.

Relevance: 30.00%

Abstract:

Exercise that targets ankle joint mobility may improve calf muscle pump function and thereby promote healing. The objectives of this research were to assess the impact of an exercise intervention, in addition to routine evidence-based care, on the healing rates, functional ability and health-related quality of life of adults with venous leg ulcers (VLUs). This study included 63 patients with VLUs. Patients were randomised to receive either a 12-week exercise intervention with a telephone coaching component or usual care plus telephone calls at the same time points. The primary outcome evaluated the effectiveness of the intervention in relation to wound healing. The secondary outcomes evaluated physical activity, functional ability and health-related quality of life measures between groups at the end of the 12 weeks. A per protocol analysis complemented the effectiveness (intention-to-treat) analysis to highlight the importance of adherence to an exercise intervention. Intention-to-treat analyses for the primary outcome showed that 77% of those in the intervention group healed by 12 weeks compared to 53% of those in the usual care group. Although this difference was not statistically significant owing to a smaller-than-expected sample size, a 24% difference in healing rates could be considered clinically significant. The per protocol analysis for wound healing, however, showed that those in the intervention group who adhered to the exercise protocol 75% or more of the time were significantly more likely to heal than the control group (P = 0.01): 95% of adherent participants in the intervention group healed within 12 weeks.
For the secondary outcomes (physical activity, functional ability and health-related quality of life), intention-to-treat analyses did not support the effectiveness of the intervention. However, per protocol analyses revealed encouraging results: participants who adhered more than 75% of the time (n = 19) showed significantly improved range of ankle motion from the self-management exercise programme (P = 0.045). This study has shown that participants who adhere to the exercise programme as an adjunctive treatment to standard care are more likely to heal and have better functional outcomes than those who do not adhere to the exercises in conjunction with usual care.

Relevance: 30.00%

Abstract:

This study examines the role of corporate philanthropy in the management of reputation risk and shareholder value of the top 100 ASX-listed Australian firms over the three years 2011-2013. The results demonstrate the business case for corporate philanthropy, and hence encourage it, by showing that increasing a firm's investment in corporate giving as a percentage of profit before tax increases the likelihood of an increase in shareholder value, with the proviso that the firm must also manage its reputation risk at the same time. There is a negative association between corporate giving and shareholder value (Tobin's Q) which is mitigated by the firm's management of reputation. The economic significance of this result is that for every cent in the dollar the firm spends on corporate giving, Tobin's Q will decrease by 0.413%; in contrast, if the firm increases its reputation by 1 point, Tobin's Q will increase by 0.267%. Consequently, the interaction of corporate giving and reputation risk management is positively associated with shareholder value. These results are robust to controls for potential endogeneity and reverse causality. This paper assists both academics and practitioners by demonstrating that the benefits of corporate philanthropy extend beyond a gesture to improve reputation or an attempt to increase financial performance: when giving and reputation management work together, the benefits far outweigh the costs.
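The offsetting effect described above (giving reduces Tobin's Q on its own, while the giving-by-reputation interaction is positive) can be sketched as an ordinary least squares regression with an interaction term. The data below are synthetic, with coefficient signs chosen only to mirror the paper's findings; nothing here reproduces the actual ASX sample or its estimates:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500

# Synthetic firm-level data (illustrative only, not the study's sample):
giving = rng.uniform(0, 2, n)        # corporate giving, % of profit before tax
reputation = rng.uniform(0, 10, n)   # reputation score
# Assumed true model: giving alone hurts Tobin's Q, reputation helps,
# and their interaction is positive, mirroring the paper's signs.
q = 1.0 - 0.4 * giving + 0.25 * reputation + 0.1 * giving * reputation
q += rng.normal(0, 0.1, n)

# OLS with an interaction column recovers the separate effects.
X = np.column_stack([np.ones(n), giving, reputation, giving * reputation])
beta, *_ = np.linalg.lstsq(X, q, rcond=None)
print("estimated coefficients:", np.round(beta, 2))
```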

Relevance: 30.00%

Abstract:

The World Wide Web provides the opportunity for a radically changed and much more efficient communication process for scientific results. A survey in the closely related domains of construction information technology and construction management was conducted in February 2000, aimed at measuring to what extent these opportunities are already changing scientific information exchange and how researchers feel about the changes. The paper presents the results based on 236 replies to an extensive Web-based questionnaire. 65% of the respondents stated their primary research interest as IT in A/E/C and 20% as construction management and economics. The questions dealt with how researchers find, access and read different sources; how much and what publications they read; how often and to which conferences they travel; how much they publish; and what the criteria are for where they eventually decide to publish. Some of the questions compared traditional and electronic publishing, with one final section dedicated to opinions about electronic publishing. According to the survey, researchers already download from the Web half of the material that they read. The most popular method for retrieving an interesting publication is downloading it for free from the author's or publisher's website. Researchers are not particularly willing to pay for electronic scientific publications. There is much support for a scenario of electronic journals available entirely free on the Web, where the costs could be covered by, for instance, professional societies or the publishing university. The shift that the Web is causing seems to be towards "just in time" reading of literature. Also, frequent users of the Web rely less on scientific publications and tend to read fewer articles. If available with little effort, papers published in traditional journals are preferred; if not, the papers should be on the Web.
In these circumstances, the role of paper-based journals published by established publishers is shifting from the core "information exchange" to the building of authors' prestige. The respondents feel they should build up their reputations by publishing in journals and relevant conferences, but then make their work freely available on the Web.

Relevance: 30.00%

Abstract:

Modeling and forecasting of implied volatility (IV) is important to both practitioners and academics, especially in trading, pricing, hedging, and risk management activities, all of which require an accurate volatility estimate. However, this has become challenging since the 1987 stock market crash, as implied volatilities (IVs) recovered from stock index options present two patterns: the volatility smirk (skew) and the volatility term structure; examined together, the two form a rich implied volatility surface (IVS). This implies that the assumptions behind the Black-Scholes (1973) model do not hold empirically, as asset prices are influenced by many underlying risk factors. This thesis, consisting of four essays, models and forecasts implied volatility in the presence of these empirical regularities of options markets. The first essay models the dynamics of the IVS, extending the Dumas, Fleming and Whaley (DFW) (1998) framework: using moneyness in the implied forward price and OTM put-call options on the FTSE100 index, a nonlinear optimization is used to estimate different models and thereby produce rich, smooth IVSs. Here, the constant-volatility model fails to explain the variations in the rich IVS. It is then found that three factors can explain about 69-88% of the variance in the IVS: on average, 56% is explained by the level factor, 15% by the term-structure factor, and a further 7% by the jump-fear factor. The second essay proposes a quantile regression model for the contemporaneous asymmetric return-volatility relationship, generalizing the model of Hibbert et al. (2008). The results show a strong negative asymmetric return-volatility relationship at various quantiles of the IV distribution; the relationship is monotonically increasing when moving from the median quantile to the uppermost quantile (i.e., 95%), so OLS underestimates it at upper quantiles.
Additionally, the asymmetric relationship is more pronounced with the smirk (skew)-adjusted volatility index measure than with the old volatility index measure. Overall, the volatility indices are ranked in terms of asymmetric volatility as follows: VIX, VSTOXX, VDAX, and VXN. The third essay examines the information content of the new-VDAX volatility index for forecasting daily Value-at-Risk (VaR) estimates and compares its VaR forecasts with those of Filtered Historical Simulation and RiskMetrics. All daily VaR models are then backtested over 1992-2009 using unconditional, independence, conditional coverage, and quadratic-score tests. It is found that the VDAX subsumes almost all information required for the volatility of daily VaR forecasts for a portfolio of the DAX30 index; implied-VaR models outperform all other VaR models. The fourth essay models the risk factors driving swaption IVs. It is found that three factors can explain 94-97% of the variation in each of the EUR, USD, and GBP swaption IVs. There are significant linkages across factors, and bi-directional causality is at work between the factors implied by EUR and USD swaption IVs. Furthermore, the factors implied by EUR and USD IVs respond to each other's shocks; surprisingly, however, GBP does not affect them. Finally, calibration results for the string market model show that it can efficiently reproduce (or forecast) the volatility surface for each of the swaption markets.
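The factor decompositions in the first and fourth essays are in the spirit of a principal component analysis of an implied volatility panel. A minimal sketch, using a synthetic surface driven by assumed level, slope and curvature factors (the loadings and variance shares below are invented, not the thesis's estimates):

```python
import numpy as np

rng = np.random.default_rng(1)
T, K = 1000, 12                       # observation days x grid points

# Illustrative factor loadings across the grid: level (constant),
# term-structure (linear), smirk/curvature (quadratic).
grid = np.linspace(-1, 1, K)
loadings = np.column_stack([np.ones(K), grid, grid**2])
factors = rng.normal(size=(T, 3)) * np.array([1.0, 0.5, 0.3])
iv = 0.2 + factors @ loadings.T + rng.normal(0, 0.02, (T, K))

# PCA via SVD of the centered panel; the squared singular values give
# the share of variance explained by each principal component.
X = iv - iv.mean(axis=0)
s = np.linalg.svd(X, compute_uv=False)
explained = s**2 / np.sum(s**2)
print("variance explained by first 3 PCs:", np.round(explained[:3].sum(), 3))
```

With three true factors plus small noise, the first three components dominate, analogous to the 69-88% and 94-97% figures reported in the thesis.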

Relevance: 30.00%

Abstract:

In this thesis we deal with the concept of risk. The objective is to bring together, and draw conclusions from, normative results regarding quantitative portfolio management and risk assessment. The first essay concentrates on return dependency. We propose an algorithm for classifying markets into rising and falling. From the algorithm we derive a statistic, the Trend Switch Probability, for detecting long-term return dependency in the first moment. The empirical results suggest that the Trend Switch Probability is robust over various volatility specifications. Serial dependency, however, behaves differently in bear and bull markets: it is strongly positive in rising markets, whereas in bear markets returns are closer to a random walk. Realized volatility, a technique for estimating volatility from high-frequency data, is investigated in essays two and three. In the second essay we find, when measuring realized variance on a set of German stocks, that the second-moment dependency structure is highly unstable and changes randomly. The results also suggest that volatility is at times non-stationary. In the third essay we examine the impact of market microstructure on the error between estimated realized volatility and the volatility of the underlying process. With simulation-based techniques we show that autocorrelation in returns leads to biased variance estimates, and that lower sampling frequency and non-constant volatility increase the error variation between the estimated variance and the variance of the underlying process. From these essays we can conclude that volatility is not easily estimated, even from high-frequency data: it is neither very well behaved in terms of stability nor in terms of dependency over time. Based on these observations, we would recommend the use of simple, transparent methods that are likely to be more robust over differing volatility regimes than models with a complex parameter universe.
In analyzing long-term return dependency in the first moment we find that the Trend Switch Probability is a robust estimator. This is an interesting area for further research, with important implications for active asset allocation.
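The realized volatility technique used in essays two and three estimates daily variance by summing squared high-frequency returns. A minimal sketch with simulated 5-minute returns (all parameter values are illustrative):

```python
import math
import random

random.seed(42)

def realized_variance(returns):
    """Realized variance: the sum of squared intraday returns."""
    return sum(r * r for r in returns)

# Simulate one trading day of 5-minute log returns with an assumed
# true daily volatility sigma.
sigma_daily = 0.02
n_intervals = 78                      # roughly 6.5 hours of 5-minute bars
per_interval = sigma_daily / math.sqrt(n_intervals)
returns = [random.gauss(0, per_interval) for _ in range(n_intervals)]

rv = realized_variance(returns)
print(f"realized vol estimate: {math.sqrt(rv):.4f} (true sigma: {sigma_daily})")
```

The essays' point is that this clean picture breaks down in real data: microstructure effects such as autocorrelated returns bias the estimator, which the simulation above deliberately omits.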

Relevance: 30.00%

Abstract:

Since the emergence of service marketing, the focus of service research has evolved. Currently the focus of research is shifting towards value co-created by the customer. Consequently, value creation is increasingly less fixed to a specific time or location controlled by the service provider. However, present service management models, although acknowledging customer participation and accessibility, have not considered the role of the empowered customer who may perform the service at various locations and time frames. The present study expands this scope and provides a framework for exploring customer perceived value from a temporal and spatial perspective. The framework is used to understand and analyse customer perceived value and to explore customer value profiles. It is proposed that customer perceived value can be conceptualised as a function of technical, functional, temporal and spatial value dimensions. These dimensions are suggested to have value-increasing and value-decreasing facets. This conceptualisation is empirically explored in an online banking context and it is shown that time and location are more important value dimensions relative to the technical and functional dimensions. The findings demonstrate that time and location are important not only in terms of having the possibility to choose when and where the service is performed. Customers also value an efficient and optimised use of time and a private and customised service location. The study demonstrates that time and location are not external elements that form the service context, but service value dimensions, in addition to the technical and functional dimensions. This thesis contributes to existing service management research through its framework for understanding temporal and spatial dimensions of perceived value. 
Practical implications of the study are that time and location need to be considered as service design elements in order to differentiate the service from other services and create additional value for customers. Also, because of increased customer control and the importance of time and location, it is increasingly relevant for service providers to provide a facilitating arena for customers to create value, rather than trying to control the value creation process. Kristina Heinonen is associated with CERS, the Center for Relationship Marketing and Service Management at the Swedish School of Economics and Business Administration.

Relevance: 30.00%

Abstract:

This paper focuses on the time dimension in consumers' image construction processes. Two new concepts are introduced: image heritage, covering consumers' past experiences of the company, and image-in-use, the present image construction process. Image heritage and image-in-use capture the dynamic, relational, social, and contextual features of corporate image construction processes. Qualitative data from a retailing context were collected and analysed following a grounded theory approach. The study demonstrates that consumers' corporate images have long roots in past experiences. Understanding consumers' image heritage provides opportunities for understanding how consumers might interpret management initiatives and branding activities in the present.

Relevance: 30.00%

Abstract:

Many large mammals such as the elephant, rhino and tiger often come into conflict with people by destroying agricultural crops and even killing people, thus undermining conservation efforts. The males of these polygynous species have a greater variance in reproductive success than females, leading to selection pressures favouring a 'high risk-high gain' strategy for promoting reproductive success, which brings them into greater conflict with people. For instance, an adult male elephant is far more prone than a member of a female-led family herd to raid agricultural crops and to kill people. In polygynous species, the removal of a certain proportion of 'surplus' adult males is not likely to affect the fertility and growth rate of the population. Hence, this could be a management tool that effectively reduces animal-human conflict while maintaining the viability of the population. Selective removal of males would, however, result in a skewed sex ratio. This would reduce the 'effective population size' (as opposed to the total population or census number), increase the rate of genetic drift and, in small populations, lead to inbreeding depression. Plans for managing destructive mammals through the culling of males will therefore have to ensure that an appropriate minimum effective population size is maintained.
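The link between a skewed sex ratio and a reduced effective population size can be made concrete with Wright's standard formula, Ne = 4·Nm·Nf/(Nm + Nf). The formula is not stated in the abstract but is the usual way this quantity is computed; the numbers below are illustrative:

```python
def effective_population_size(n_males, n_females):
    """Wright's formula for effective population size under a skewed
    breeding sex ratio: Ne = 4 * Nm * Nf / (Nm + Nf)."""
    return 4 * n_males * n_females / (n_males + n_females)

# Removing 'surplus' males shrinks Ne sharply even though the census
# number falls only modestly:
print(effective_population_size(50, 50))   # balanced sex ratio: Ne = 100
print(effective_population_size(10, 50))   # skewed sex ratio: Ne is about a third
```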

Relevance: 30.00%

Abstract:

The aim of the thesis is to assess the fishery of Baltic cod, herring and sprat by simulation over a 50-year period. We construct a bioeconomic multispecies model for these species, including species interactions because the cod and sprat stocks, in particular, have significant effects on each other. We model the development of population dynamics, catches and profits of the fishery with current fishing mortalities, as well as with the optimal, profit-maximizing fishing mortalities. Thus we see how the fishery would develop with current mortalities, and how it should be developed in order to yield maximal profits. The cod stock, in particular, has recently been quite low, and optimizing the fishing mortality could allow it to recover. In addition, we assess what would happen to the fisheries if more favourable environmental conditions for cod recruitment were to dominate in the Baltic Sea. The results may yield new information for fisheries management. According to the results, the fisheries of Baltic cod, herring and sprat are not at the most profitable level. The fishing mortality of each species should be lower in order to maximize profits: with optimized fishing mortalities, the net present value over the simulation period would be almost three times higher. The lower fishing mortality of cod would result in a recovery of the cod stock. If the environmental conditions in the Baltic Sea improved, the cod stock would recover even without a decrease in fishing mortality. The increased cod stock would, however, suppress the herring and sprat stocks considerably, and harvesting these species would no longer be as profitable.
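The kind of comparison reported above, current versus optimized fishing mortality, can be sketched with a deliberately simplified single-stock model; the thesis's actual model is multispecies and far richer. All parameter values below are invented:

```python
def simulate_npv(f, years=50, r=0.5, K=1000.0, b0=200.0,
                 price=1.0, cost=0.2, discount=0.05):
    """Logistic stock dynamics under fishing mortality f; returns the
    net present value of harvest profits. All parameters illustrative."""
    b, npv = b0, 0.0
    for t in range(years):
        catch = f * b
        profit = (price - cost) * catch
        npv += profit / (1 + discount) ** t
        # Logistic growth minus the catch; the stock cannot go negative.
        b = max(b + r * b * (1 - b / K) - catch, 0.0)
    return npv

# A lower fishing mortality lets the depleted stock recover and yields
# a higher NPV over the 50-year horizon.
high_f, low_f = 0.5, 0.2
print(f"NPV at F={high_f}: {simulate_npv(high_f):.0f}")
print(f"NPV at F={low_f}:  {simulate_npv(low_f):.0f}")
```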

Relevance: 30.00%

Abstract:

In Cambodia, water has a special role as a source of life and livelihoods. Along with agriculture, fishing and forest use, industry, hydropower, navigation and tourism compete for the water resources. When the rights and responsibilities related to essential and movable water are unclear, conflicts emerge easily. Therefore, water management is needed in order to plan and control the use of water resources. The international context is characterized by the Mekong River, which flows through six countries. The countries along the river have very different roles and interests, depending not least on their geographical location. At the same time, water is also a tool for cooperation and peace. Locally, the water resources and related livelihoods create the basis for well-being, in particular for economic and human resources. These in turn are essential for local people to participate and defend their rights to water use. They also help to construct the resource base of the state administration. Cambodia is highly dependent on the Mekong River. However, Cambodia has a volatile history whose effects can be seen, for example, in the population structure, in once-suspended public institutions and in weakened trust in society. Relatively stable conditions came to the country as late as the 1990s, and partly for this reason Cambodia has a weak status among the Mekong countries. This Master's thesis identifies international, national and local water-use interest groups and analyzes their power relations and resources to affect water management. The state is seen as the salient actor, as it has the formal responsibility for the water resources and for the coordination between the actions of different levels. In terms of water use this study focuses on production, in management on planning, and in power relations on resources. The water resources of Cambodia are taken to consist of the Mekong River and Tonle Sap Lake, and the time span of the study is between the years 1991 and 2006.
The material consists of semi-structured interviews collected during summer 2006 in Finland and in Cambodia, as well as of literature and earlier studies. The results of the study show that the central state has difficulties in coordinating the actions of different actors because of its resource deficit and internal conflicts. The lessons of history and the vested interests of state actors make it difficult to plan and to strengthen legislation. It seems that at the central state level the most needed resources are intangible, whereas at the village level tangible resources (fulfilling basic needs) are of primary importance. The local decision-making bodies, NGOs and the private sector mainly require legislation and legitimacy to support their role. However, civil society and the international supporters are active, and there are possibilities for new cooperation networks. Keywords: water management, resources, participation, Cambodia, Mekong

Relevance: 30.00%

Abstract:

Beavers are often found to be in conflict with human interests, creating nuisances such as building dams on flowing water (leading to flooding), blocking irrigation canals, and cutting down timber. At the same time they contribute to raised water tables, increased vegetation, and other benefits. Consequently, maintaining an optimal beaver population is beneficial. Because of their diffusion externality (due to their migratory nature), strategies based on lumped-parameter models are often ineffective. Using a distributed-parameter model of the beaver population that accounts for its spatial and temporal behavior, an optimal control (trapping) strategy is presented in this paper that leads to a desired distribution of the animal density in a region in the long run. The optimal control solution presented embeds the solutions for a large number of initial conditions (i.e., it has a feedback form), which are otherwise nontrivial to obtain. The solution obtained can be used in real time by a non-expert in control theory, since it involves only evaluating neural networks trained offline. Proper orthogonal decomposition (POD)-based basis function design, followed by use of the basis functions in a Galerkin projection, is incorporated into the solution process as a model reduction technique. Optimal solutions are obtained through a "single network adaptive critic" (SNAC) neural-network architecture.
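The POD-based model reduction step can be sketched as follows: collect snapshots of the density field, take an SVD, and truncate the basis at a chosen energy threshold. The snapshot data here are synthetic, and the 99% threshold is an assumption, not a value from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Snapshot matrix: each column is the spatial density field at one time
# instant (synthetic low-rank-plus-noise data for illustration).
x = np.linspace(0, 1, 100)
modes = np.column_stack([np.sin(np.pi * x), np.sin(2 * np.pi * x)])
snapshots = modes @ rng.normal(size=(2, 50)) + 0.01 * rng.normal(size=(100, 50))

# POD: the left singular vectors of the snapshot matrix form an
# energy-ranked basis; truncate once cumulative energy exceeds 99%.
U, s, _ = np.linalg.svd(snapshots, full_matrices=False)
energy = np.cumsum(s**2) / np.sum(s**2)
n_modes = int(np.searchsorted(energy, 0.99)) + 1
basis = U[:, :n_modes]
print(f"POD basis retains {n_modes} modes for 99% of the snapshot energy")
```

A Galerkin projection onto `basis` then reduces the distributed-parameter dynamics to a small system of ordinary differential equations, which is what makes the subsequent neural-network training tractable.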

Relevance: 30.00%

Abstract:

Management of large projects, especially those in which a major R&D component is involved and which require knowledge from diverse specialised and sophisticated fields, may be classified as a semi-structured problem. In such problems there is some knowledge about the nature of the work involved, but there are also uncertainties associated with emerging technologies. In order to draw up a plan and schedule of activities for such a large and complex project, the project manager faces a host of complex decisions, such as when to start an activity and how long it is likely to continue. An Intelligent Decision Support System (IDSS) that aids the manager in decision making and in drawing up a feasible schedule of activities, while taking into consideration the constraints of resources and time, will have a considerable impact on the efficient management of the project. This report discusses the design of an IDSS that helps from the project planning phase through the scheduling phase. The IDSS uses a new project scheduling tool, the Project Influence Graph (PIG).
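The Project Influence Graph itself is not described in the abstract, but the core scheduling decision it supports (when each activity can start, given precedence constraints) can be sketched with a standard critical-path forward pass over a hypothetical activity network; the activity names and durations below are invented:

```python
from collections import defaultdict, deque

def earliest_start_times(durations, precedes):
    """Forward pass of the critical-path method: the earliest start time
    of each activity, given durations and precedence edges (a -> b)."""
    succ = defaultdict(list)
    indeg = defaultdict(int)
    for a, b in precedes:
        succ[a].append(b)
        indeg[b] += 1
    est = {a: 0 for a in durations}
    queue = deque(a for a in durations if indeg[a] == 0)
    while queue:                      # process activities in topological order
        a = queue.popleft()
        for b in succ[a]:
            est[b] = max(est[b], est[a] + durations[a])
            indeg[b] -= 1
            if indeg[b] == 0:
                queue.append(b)
    return est

# Hypothetical R&D project activities (weeks):
durations = {"spec": 2, "prototype": 5, "test": 3, "deploy": 1}
edges = [("spec", "prototype"), ("prototype", "test"), ("test", "deploy")]
print(earliest_start_times(durations, edges))
```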

Relevance: 30.00%

Abstract:

The integration of different wireless networks, such as GSM and WiFi, into a two-tier hybrid wireless network is increasingly popular and economical. Efficient bandwidth management, call admission control strategies and mobility management are important issues in supporting multiple types of services with different bandwidth requirements in hybrid networks. In particular, bandwidth is a critical commodity because of the type of transactions supported by these hybrid networks, which may have varying bandwidth and time requirements. In this paper, we consider such a problem in a hybrid wireless network installed in a superstore environment and design a bandwidth management algorithm based on priority-level classification of the incoming transactions. Our scheme uses a downlink transaction scheduling algorithm, which decides how to schedule the outgoing transactions based on their priority level while making efficient use of the available bandwidth. The transaction scheduling algorithm is used to maximize the number of transaction executions. The proposed scheme is simulated in a superstore environment with multiple rooms. The performance results show that the proposed scheme can considerably improve bandwidth utilization by reducing transaction blocking and accommodating more essential transactions at peak business times.
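The priority-based downlink scheduling idea can be sketched as a greedy admission loop over a priority queue; the transaction names, priorities and bandwidth units below are invented, and the paper's actual algorithm may differ:

```python
import heapq

def schedule_transactions(transactions, bandwidth_budget):
    """Greedy downlink scheduler: admit transactions in priority order
    (lower number = higher priority) while bandwidth remains, favouring
    essential transactions when the budget is tight."""
    heap = [(priority, bw, name) for name, priority, bw in transactions]
    heapq.heapify(heap)
    scheduled, remaining = [], bandwidth_budget
    while heap:
        priority, bw, name = heapq.heappop(heap)
        if bw <= remaining:           # admit only if it fits the budget
            scheduled.append(name)
            remaining -= bw
    return scheduled

# Hypothetical superstore transactions: (name, priority, bandwidth units)
txns = [("inventory-sync", 2, 40), ("payment", 1, 10),
        ("ad-push", 3, 30), ("price-check", 1, 5)]
print(schedule_transactions(txns, bandwidth_budget=50))
```

With a 50-unit budget, the two priority-1 transactions are admitted first; the bulky priority-2 sync no longer fits, so the scheduler falls through to the smaller priority-3 push, which is the transaction-count-maximizing behaviour the abstract describes.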