824 results for Probabilistic decision process model


Relevance: 100.00%

Abstract:

We explored mental toughness in soccer using a triangulation of data capture involving players (n = 6), coaches (n = 4), and parents (n = 5). Semi-structured interviews, based on a personal construct psychology (Kelly, 1955/1991) framework, were conducted to elicit participants' perspectives on the key characteristics and their contrasts, situations demanding mental toughness, and the behaviours displayed and cognitions employed by mentally tough soccer players. The results from the research provided further evidence that mental toughness is conceptually distinct from other psychological constructs such as hardiness. The findings also supported Gucciardi, Gordon, and Dimmock's (2009) process model of mental toughness. A winning mentality and desire was identified as a key attribute of mentally tough soccer players in addition to other previously reported qualities such as self-belief, physical toughness, work ethic/motivation, and resilience. Key cognitions reported by mentally tough soccer players enabled them to remain focused and competitive during training and matches and highlighted the adoption of several forms of self-talk in dealing with challenging situations. Minor revisions to Gucciardi and colleagues' definition of mental toughness are proposed.

Relevance: 100.00%

Abstract:

This article presents a method for checking the conformance between an event log capturing the actual execution of a business process, and a model capturing its expected or normative execution. Given a business process model and an event log, the method returns a set of statements in natural language describing the behavior allowed by the process model but not observed in the log and vice versa. The method relies on a unified representation of process models and event logs based on a well-known model of concurrency, namely event structures. Specifically, the problem of conformance checking is approached by folding the input event log into an event structure, unfolding the process model into another event structure, and comparing the two event structures via an error-correcting synchronized product. Each behavioral difference detected in the synchronized product is then verbalized as a natural language statement. An empirical evaluation shows that the proposed method scales up to real-life datasets while producing more concise and higher-level difference descriptions than state-of-the-art conformance checking methods.
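As a much simplified illustration of the comparison described above, behavior can be represented as plain trace sets and the differences verbalized. This is a hedged sketch only: the paper itself folds the log and unfolds the model into event structures and compares them via an error-correcting synchronized product, which this toy (with its hypothetical `conformance_statements` helper) does not implement.

```python
# Simplified illustration: compare model and log behavior as trace sets and
# verbalize the differences in natural language. All names are hypothetical.

def conformance_statements(model_traces, log_traces):
    """Verbalize behavior allowed by the model but absent from the log, and vice versa."""
    model, log = set(model_traces), set(log_traces)
    statements = []
    for trace in sorted(model - log):
        statements.append(f"The model allows {' -> '.join(trace)}, but it was never observed in the log.")
    for trace in sorted(log - model):
        statements.append(f"The log contains {' -> '.join(trace)}, but the model does not allow it.")
    return statements

model = [("a", "b", "c"), ("a", "c", "b")]   # traces the model allows
log = [("a", "b", "c"), ("a", "b", "d")]     # traces observed in the log
for s in conformance_statements(model, log):
    print(s)
```

The set-difference step stands in for the synchronized product; the verbalization step mirrors the paper's natural language output.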

Relevance: 100.00%

Abstract:

The aim of this project is to bring information on low chill stonefruit varieties to users in a clear and friendly format to aid in the variety decision process. Low Chill Australia sees this project as a high priority for its members to remain competitive by growing high quality, early season peach and nectarine varieties. Data will be collated from grower surveys, breeders' descriptions and the literature, entered into an Access database, and published on the web for stonefruit growers in tropical and sub-tropical regions across Australia. Links will be available from the Low Chill Australia and Summerfruit Australia websites.

Relevance: 100.00%

Abstract:

This research proposes a conceptual design framework that allows airports to obtain flexible departure layouts based on passenger activity analysis derived from Business Process Models (BPMs). BPMs available for airport terminals were used as a design tool to uncover the relationships between a spatial layout and the corresponding passenger activities. An algorithm was developed that demonstrates the applicability of the proposed framework by generating relative spatial layouts from passenger activity analysis. The generated relative spatial layouts assist architects in producing suitable alternative layouts that meet the changing needs of an airport terminal.

Relevance: 100.00%

Abstract:

Developing major infrastructure and construction (MIC) projects is complicated, since it involves multifaceted policy issues. As a result, participatory mechanisms have increasingly been employed to improve the legitimacy of the project decision process. Yet participation alone cannot guarantee a mutually acceptable solution, since the expectations and requirements of the multiple stakeholders involved can be diverse and even conflicting. Overcoming this necessitates a thorough identification and careful analysis of the expectations of the various stakeholder groups in MIC projects. Moreover, although most project stakeholder concerns are consistent across the globe, contextual differences may lead to diverse priority levels being attached to these factors. This research therefore examined the perceptual differences between paired stakeholder groups from mainland China mega-cities and Hong Kong in rating their concerns over MIC projects. The findings are expected to benefit both the Central Government of China and the Government of the Hong Kong SAR in coping with the rapid expansion of MIC projects in the territory and the increasing expectations of social equality, thereby helping to achieve the much desired harmonious development of the community.

Relevance: 100.00%

Abstract:

Optimal Punishment of Economic Crime: A Study on Bankruptcy Crime. This thesis examines whether the punishment practice for bankruptcy crimes is optimal in light of Gary S. Becker's theory of optimal punishment. According to Becker, a punishment is optimal if it eliminates the expected utility of the crime for the offender and, on the other hand, minimizes the cost of the crime to society. The decision process of the offender is modelled through the expected utility of the crime, calculated from the offender's probability of getting caught, the cost of getting caught, and the profit from the crime; all quantities, including the punishment, are measured in cash. The cost of crime to society is modelled by defining the disutility the crime causes society, calculated from the costs of crime prevention, crime damages and punishment execution, together with the probability of getting caught. If the goal is to eliminate crime profits, the punishments for bankruptcy crimes are not optimal: if debtors decided whether to commit the crime solely on economic grounds, the crime rate would be many times higher than it currently is. Prospective offenders evidently rely heavily on non-economic considerations; most probably social pressure and a personal commitment to obey the law are major factors in their decision-making. Becker's function for measuring the cost to society proved not to be useful for assessing the optimality of a punishment: its premise that society's costs correlate with the offender's costs from the punishment is unrealistic for bankruptcy crimes. It was observed, however, that the majority of the social cost of crime arises from crime damages, a finding that supports preventive criminal policy.
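The Becker-style expected-utility calculation described above can be sketched in a few lines, with all values measured in cash as in the thesis. The numbers below are hypothetical placeholders, not figures from the study.

```python
# Sketch of the offender's expected utility in Becker's model; all values in
# cash. Numbers are hypothetical, for illustration only.

def expected_utility(gain, p_caught, punishment):
    """Offender's expected utility: p*(gain - punishment) + (1 - p)*gain."""
    return p_caught * (gain - punishment) + (1.0 - p_caught) * gain

# In this narrow sense a punishment is "optimal" when it drives the expected
# utility to zero, i.e. punishment = gain / p_caught.
gain, p_caught = 100_000.0, 0.25
deterrent = gain / p_caught     # 400,000: the required punishment grows as detection gets less likely
print(expected_utility(gain, p_caught, deterrent))
```

The inverse relationship between detection probability and the deterrent punishment is what makes the thesis's observed low crime rate puzzling on purely economic grounds.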

Relevance: 100.00%

Abstract:

Aerosols impact the planet and our daily lives through various effects, perhaps most notably their climatic and health-related consequences. While there are several primary particle sources, secondary new particle formation from precursor vapors is also known to be a frequent, global phenomenon. Nevertheless, the formation mechanism of new particles, as well as the vapors participating in the process, remains a mystery. This thesis consists of studies on new particle formation, specifically from the point of view of numerical modeling. The formation rate of 3 nm particles has been observed to depend on the sulphuric acid concentration raised to a power of 1-2. This suggests the nucleation mechanism is of first or second order with respect to the sulphuric acid concentration, in other words a mechanism based on activation or on kinetic collision of clusters. However, model studies have had difficulty replicating the small exponents observed in nature. The work in this thesis indicates that the exponents may be lowered by the participation of a co-condensing (and potentially nucleating) low-volatility organic vapor, or by increasing the assumed size of the critical clusters. On the other hand, the new and more accurate method presented here for determining the exponent indicates high diurnal variability. Additionally, these studies included several semi-empirical nucleation rate parameterizations as well as a detailed investigation of the analysis used to determine the apparent particle formation rate. Because they cover a high proportion of the earth's surface area, oceans could prove to be climatically significant sources of secondary particles. In the absence of marine observation data, new particle formation events in a coastal region were parameterized and studied. Since the formation mechanism is believed to be similar, the new parameterization was applied in a marine scenario.
The work showed that marine CCN production is feasible in the presence of additional vapors contributing to particle growth. Finally, a new method to estimate the concentrations of condensing organics was developed. The algorithm uses a Markov chain Monte Carlo method to determine the required combination of vapor concentrations by comparing a measured particle size distribution with one produced by an aerosol dynamics process model. The evaluation showed excellent agreement against model data, and initial results with field data appear sound as well.
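The MCMC idea in the final part can be sketched as follows, assuming (purely for illustration) a linear stand-in for the aerosol dynamics process model and a single unknown vapor concentration; the real algorithm compares full measured and modeled size distributions, and all names and numbers here are hypothetical.

```python
import math
import random

# Sketch of the Markov chain Monte Carlo fitting idea: propose a vapor
# concentration, score it by the mismatch between the modeled and "measured"
# particle diameter, and accept/reject Metropolis-style. The process model is
# replaced by a hypothetical linear growth response.

def modeled_diameter(concentration):
    """Stand-in for the aerosol dynamics process model: grown diameter (nm)."""
    return 3.0 + 0.002 * concentration

def mcmc_fit(measured_diameter, steps=5000, sigma=0.2):
    """Metropolis sampling of the concentration that reproduces the measurement."""
    random.seed(1)                      # fixed seed for a reproducible sketch
    c = 1000.0                          # initial guess (hypothetical units)
    cost = (modeled_diameter(c) - measured_diameter) ** 2
    for _ in range(steps):
        proposal = c + random.gauss(0.0, 50.0)
        new_cost = (modeled_diameter(proposal) - measured_diameter) ** 2
        # Accept improvements always, worsenings with Metropolis probability.
        if new_cost < cost or random.random() < math.exp((cost - new_cost) / sigma):
            c, cost = proposal, new_cost
    return c

print(round(mcmc_fit(10.0)))            # the exact optimum of this toy is 3500
```

With several vapors and a full size distribution, the same accept/reject loop explores a vector of concentrations instead of a scalar.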

Relevance: 100.00%

Abstract:

This thesis covers three subject areas concerning particulate matter in urban air quality: 1) analysis of measured particulate matter mass concentrations in the Helsinki Metropolitan Area (HMA) at different locations relative to traffic sources, and at different times of year and day; 2) the evolution of the number concentrations and sizes of traffic-exhaust-originated particulate matter at the local street scale, studied by combining a dispersion model and an aerosol process model; and 3) analysis of selected high particulate matter concentration situations with regard to their meteorological origins, especially temperature inversions, in the HMA and three other European cities. The prediction of meteorological conditions conducive to elevated particulate matter concentrations in the studied cities is examined, and the performance of current numerical weather forecasting models in air pollution episode situations is considered. The ambient measurements revealed a clear diurnal variation of the PM10 concentrations at the HMA measurement sites, irrespective of the year and season. The diurnal variation of local vehicular traffic flows showed no substantial correlation with the PM2.5 concentrations, indicating that the PM10 concentrations originated mainly from local vehicular traffic (direct emissions and suspension), while the PM2.5 concentrations were mostly of regional and long-range transported origin. The modelling study of traffic exhaust dispersion and transformation showed that the number concentrations of particles originating from street traffic exhaust undergo a substantial change during the first tens of seconds after being emitted from the vehicle tailpipe. Dilution was shown to dominate the total number concentrations, while condensation and coagulation had only a minimal effect on the Aitken mode number concentrations.
The air pollution episodes included were chosen on the basis of occurring in either winter or spring and having an at least partly local origin. In the HMA, air pollution episodes were shown to be linked to predominantly stable atmospheric conditions with high atmospheric pressure and low wind speeds in conjunction with relatively low ambient temperatures. For the other European cities studied, the best meteorological predictors of elevated PM10 concentrations were the temporal (hourly) evolution of temperature inversions, stable atmospheric stratification and, in some cases, wind speed. Concerning weather prediction during particulate-matter-related air pollution episodes, the studied models were found to overpredict pollutant dispersion, leading to underprediction of pollutant concentration levels.

Relevance: 100.00%

Abstract:

Action, Power and Experience in Organizational Change: A Study of Three Major Corporations. This study explores change management and resistance to change as social activities and displays of power, through worker experiences in three major Finnish corporations. Two important sensitizing concepts were applied. Firstly, Richard Sennett's perspective on work in the new form of capitalism and its shortcomings (the lack of commitment and freedom, accompanied by the disruption of lifelong career planning and the feeling of job insecurity) offered a fruitful starting point for a critical study. Secondly, Michel Foucault's classical concept of power, treated as anecdotal, interactive and non-measurable, provided tools for analyzing change-enabling and change-resisting acts. The study bridges the gap between management and social sciences: the former usually concentrate on leadership issues, best practices and goal attainment, while the latter cover worker experiences, power relations and political conflicts. The study was motivated by three research questions: firstly, why people resist or support changes in their work, work environment or organization, and what kinds of analyses these behavioural choices are based on; secondly, what practical forms support for and resistance to change take, and how people choose among the different ways of acting; thirdly, how the people involved experience and describe their own subject position and actions in changing environments. The examination focuses on practical interpretations and action descriptions given by the members of three major Finnish business organizations. The empirical data was collected during a two-year period in the Finnish Post Corporation, the Finnish branch of the Vattenfall Group, one of the leading European energy companies, and the Mehiläinen Group, the leading private medical service provider in Finland.
It includes 154 non-structured thematic interviews and 309 biographies concentrating on personal experiences of change; all positions and organizational levels were represented. The analysis was conducted using the grounded theory method introduced by Strauss and Corbin, in three sequential phases comprising open, axial and selective coding. The result is a hierarchical structure of categories, summarized in a process model of change behaviour patterns. Its key ingredients are past experiences and future expectations, which lead to different change relations and behavioural roles, and ultimately contribute to strategic and tactical choices realized as both public and hidden forms of action. The same forms of action can be used both to support and to resist change, and there are no specific dividing lines either between employer and employee roles or between hierarchical positions. In general, however, strategic choices lead more often to public forms of action, whereas tactical choices result in hidden forms. The primary goal of the study was to provide knowledge with practical applications in everyday business life, HR and change management. The results are therefore highly applicable to other organizations as well as to less change-dominated situations, whenever power relations and conflicting interests are present. A sociological thesis on classical business management issues can be of considerable value in revealing the crucial social processes behind behavioural patterns. Keywords: change management, organizational development, organizational resistance, resistance to change, labor relations, organization, leadership

Relevance: 100.00%

Abstract:

Marja Heinonen's dissertation Verkkomedian käyttö ja tutkiminen. Iltalehti Online 1995-2001 ("The use and study of online media: Iltalehti Online 1995-2001") describes the usage of the new internet-based news service Iltalehti Online during its first years of existence, 1995-2001. The study focuses on the content of the service and users' attitudes towards the new medium and its contents. Heinonen also analyzes and describes the research methods that can be used to study any new media phenomenon for which no historical perspective yet exists. She has created a process model for the research of a net medium, based on a multidimensional approach, and has chosen an iterative research method inspired by Sudweeks and Simoff's CEDA methodology, in which qualitative and quantitative methods take turns both producing results and generating new research questions. The dissertation discusses the possibilities of combining several research methods in the study of online news media and, on a general level, the methodological possibilities of researching a completely new media form in the absence of historical perspective; these discussions come out in favour of multidimensional methods. The empirical research was built around three cases of Iltalehti Online among its users: log analysis 1996-1999, interviews in 1999, and clustering 2000-2001. Even though the results of the different cases were somewhat conflicting, the central results from the analysis of Iltalehti Online 1995-2001 were:
- Reading was strongly determined by gender.
- The structure of Iltalehti Online strongly guided the reading.
- In terms of content, people did not make a clear distinction between news and entertainment.
- Users created new habits in their everyday life during the first years of using Iltalehti Online; these habits were categorized as a break between everyday routines, an established habit, or a new practice within the rhythm of the day.
- In the clustering of the users, sports, culture and celebrities were the most distinguishing contents; users did not move across these borders as much as within them.
The dissertation contributes to the development of multidimensional research methods for emerging phenomena in the media field. It is also a unique description of a phase of development in media history, based on unique research material: no comparable information (logs plus demographics) is available for any other Finnish online news medium, either from those first years or today.

Relevance: 100.00%

Abstract:

The Government of India has announced the Greening India Mission (GIM) under the National Climate Change Action Plan. The Mission aims to restore and afforest about 10 Mha over the period 2010-2020 under different sub-missions covering moderately dense and open forests, scrub/grasslands, mangroves, wetlands, croplands and urban areas. Even though the main focus of the Mission is to address mitigation and adaptation in the context of climate change, the adaptation component is inadequately addressed, and there is a need for increased scientific input in the preparation of the Mission. The mitigation potential is estimated by simply multiplying global default biomass growth rates by area; this is incomplete, as it does not account for all the carbon pools, phasing, differing growth rates, and so on. The mitigation potential estimated for the GIM for the year 2020 using the Comprehensive Mitigation Analysis Process model could offset 6.4% of the projected national greenhouse gas emissions, compared to the GIM's own estimate of only 1.5%, excluding any emissions due to harvesting or disturbances. The selection of locations for the different interventions and the choice of species under the GIM should be based on modelling, remote sensing and field studies. The forest sector provides an opportunity to promote mitigation-adaptation synergy, which is not adequately addressed in the GIM. Since many of the proposed interventions are innovative and scientific knowledge about them is limited, an unprecedented level of collaboration is needed between research institutions and implementing agencies such as the Forest Departments; such collaboration is currently non-existent. The GIM could propel systematic research into forestry and climate change issues and thereby provide global leadership in this new and emerging science.

Relevance: 100.00%

Abstract:

Background—Mutations of the APC gene cause familial adenomatous polyposis (FAP), a hereditary colorectal cancer predisposition syndrome. Aims—To conduct a cost comparison analysis of predictive genetic testing versus conventional clinical screening for individuals at risk of inheriting FAP, from the perspective of a third party payer. Methods—All direct health care costs for both screening strategies were measured by time and motion, and the expected costs evaluated using a decision analysis model. Results—The baseline analysis predicted that screening a prototype FAP family would cost $4975/£3109 with the molecular testing strategy and $8031/£5019 with the clinical screening strategy, when family members were monitored with the same frequency of clinical surveillance (every two to three years). Sensitivity analyses revealed that the genetic testing approach is cost saving across key variables including kindred size, the age of screening onset, and the cost of mutation identification in a proband. However, if the APC mutation carriers were monitored at an increased (annual) frequency, the cost of the genetic screening strategy increased to $7483/£4677 and was especially sensitive to variability in the age of onset of screening, family size, and the cost of genetic testing of at-risk relatives. Conclusions—In FAP kindreds, a predictive genetic testing strategy costs less than conventional clinical screening, provided that the frequency of surveillance is identical under either strategy. An additional significant benefit is the elimination of unnecessary colonic examinations for family members found to be noncarriers.
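The decision-analysis step described in Methods can be sketched as a probability-weighted cost comparison. The branch probabilities and per-person costs below are hypothetical placeholders; the paper's figures ($4975 vs $8031 per prototype family) come from time-and-motion measurement, not from this toy model.

```python
# Minimal decision-analysis sketch: expected cost of each screening strategy
# as a probability-weighted sum over chance branches. All numbers hypothetical.

def expected_cost(branches):
    """Expected cost of a strategy: sum of probability * cost over branches."""
    assert abs(sum(p for p, _ in branches) - 1.0) < 1e-9, "probabilities must sum to 1"
    return sum(p * cost for p, cost in branches)

# Genetic testing: only identified carriers (here p = 0.5, autosomal dominant
# inheritance) need ongoing colonic surveillance; non-carriers are discharged.
genetic = expected_cost([(0.5, 3000.0), (0.5, 500.0)])
# Clinical screening: everyone at risk is examined regardless of carrier status.
clinical = expected_cost([(1.0, 2800.0)])
print(genetic, clinical)
```

The saving in this sketch comes entirely from the non-carrier branch, which mirrors the paper's conclusion about eliminating unnecessary examinations.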

Relevance: 100.00%

Abstract:

A single-source network is said to be memory-free if none of the internal nodes (those other than the source and the sinks) employ memory; they merely send linear combinations of the incoming symbols (received on their incoming edges) on their outgoing edges. Memory-free networks with delay that use network coding are forced to perform inter-generation network coding, as a result of which some or all sinks require a large amount of memory for decoding. In this work, we address this problem by also utilizing memory elements at the internal nodes of the network, which reduces the number of memory elements needed at the sinks. We give an algorithm that employs memory at all the nodes of the network to achieve single-generation network coding. For a fixed latency, our algorithm reduces the total number of memory elements used in the network to achieve single-generation network coding. We also discuss the advantages of combining single-generation network coding with convolutional network-error correction codes (CNECCs) for unit-delay networks, and illustrate the performance gain of CNECCs obtained by using memory at the intermediate nodes through simulations on an example network under a probabilistic network error model.

Relevance: 100.00%

Abstract:

We consider a small-extent sensor network for event detection, in which nodes periodically take samples and then contend over a random access network to transmit their measurement packets to the fusion center. We consider two procedures at the fusion center for processing the measurements; the Bayesian setting is assumed, that is, the fusion center has a prior distribution on the change time. In the first procedure, the decision algorithm at the fusion center is network-oblivious and makes a decision only when a complete vector of measurements taken at a sampling instant is available. In the second procedure, the decision algorithm is network-aware and processes measurements as they arrive, but in time-causal order. In this case, the decision statistic depends on the network delays, whereas in the network-oblivious case it does not. This yields a Bayesian change-detection problem with a trade-off between the random network delay and the decision delay: a higher sampling rate reduces the decision delay but increases the random access delay. Under periodic sampling, in the network-oblivious case, the structure of the optimal stopping rule is the same as without the network, and the optimal change detection delay decouples into the network delay and the optimal decision delay without the network. In the network-aware case, the optimal stopping problem is analyzed as a partially observable Markov decision process, in which the states of the queues and the delays in the network need to be maintained. A sufficient decision statistic is the network state together with the posterior probability that the change has occurred, given the measurements received and the state of the network. The optimal regimes are studied using simulation.
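The posterior-probability part of the sufficient statistic above can be sketched with a standard Bayesian (Shiryaev-style) update. The Gaussian pre-/post-change densities and the geometric prior parameter rho are illustrative assumptions, and the network state and delays the paper tracks are ignored here.

```python
import math

# Sketch of the posterior probability that the change has occurred, updated
# per measurement. Gaussian likelihoods and geometric prior are assumptions.

def gauss(x, mu, sigma=1.0):
    """Gaussian density with mean mu and standard deviation sigma."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2.0 * math.pi))

def update_posterior(p, x, rho=0.05, mu0=0.0, mu1=1.0):
    """One Bayesian update of P(change has occurred | measurements so far)."""
    prior = p + (1.0 - p) * rho                # the change may occur before this sample
    num = prior * gauss(x, mu1)                # post-change likelihood
    den = num + (1.0 - prior) * gauss(x, mu0)  # plus pre-change likelihood
    return num / den

p = 0.0
for x in [0.1, -0.2, 1.1, 0.9, 1.3]:           # mean shifts around the third sample
    p = update_posterior(p, x)
print(round(p, 3))
```

An optimal stopping rule of the threshold type would declare the change once this posterior crosses a level chosen from the delay/false-alarm trade-off.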

Relevance: 100.00%

Abstract:

We study optimal control of Markov processes with age-dependent transition rates. The control policy is chosen continuously over time based on the state of the process and its age. We study infinite horizon discounted cost and infinite horizon average cost problems. Our approach is via the construction of an equivalent semi-Markov decision process. We characterise the value function and optimal controls for both discounted and average cost cases.
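The discounted-cost problem the abstract refers to can be illustrated with plain value iteration on a toy two-state, two-action MDP. The age-dependent transition rates and the semi-Markov construction of the paper are not modeled here, and all numbers are hypothetical.

```python
# Toy illustration of infinite-horizon discounted-cost dynamic programming:
# value iteration on a hypothetical two-state, two-action MDP.

def value_iteration(P, c, beta=0.9, tol=1e-8):
    """Solve V(s) = min_a [ c[a][s] + beta * sum_t P[a][s][t] * V(t) ]."""
    n = len(c[0])                       # number of states
    V = [0.0] * n
    while True:
        Vn = [min(c[a][s] + beta * sum(P[a][s][t] * V[t] for t in range(n))
                  for a in range(len(c)))
              for s in range(n)]
        if max(abs(Vn[s] - V[s]) for s in range(n)) < tol:
            return Vn
        V = Vn

# Hypothetical transition matrices P[action][state][next_state] and
# per-stage costs c[action][state].
P = [[[0.9, 0.1], [0.2, 0.8]],          # action 0
     [[0.5, 0.5], [0.7, 0.3]]]          # action 1
c = [[1.0, 3.0], [2.0, 0.5]]
print([round(v, 2) for v in value_iteration(P, c)])
```

In the paper's setting the state would be augmented with the age of the process, and the equivalent semi-Markov decision process would replace the fixed per-stage discounting.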