207 results for quasi-copulas


Relevance:

10.00%

Publisher:

Abstract:

Background: Blood for transfusion may become contaminated at any point between collection and transfusion, and may result in bacteraemia (the presence of bacteria in the blood), severe illness or even death for the blood recipient. Donor arm skin is one potential source of blood contamination, so it is usual to cleanse the skin with an antiseptic before blood donation. One-step and two-step alcohol-based antiseptic regimens are both commonly advocated, but there is uncertainty as to which is most effective. Objectives: To assess the effects of cleansing the skin of blood donors with alcohol in a one-step procedure compared with alcohol in a two-step procedure to prevent contamination of collected blood or bacteraemia in the recipient. Search strategy: We searched the Cochrane Wounds Group Specialised Register (10 March 2009); The Cochrane Central Register of Controlled Trials (CENTRAL), The Cochrane Library 2009, Issue 1; Ovid MEDLINE (1950 to February Week 4 2009); Ovid EMBASE (1980 to 2009 Week 9); and EBSCO CINAHL (1982 to February Week 4 2009). We also searched the reference lists of key papers. Selection criteria: All randomised controlled trials (RCTs) comparing one-step versus two-step alcohol-based donor skin cleansing, where the two-step process includes alcohol and any other antiseptic, for pre-venepuncture skin cleansing were considered. Quasi-randomised trials were to have been considered in the absence of RCTs. Data collection and analysis: Two review authors independently assessed studies for inclusion. Main results: No studies (RCTs or quasi-RCTs) met the inclusion criteria. Authors' conclusions: We did not identify any eligible studies for inclusion in this review. It is therefore unclear whether a two-step skin cleansing process (alcohol followed by another antiseptic) prior to blood donation confers any reduction in the risk of blood contamination or bacteraemia in blood recipients, or conversely whether a one-step process increases risk above that associated with a two-step process.

Relevance:

10.00%

Publisher:

Abstract:

1. Ecological data sets often use clustered measurements or repeated sampling in a longitudinal design. Choosing the correct covariance structure is an important step in the analysis of such data, as the covariance describes the degree of similarity among the repeated observations.
2. Three methods for choosing the covariance structure are the Akaike information criterion (AIC), the quasi-information criterion (QIC) and the deviance information criterion (DIC). We compared the methods using a simulation study and a data set that explored effects of forest fragmentation on avian species richness over 15 years.
3. The overall success rate was 80.6% for the AIC, 29.4% for the QIC and 81.6% for the DIC. For the forest fragmentation study, the AIC and DIC selected the unstructured covariance, whereas the QIC selected the simpler autoregressive covariance. Graphical diagnostics suggested that the unstructured covariance was probably correct.
4. We recommend using the DIC for selecting the correct covariance structure.
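As a hedged sketch of how information-criterion covariance selection works in practice, the snippet below scores three candidate covariance structures by AIC and keeps the lowest; the log-likelihoods and parameter counts are invented for illustration and are not values from the study:

```python
def aic(log_lik, n_params):
    # Akaike information criterion: 2k - 2*log-likelihood (lower is better)
    return 2 * n_params - 2 * log_lik

# Hypothetical maximum-likelihood fits of three candidate covariance
# structures to the same repeated-measures data set
candidates = {
    "independence":   {"log_lik": -512.3, "n_params": 1},
    "autoregressive": {"log_lik": -480.1, "n_params": 2},
    "unstructured":   {"log_lik": -470.8, "n_params": 15},
}

scores = {name: aic(**fit) for name, fit in candidates.items()}
best = min(scores, key=scores.get)  # structure with the lowest AIC
```

Note how the unstructured covariance achieves the best log-likelihood but is penalised for its 15 parameters, so the autoregressive structure wins on AIC; QIC and DIC apply analogous fit-versus-complexity trade-offs for GEE and Bayesian models respectively.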

Relevance:

10.00%

Publisher:

Abstract:

Background: In many clinical areas, integrated care pathways are utilised as structured multidisciplinary care plans which detail essential steps in caring for patients with specific clinical problems. In particular, care pathways for the dying have been developed as a model to improve the end-of-life care of all patients. They aim to ensure that the most appropriate management occurs at the most appropriate time and that it is provided by the most appropriate health professional. Clinical pathways for end-of-life care management are used widely around the world and have been regarded as the gold standard. There is therefore a significant need to inform clinicians about the use of end-of-life care pathways through a systematic review. Objectives: To assess the effects of end-of-life care pathways, compared with usual care (no pathway) or with care guided by another end-of-life care pathway, across all healthcare settings (e.g. hospitals, residential aged care facilities, community). Search strategy: The Cochrane Central Register of Controlled Trials (CENTRAL), the Pain, Palliative and Supportive Care Review Group specialised register, MEDLINE, EMBASE, review articles and reference lists of relevant articles were searched. The search was carried out in September 2009. Selection criteria: All randomised controlled trials (RCTs), quasi-randomised trials or high-quality controlled before-and-after studies comparing use versus non-use of an end-of-life care pathway in caring for the dying. Data collection and analysis: Results of searches were reviewed against the pre-determined inclusion criteria by two review authors. Main results: The search identified 920 potentially relevant titles, but no studies met the criteria for inclusion in the review. Authors' conclusions: Without further available evidence, recommendations for the use of end-of-life care pathways in caring for the dying cannot be made. RCTs or other well-designed controlled studies are needed to evaluate the use of end-of-life care pathways in caring for dying people.

Relevance:

10.00%

Publisher:

Abstract:

This thesis argues that the end of Soviet Marxism and of a bipolar global political imaginary at the dissolution of the short Twentieth Century poses an obstacle to anti-systemic political action. Such a blockage of alternative political imaginaries can be discerned by reading the work of Francis Fukuyama and "Endism" as performative invocations of the closure of political alternatives, and thus as an ideological proclamation which enables and constrains forms of social action. It is contended that the search through dialectical thought for a competing universal to posit against "liberal democracy" is a fruitless one, because it reinscribes the terms of teleological theories of history which work to effect closure. Rather, constructing a phenomenological analytic of the political conjuncture, the thesis suggests that the figure of messianism without a Messiah is central to a deconstructive reframing of the possibilities of political action - a reframing attentive to the rhetorical tone of texts. The project of recovering the political is viewed through a phenomenological lens. An agonistic political distinction must be made so as to memorialise the remainders and ghosts of progress, and thus to gesture towards an indeconstructible justice which would serve as a horizon for the articulation of an empty universal. This project is furthered by a return to a certain phenomenology inspired by Cornelius Castoriadis, Claude Lefort, Maurice Merleau-Ponty and Ernesto Laclau. The thesis provides a reading of Jacques Derrida and Walter Benjamin as thinkers of a minor universalism, a non-prescriptive utopia, and places their work in the context of new understandings of religion and the political as quasi-transcendentals which can be utilised to think through the aporias of political time in order to grasp shards of meaning.
Derrida and Chantal Mouffe's deconstructive critique and supplement to Carl Schmitt's concept of the political is read as suggestive of a reframing of political thought which would leave the political question open and thus enable the articulation of social imaginary significations able to inscribe meaning in the field of political action. Thus, the thesis gestures towards a form of thought which enables rather than constrains action under the sign of justice.

Relevance:

10.00%

Publisher:

Abstract:

Financial processes may possess long memory, and their probability densities may display heavy tails. Many models have been developed to deal with this tail behaviour, which reflects the jumps in the sample paths. On the other hand, the presence of long memory, which contradicts the efficient market hypothesis, is still an issue for further debate. These difficulties present challenges for memory detection and for modelling the co-presence of long memory and heavy tails. This PhD project aims to respond to these challenges. The first part aims to detect memory in a large number of financial time series on stock prices and exchange rates using their scaling properties. Since financial time series often exhibit stochastic trends, a common form of nonstationarity, strong trends in the data can lead to false detection of memory. We take advantage of a technique known as multifractal detrended fluctuation analysis (MF-DFA), which can systematically eliminate trends of different orders. This method is based on the identification of the scaling of the q-th-order moments and is a generalisation of the standard detrended fluctuation analysis (DFA), which uses only the second moment, that is, q = 2. We also consider rescaled range (R/S) analysis and the periodogram method to detect memory in financial time series and compare their results with those of the MF-DFA. An interesting finding is that short memory is detected for stock prices of the American Stock Exchange (AMEX), while long memory is found in the time series of two exchange rates, namely the French franc and the Deutsche mark. Electricity price series of the five states of Australia are also found to possess long memory; for these electricity price series, heavy tails are also pronounced in their probability densities. The second part of the thesis develops models to represent the short-memory and long-memory financial processes detected in Part I.
These models take the form of continuous-time AR(∞)-type equations whose kernel is the Laplace transform of a finite Borel measure. By imposing appropriate conditions on this measure, short memory or long memory in the dynamics of the solution will result. A specific form of the models, which has a good MA(∞)-type representation, is presented for the short-memory case. Parameter estimation for this type of model is performed via least squares, and the models are applied to the stock prices in the AMEX, which were established in Part I to possess short memory. By selecting the kernel in the continuous-time AR(∞)-type equations to have the form of a Riemann-Liouville fractional derivative, we obtain a fractional stochastic differential equation driven by Brownian motion. This type of equation is used to represent financial processes with long memory, whose dynamics are described by the fractional derivative in the equation. These models are estimated via quasi-likelihood, namely via a continuous-time version of the Gauss-Whittle method. The models are applied to the exchange rates and the electricity prices of Part I with the aim of confirming their possible long-range dependence established by MF-DFA. The third part of the thesis provides an application of the results established in Parts I and II to characterise and classify financial markets. We pay attention to the New York Stock Exchange (NYSE), the American Stock Exchange (AMEX), the NASDAQ Stock Exchange (NASDAQ) and the Toronto Stock Exchange (TSX). The parameters from MF-DFA and those of the short-memory AR(∞)-type models are employed in this classification. We propose the Fisher discriminant algorithm to find a classifier in the two- and three-dimensional spaces of data sets, and then provide cross-validation to verify discriminant accuracies. This classification is useful for understanding and predicting the behaviour of different processes within the same market.
The fourth part of the thesis investigates the heavy-tailed behaviour of financial processes which may also possess long memory. We consider fractional stochastic differential equations driven by stable noise to model financial processes such as electricity prices. The long memory of electricity prices is represented by a fractional derivative, while the stable noise input models their non-Gaussianity via the tails of their probability density. A method using the empirical densities and MF-DFA is provided to estimate all the parameters of the model and simulate sample paths of the equation. The method is then applied to analyse daily spot prices for five states of Australia. Comparisons with the results obtained from R/S analysis, the periodogram method and MF-DFA are provided. The results from fractional SDEs agree with those from MF-DFA, which are based on multifractal scaling, while those from the periodograms, which are based on the second-order moment, seem to underestimate the long-memory dynamics of the process. This highlights the need for, and usefulness of, fractal methods in modelling non-Gaussian financial processes with long memory.
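A minimal sketch of standard DFA (the q = 2 special case of the MF-DFA used in the thesis) applied to synthetic white noise; the scale choices and the test series are illustrative assumptions, not data from the thesis:

```python
import numpy as np

def dfa_exponent(x, scales):
    """Standard detrended fluctuation analysis (q = 2).

    Returns the scaling exponent alpha: roughly 0.5 for uncorrelated
    noise, above 0.5 for persistent long-memory series.
    """
    y = np.cumsum(x - np.mean(x))              # integrated profile
    fluct = []
    for n in scales:
        n_seg = len(y) // n
        segments = y[:n_seg * n].reshape(n_seg, n)
        t = np.arange(n)
        mse = []
        for seg in segments:
            coeffs = np.polyfit(t, seg, 1)     # remove linear trend per window
            mse.append(np.mean((seg - np.polyval(coeffs, t)) ** 2))
        fluct.append(np.sqrt(np.mean(mse)))    # fluctuation function F(n)
    # alpha is the slope of log F(n) against log n
    alpha, _ = np.polyfit(np.log(scales), np.log(fluct), 1)
    return alpha

rng = np.random.default_rng(0)
alpha = dfa_exponent(rng.standard_normal(10_000), scales=[16, 32, 64, 128, 256])
```

The full MF-DFA generalises the second-moment average above to q-th-order moments and fits one exponent per q, which is what allows trends of different orders to be eliminated and multifractality to be detected.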

Relevance:

10.00%

Publisher:

Abstract:

Communication is one team process factor that has received considerable research attention in the team literature. This literature provides equivocal evidence regarding the role of communication in team performance and, moreover, does not indicate when communication becomes important for team performance. This research program sought to address this evidence gap by a) testing task complexity and team member diversity (race diversity, gender diversity and work value diversity) as moderators of the team communication-performance relationship; and b) testing a team communication-performance model using established teams across two different task types. The functional perspective was used as the theoretical framework for operationalizing team communication activity. The research program utilised a quasi-experimental research design with participants from a large multi-national information technology company whose Head Office was based in Sydney, Australia. Participants voluntarily completed two team-building exercises (a decision-making task and a production task) and completed two online questionnaires. In total, data were collected from 1039 individuals who constituted 203 work teams. Analysis of the data revealed a small number of significant moderation effects, not all in the expected direction. However, an interesting and unexpected finding also emerged from Study One: large and significant correlations between communication activity ratings were found across tasks, but not within tasks. This finding suggested that teams were displaying very similar profiles of communication on each task, despite the tasks having different communication requirements.
Given this finding, Study Two sought to a) determine the relative importance of task versus team effects in explaining variance in team communication measures for established teams; b) determine whether established teams had reliable and discernible communication profiles; and, if so, c) investigate whether team communication profiles related to task performance. Multi-level modeling and repeated-measures analysis of variance (ANOVA) revealed that task type did not have an effect on team communication ratings. However, teams accounted for 24% of the total variance in communication measures. Through cluster analysis, five reliable and distinct team communication profiles were identified. Consistent with the findings of the multi-level analysis and repeated-measures ANOVA, teams' profiles were virtually identical across the decision-making and production tasks. A relationship between communication profile and performance was identified for the production task, although not for the decision-making task. This research responds to calls in the literature for a better understanding of when communication becomes important for team performance. The moderators tested in this research were not found to have a substantive or reliable effect on the relationship between communication and performance. However, the consistency in team communication activity suggests that established teams can be characterized by their communication profiles and, further, that these communication profiles may have implications for team performance. The findings of this research provide theoretical support for the functional perspective regarding the communication-performance relationship and further support the team development literature as an explanation for the stability in team communication profiles. This research can also assist organizations to better understand the specific types of communication activity and profiles of communication that could offer teams a performance advantage.

Relevance:

10.00%

Publisher:

Abstract:

Low back pain is an increasing problem in industrialised countries and, although it is a major socio-economic problem in terms of medical costs and lost productivity, relatively little is known about the processes underlying the development of the condition. This is in part due to the complex interactions between bone, muscle, nerves and other soft tissues of the spine, and the fact that direct observation and/or measurement of the human spine is not possible using non-invasive techniques. Biomechanical models have been used extensively to estimate the forces and moments experienced by the spine. These models provide a means of estimating internal parameters which cannot be measured directly. However, application of most of the models currently available is restricted to tasks resembling those for which the model was designed, due to the simplified representation of the anatomy. The aim of this research was to develop a biomechanical model to investigate the changes in forces and moments which are induced by muscle injury. In order to accurately simulate muscle injuries, a detailed quasi-static, three-dimensional model representing the anatomy of the lumbar spine was developed. This model includes the nine major force-generating muscles of the region (erector spinae, comprising the longissimus thoracis and iliocostalis lumborum; multifidus; quadratus lumborum; latissimus dorsi; transverse abdominis; internal oblique and external oblique), as well as the thoracolumbar fascia, through which the transverse abdominis and parts of the internal oblique and latissimus dorsi muscles attach to the spine. The muscles included in the model have been represented using 170 muscle fascicles, each having their own force-generating characteristics and lines of action. Particular attention has been paid to ensuring the muscle lines of action are anatomically realistic, particularly for muscles which have broad attachments (e.g.
internal and external obliques), muscles which attach to the spine via the thoracolumbar fascia (e.g. transverse abdominis), and muscles whose paths are altered by bony constraints such as the rib cage (e.g. iliocostalis lumborum pars thoracis and parts of the longissimus thoracis pars thoracis). To this end, a separate sub-model, which accounts for the shape of the torso by modelling it as a series of ellipses, has been developed to model the lines of action of the oblique muscles. Likewise, a separate sub-model of the thoracolumbar fascia has also been developed which accounts for the middle and posterior layers of the fascia, and ensures that the line of action of the posterior layer is related to the size and shape of the erector spinae muscle. Published muscle activation data are used to enable the model to predict the maximum forces and moments that may be generated by the muscles. These predictions are validated against published experimental studies reporting maximum isometric moments for a variety of exertions. The model performs well for flexion, extension and lateral bend exertions, but underpredicts the axial twist moments that may be developed. This discrepancy is most likely the result of differences between the experimental methodology and the modelled task. The application of the model is illustrated using examples of muscle injuries created by surgical procedures. The three examples used represent a posterior surgical approach to the spine, an anterior approach to the spine, and unilateral total hip replacement surgery. Although the three examples simulate different muscle injuries, all demonstrate the production of significant asymmetrical moments and/or reduced joint compression following surgical intervention. This result has implications for patient rehabilitation and the potential for further injury to the spine. The development and application of the model has highlighted a number of areas where current knowledge is deficient.
These include muscle activation levels for tasks in postures other than upright standing, changes in spinal kinematics following surgical procedures such as spinal fusion or fixation, and a general lack of understanding of how the body adjusts to muscle injuries with respect to muscle activation patterns and levels, rate of recovery from temporary injuries and compensatory actions by other muscles. Thus the comprehensive and innovative anatomical model which has been developed not only provides a tool to predict the forces and moments experienced by the intervertebral joints of the spine, but also highlights areas where further clinical research is required.
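The torso-as-ellipses idea behind the oblique-muscle sub-model can be sketched as follows; the semi-axes, attachment angles and vertical offsets below are hypothetical values for illustration only, not the anatomical data used in the thesis:

```python
import math

def ellipse_point(a, b, theta, z):
    """Attachment point on a torso cross-section modelled as an ellipse,
    with semi-axes a (lateral) and b (anterior-posterior), at height z."""
    return (a * math.cos(theta), b * math.sin(theta), z)

# Hypothetical elliptical cross-sections at two vertebral levels (cm):
# the torso narrows slightly from the lower to the upper level
lower = ellipse_point(a=15.0, b=10.0, theta=math.radians(30), z=0.0)
upper = ellipse_point(a=14.0, b=9.0, theta=math.radians(45), z=4.0)

# Straight-line length of a fascicle spanning the two levels; its
# direction gives the fascicle's line of action in the model
length = math.dist(lower, upper)
```

Constraining attachment points to lie on level-specific ellipses is what keeps broad-attachment muscles such as the obliques following the curved torso surface rather than cutting straight through it.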

Relevance:

10.00%

Publisher:

Abstract:

Industrial applications of simulated-moving-bed (SMB) chromatographic technology have brought an emerging demand to improve SMB process operation for higher efficiency and better robustness. Improved process modelling and more efficient model computation will pave the way to meet this demand. However, the SMB unit operation exhibits complex dynamics, leading to challenges in SMB process modelling and model computation. One of the significant problems is how to quickly obtain the steady state of an SMB process model, as process metrics at the steady state are critical for process design and real-time control. The conventional computation method, which solves the process model cycle by cycle and takes the solution only when a cyclic steady state is reached after a certain number of switchings, is computationally expensive. Adopting the concept of the quasi-envelope (QE), this work treats the SMB operation as a pseudo-oscillatory process because of its large number of continuous switchings. An innovative QE computation scheme is then developed to quickly obtain the steady-state solution of an SMB model from any arbitrary initial condition. The QE computation scheme allows larger steps to be taken for predicting the slow change of the starting state within each switching. Combined with a wavelet-based technique, this scheme is demonstrated to be effective and efficient for an SMB sugar separation process. Moreover, investigations are also carried out on when the computation scheme should be activated and how the convergence of the scheme is affected by a variable step size.
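The quasi-envelope idea can be illustrated with a toy contraction map standing in for one switching period; the map, tolerance and envelope step size below are illustrative assumptions, not the SMB model itself:

```python
def switch_map(x):
    # Toy stand-in for one SMB switching period: a contraction toward
    # the (unknown) cyclic steady state x* = 1.0
    return 0.95 * x + 0.05

TOL = 1e-8

# Conventional approach: simulate switch after switch until the
# state stops changing (cyclic steady state reached)
x, n_plain = 0.0, 0
while abs(switch_map(x) - x) > TOL:
    x = switch_map(x)
    n_plain += 1

# Quasi-envelope approach: treat the per-switch change as the slope of
# a slowly varying envelope and step h switches at a time
z, h, n_qe = 0.0, 10, 0
while abs(switch_map(z) - z) > TOL:
    z = z + h * (switch_map(z) - z)
    n_qe += 1
```

For this map the QE iteration contracts by a factor |1 - 0.05h| per step, so it reaches the same fixed point in far fewer evaluations than the switch-by-switch simulation, but h cannot be made arbitrarily large (here it must stay below 40), mirroring the step-size/convergence trade-off investigated in the paper.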

Relevance:

10.00%

Publisher:

Abstract:

In situ near-IR transmittance measurements have been used to characterize the density of trapped electrons in dye-sensitized solar cells (DSCs). Measurements have been made under a range of experimental conditions, including during open-circuit photovoltage decay and during recording of the current-voltage (IV) characteristic. The optical cross section of electrons at 940 nm was determined by relating the IR absorbance to the density of trapped electrons measured by charge extraction. The value, σn = 5.4 × 10⁻¹⁸ cm², was used to compare the trapped electron densities in illuminated DSCs under open- and short-circuit conditions in order to quantify the difference in the quasi-Fermi level, nEF. It was found that the difference in nEF for the cells studied was 250 meV over a wide range of illumination intensities. IR transmittance measurements have also been used to quantify shifts in conduction band energy associated with dye adsorption.
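Given the reported cross section, a trapped-electron density can be recovered from a measured IR absorbance via the Beer-Lambert relation; the absorbance and film thickness below are illustrative assumptions, not values from the study:

```python
import math

sigma = 5.4e-18        # cm^2, optical cross section at 940 nm (from the study)

# Illustrative assumed measurement values:
absorbance = 0.05      # base-10 absorbance of the film at 940 nm
thickness = 10e-4      # film thickness: 10 um expressed in cm

# Beer-Lambert with a base-10 absorbance A: A * ln(10) = sigma * n * d,
# so the trapped electron density is
n_trapped = absorbance * math.log(10) / (sigma * thickness)  # per cm^3
```

The ln(10) factor converts the base-10 absorbance to the natural-log optical depth that the cross section is defined against; with these assumed inputs the density comes out on the order of 10¹⁹ cm⁻³.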

Relevance:

10.00%

Publisher:

Abstract:

This study sought to establish and develop innovative instructional procedures, in which scaffolding can be expanded and applied, in order to enhance learning of English as a Foreign Language (EFL) writing skills in an effective hybrid learning community (a combination of face-to-face and online modes of learning) at the university where the researcher is working. Many educational experts still believe that technology has not been harnessed to its potential to meet the new online characteristics and trends. There is also an urgency to reconsider the pedagogical perspectives involved in the utilisation of online learning systems in general, and the social interactions within online courses in particular, which have been neglected to date. An action research design, conducted in two cycles within a duration of four months, was utilised throughout this study. It was intended not only to achieve a paradigm shift from transmission-absorption to socio-constructivist teaching/learning methodologies but also to inform practice in these technology-rich environments. Five major findings emerged from the study. First, scaffolding theory has been extended: two new scaffolding types (i.e., quasi-transcendental scaffolding and transcendental scaffolding), two scaffolding aspects (i.e., receptive and productive) and several scaffolding actions (e.g., providing a stimulus, awareness, reminder, or remedy) for EFL writing skills in an effective hybrid learning community have been identified and elaborated on. Second, the EFL ‘Effective Writing’ students used the scaffolds implemented in a hybrid environment to enhance and enrich their learning of English essay writing. The online activities, conducted after the F2F sessions most of the time, gave students greater opportunities to both reinforce and expand the knowledge they had acquired in the F2F mode.
Third, a variety of teaching techniques, different online tasks and discussion topics utilised in the two modes bolstered the students’ interests and engagement in their knowledge construction of how to compose English-language essays. Fourth, through the scaffolded activities, the students learned how to scaffold themselves and thus became independent learners in their future endeavours of constructing knowledge. Fifth, the scaffolding-to-scaffold activities provided the students with knowledge on how to effectively engage in transcendental scaffolding actions and facilitate the learning of English writing skills by less able peers within the learning community. Thus, the findings of this current study extended earlier understandings of scaffolding in an EFL hybrid learning environment and will contribute to the advancement of future ICT-mediated courses in terms of their scaffolding pedagogical aspects.

Relevance:

10.00%

Publisher:

Abstract:

The sinking of the Titanic in April 1912 took the lives of 68 percent of the people aboard. Who survived? It was women and children who had a higher probability of being saved, not men. Likewise, people traveling in first class had a better chance of survival than those in second and third class. British passengers were more likely to perish than members of other nations. This extreme event represents a rare case of a well-documented life and death situation where social norms were enforced. This paper shows that economic analysis can account for human behavior in such situations.

Relevance:

10.00%

Publisher:

Abstract:

This paper seeks to identify which antecedents of power make it more or less likely for people to survive in a life-threatening situation. In particular, we look at the Titanic disaster as the life-or-death situation. Maritime disasters can be interpreted as quasi-natural experiments because every person is affected by the shock. True human nature becomes apparent in such a dangerous situation. Five antecedents of power are distinguished: physical strength, economic resources, nationality, and social and moral factors. This empirical analysis supports the notion that power is a key determinant in extreme situations of life or death.

Relevance:

10.00%

Publisher:

Abstract:

This chapter summarizes a quasi-ethnographic case study of the lives and work of nine native-speaking English language teachers who have lived and worked outside their countries of origin for extended periods. The study aimed to document the complexity of ELT as ‘work’ in new global economic and cultural conditions, and to explore how this complexity is realised in the everyday experiences of ELT teachers.

Relevance:

10.00%

Publisher:

Abstract:

In this paper, we examine the design of business process diagrams in contexts where novice analysts only have basic design tools such as paper and pencil available, and little to no understanding of formalized modeling approaches. Based on a quasi-experimental study with 89 BPM students, we identify five distinct process design archetypes, ranging from textual, through hybrid, to graphical representation forms. We also examine the quality of the designs and identify which representation formats enable an analyst to articulate business rules, states, events, activities, and temporal and geospatial information in a process model. We found that the quality of the process designs decreases with the increased use of graphics, and that hybrid designs featuring appropriate text labels and abstract graphical forms are well suited to describing business processes. Our research has implications for practical process design work in industry as well as for academic curricula on process design.