956 results for time-and-material contract


Relevance:

100.00%

Abstract:

Pratylenchus thornei is a major pathogen of wheat in Australia. Two glasshouse experiments with four wheat cultivars that had different final populations (Pf) of P. thornei in the field were used to optimise conditions for assessing resistance. With different initial populations (Pi) ranging up to 5250 P. thornei/kg soil, Pf of P. thornei increased to 16 weeks after sowing, and then decreased at 20 weeks in some cultivar × Pi combinations. The population dynamics of P. thornei up to 16 weeks were best described by a modified exponential equation $P_f(t) = a P_i e^{kt}$, where $P_f(t)$ is the final population density at time t, $P_i$ is the initial population density, a is the proportion of $P_i$ that initiates population development, and k is the intrinsic rate of increase of the population. The cultivar GS50a had very low k values at Pi of 5250 and 1050 indicating its resistance, Suneca and Potam had high k values indicating susceptibility, whereas intolerant Gatcher had a low value at the higher Pi and a high value at the lower Pi. Nitrate fertiliser increased plant growth and Pf values of susceptible cultivars, but in unplanted soil it decreased Pf. Nematicide (aldicarb 5 mg/kg soil) killed P. thornei more effectively in planted than in unplanted soil and increased plant growth particularly in the presence of N fertiliser. In both experiments, the wheat cultivars Suneca and Potam were more susceptible than the cultivar GS50a reflecting field results. The method chosen to discriminate wheat cultivars was to assess Pf after growth for 16 weeks in soil with Pi ~1050–5250 P. thornei/kg soil and fertilised with 200 mg NO3–N/kg soil.
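To make the population model concrete, the following is a minimal sketch (not from the paper) of evaluating the modified exponential equation and backing out k from hypothetical counts; the values of Pi, a, t and k below are invented for illustration only.

```python
import numpy as np

def final_population(p_i, a, k, t):
    """Modified exponential model: P_f(t) = a * P_i * exp(k * t)."""
    return a * p_i * np.exp(k * t)

def estimate_k(p_i, p_f, a, t):
    """Solve the model for the intrinsic rate of increase k."""
    return np.log(p_f / (a * p_i)) / t

# Hypothetical example: Pi = 1050 nematodes/kg soil, a = 0.5, t = 16 weeks.
p_i, a, t = 1050.0, 0.5, 16.0
for k in (0.05, 0.20):            # low k ~ resistant cultivar, high k ~ susceptible
    p_f = final_population(p_i, a, k, t)
    print(f"k = {k:.2f}: P_f(16) ≈ {p_f:.0f}, recovered k = {estimate_k(p_i, p_f, a, t):.2f}")
```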

Relevance:

100.00%

Abstract:

Background: Project archives are becoming increasingly large and complex. On construction projects in particular, the increasing amount of information and the increasing complexity of its structure make searching and exploring information in the project archive challenging and time-consuming. Methods: This research investigates a query-driven approach that represents new forms of contextual information to help users understand the set of documents resulting from queries of construction project archives. Specifically, this research extends query-driven interface research by representing three types of contextual information: (1) the temporal context is represented in the form of a timeline to show when each document was created; (2) the search-relevance context shows exactly which of the entered keywords matched each document; and (3) the usage context shows which project participants have accessed or modified a file. Results: We implemented and tested these ideas within a prototype query-driven interface we call VisArchive. VisArchive employs a combination of multi-scale and multi-dimensional timelines, color-coded stacked bar charts, additional supporting visual cues and filters to support searching and exploring historical project archives. The timeline-based interface integrates three interactive timelines as focus+context visualizations. Conclusions: The feasibility of using these visual design principles is tested in two types of project archives: searching construction project archives of an educational building project and tracking of software defects in the Mozilla Thunderbird project. These case studies demonstrate the applicability, usefulness and generality of the design principles implemented.
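A minimal sketch of the kind of per-document context the abstract describes (temporal, search-relevance, and usage context); the data model, field names, and query function here are hypothetical illustrations, not the actual VisArchive implementation.

```python
from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class ArchivedDocument:
    name: str
    created: datetime                                   # temporal context
    text: str
    access_log: list = field(default_factory=list)      # usage context: (user, action) pairs

def query(archive, keywords):
    """Return each matching document together with its three kinds of context."""
    results = []
    for doc in archive:
        matched = [kw for kw in keywords if kw.lower() in doc.text.lower()]
        if matched:                                      # search-relevance context
            results.append({
                "document": doc.name,
                "created": doc.created,                  # plotted on a timeline
                "matched_keywords": matched,             # e.g. a color-coded bar per keyword
                "accessed_by": sorted({user for user, _ in doc.access_log}),
            })
    return sorted(results, key=lambda r: r["created"])

# Hypothetical usage
archive = [
    ArchivedDocument("RFI-042.pdf", datetime(2011, 3, 2),
                     "Request for information on concrete slab tolerances",
                     [("architect", "modified"), ("contractor", "viewed")]),
]
print(query(archive, ["concrete", "schedule"]))
```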

Relevance:

100.00%

Abstract:

Background: A paradigm shift in educational policy to create problem solvers and critical thinkers produced the games concept approach (GCA) in Singapore's Revised Syllabus for Physical Education (1999). A pilot study (2001) conducted on 11 primary school student teachers (STs) using this approach identified time management and questioning as two of the major challenges faced by novice teachers. Purpose: To examine the GCA from three perspectives: structure (lesson form in terms of teacher-time and pupil-time); product (how STs used those time fractions); and process (the nature of their questioning: type, timing, and target). Participants and setting: Forty-nine STs from three different PETE cohorts (two-year diploma, four-year degree, two-year post-graduate diploma) volunteered to participate in the study conducted during the penultimate week of their final practicum in public primary and secondary schools. Intervention: Based on the findings of the pilot study, PETE increased the emphasis on GCA content-specific knowledge and pedagogical procedures. To further support STs in learning to actualise the GCA, authentic micro-teaching experiences that were closely monitored by faculty were provided in schools nearby. Research design: This is a descriptive study of time-management and questioning strategies implemented by STs on practicum. Each lesson was segmented into a number of sub-categories of teacher-time (organisation, demonstration and closure) and pupil-time (practice time and game time). Questions were categorised as knowledge, technical, tactical or affective. Data collection: Each ST was video-taped teaching a GCA lesson towards the end of their final practicum. The STs individually determined the timing of the data collection and the lesson to be observed. Data analysis: Each lesson was segmented into a number of sub-categories of both teacher- and pupil-time. Duration recording using Noldus software (Observer 4.0) segmented the time management of different lesson components. Questioning was coded in terms of type, timing and target. Separate MANOVAs were used to measure the difference between programmes and levels (primary and secondary) in relation to time-management procedures and questioning strategies. Findings: No differences emerged between the programmes or levels in their time-management or questioning strategies. Using the GCA, STs generated more pupil time (53%) than teacher time (47%). STs at the primary level provided more technical practice, and those in secondary schools more small-sided game play. Most questions (58%) were asked during play or practice but were predominantly low-order, involving knowledge or recall (76%), and only 6.7% were open-ended or divergent and capable of developing tactical awareness. Conclusions: Although STs are delivering more pupil time (practice and game) than teacher-time, the lesson structure requires further fine-tuning to extend the practice task beyond technical drills. Many questions are being asked to generate knowledge about games but lack sufficient quality to enhance critical thinking and tactical awareness, as the GCA intends.

Relevance:

100.00%

Abstract:

We conducted two studies to improve our understanding of why and when older workers are focused on learning. Based on socioemotional selectivity theory, which proposes that goal focus changes with age and the perception of time, we hypothesized and found that older workers perceive their remaining time at work as more limited than younger workers do, which, in turn, is associated with lower learning goal orientation and a less positive attitude toward learning and development. Furthermore, we hypothesized and found that high work centrality buffers the negative association between age and perceived remaining time, and thus the indirect negative effects of age on learning goal orientation and attitude toward learning and development (through perceived remaining time). These findings suggest that scholars and practitioners should take workers’ perceived remaining time and work centrality into account when examining or stimulating learning activities among aging workers.
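The moderated-mediation structure described above (age to perceived remaining time to learning outcomes, with work centrality buffering the first path) can be sketched on synthetic data as below; the variable names, effect sizes, and the simple two-regression estimate of the conditional indirect effect are illustrative assumptions, not the authors' analysis.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 500
age = rng.uniform(20, 65, n)
centrality = rng.normal(0, 1, n)                       # work centrality (moderator)
# a-path: older age -> shorter perceived remaining time, buffered by centrality
remaining_time = 10 - 0.10 * age + 0.05 * age * centrality + rng.normal(0, 1, n)
# b-path: longer perceived remaining time -> higher learning goal orientation
learning_goal = 2 + 0.40 * remaining_time + rng.normal(0, 1, n)

df = pd.DataFrame(dict(age=age, centrality=centrality,
                       remaining_time=remaining_time, learning_goal=learning_goal))

a_path = smf.ols("remaining_time ~ age * centrality", data=df).fit()
b_path = smf.ols("learning_goal ~ remaining_time + age", data=df).fit()

# Conditional indirect effect of age at low vs. high work centrality
for w in (-1, 1):
    a = a_path.params["age"] + a_path.params["age:centrality"] * w
    print(f"centrality={w:+d}: indirect effect of age ≈ {a * b_path.params['remaining_time']:.3f}")
```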

Relevance:

100.00%

Abstract:

Arguments arising from quantum mechanics and gravitation theory, as well as from string theory, indicate that the description of space-time as a continuous manifold is not adequate at very short distances. An important candidate for the description of space-time at such scales is provided by noncommutative space-time, where the coordinates are promoted to noncommuting operators. Thus, the study of quantum field theory in noncommutative space-time provides an interesting interface where ordinary field theoretic tools can be used to study the properties of quantum space-time. The three original publications in this thesis encompass various aspects in the still developing area of noncommutative quantum field theory, ranging from fundamental concepts to model building. One of the key features of noncommutative space-time is the apparent loss of Lorentz invariance, which has been addressed in different ways in the literature. One recently developed approach is to eliminate the Lorentz violating effects by integrating over the parameter of noncommutativity. Fundamental properties of such theories are investigated in this thesis. Another issue addressed is model building, which is difficult in the noncommutative setting due to severe restrictions on the possible gauge symmetries imposed by the noncommutativity of the space-time. Possible ways to relieve these restrictions are investigated and applied, and a noncommutative version of the Minimal Supersymmetric Standard Model is presented. While putting the results obtained in the three original publications into their proper context, the introductory part of this thesis aims to provide an overview of the present situation in the field.
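For reference, the standard way coordinates are "promoted to noncommuting operators" in this literature is via a constant antisymmetric parameter θ, with products of fields deformed into the Moyal star product; the formulas below are the textbook definitions, not results specific to the publications summarised here.

```latex
% Noncommuting coordinates with a constant antisymmetric parameter theta
[\hat{x}^{\mu}, \hat{x}^{\nu}] = i\,\theta^{\mu\nu}

% Equivalent field-theory formulation: ordinary functions multiplied
% with the Moyal star product
(f \star g)(x) = f(x)\,
  \exp\!\left(\tfrac{i}{2}\,
  \overleftarrow{\partial}_{\mu}\,\theta^{\mu\nu}\,
  \overrightarrow{\partial}_{\nu}\right) g(x)
```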

Relevance:

100.00%

Abstract:

We consider a scenario in which a wireless sensor network is formed by randomly deploying n sensors to measure some spatial function over a field, with the objective of computing a function of the measurements and communicating it to an operator station. We restrict ourselves to the class of type-threshold functions (as defined in the work of Giridhar and Kumar, 2005), of which max, min, and indicator functions are important examples; our discussions are couched in terms of the max function. We view the problem as one of message-passing distributed computation over a geometric random graph. The network is assumed to be synchronous, and the sensors synchronously measure values and then collaborate to compute and deliver the function computed with these values to the operator station. Computation algorithms differ in (1) the communication topology assumed and (2) the messages that the nodes need to exchange in order to carry out the computation. The focus of our paper is to establish (in probability) scaling laws for the time and energy complexity of the distributed function computation over random wireless networks, under the assumption of centralized contention-free scheduling of packet transmissions. First, without any constraint on the computation algorithm, we establish scaling laws for the computation time and energy expenditure for one-time maximum computation. We show that for an optimal algorithm, the computation time and energy expenditure scale, respectively, as $\Theta(\sqrt{n/\log n})$ and $\Theta(n)$ asymptotically as the number of sensors $n \to \infty$. Second, we analyze the performance of three specific computation algorithms that may be used in specific practical situations, namely, the tree algorithm, multihop transmission, and the Ripple algorithm (a type of gossip algorithm), and obtain scaling laws for the computation time and energy expenditure as $n \to \infty$. In particular, we show that the computation time for these algorithms scales as $\Theta(\sqrt{n/\log n})$, $\Theta(n)$, and $\Theta(\sqrt{n \log n})$, respectively, whereas the energy expended scales as $\Theta(n)$, $\Theta(\sqrt{n/\log n})$, and $\Theta(\sqrt{n \log n})$, respectively. Finally, simulation results are provided to show that our analysis indeed captures the correct scaling. The simulations also yield estimates of the constant multipliers in the scaling laws. Our analyses throughout assume a centralized optimal scheduler, and hence, our results can be viewed as providing bounds for the performance with practical distributed schedulers.
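A minimal sketch of the tree algorithm for one-shot max computation over a random geometric graph (using networkx); the connectivity radius, the single-root BFS tree, and the leaf-to-root aggregation order are simplified assumptions for illustration and do not reproduce the paper's scheduling model or its scaling constants.

```python
import math
import random
import networkx as nx

n = 200
# A radius of order sqrt(log n / n) keeps a random geometric graph connected w.h.p.
g = nx.random_geometric_graph(n, 2 * math.sqrt(math.log(n) / n), seed=1)
if not nx.is_connected(g):                        # fall back to the giant component
    g = g.subgraph(max(nx.connected_components(g), key=len)).copy()

values = {v: random.random() for v in g.nodes}    # synchronous sensor measurements
root = min(g.nodes)                               # stands in for the operator station

# Tree algorithm: build a BFS tree rooted at the operator station and
# aggregate the max from the leaves toward the root, one hop per level.
tree = nx.bfs_tree(g, root)
partial = dict(values)
for node in reversed(list(nx.topological_sort(tree))):   # children before parents
    for child in tree.successors(node):
        partial[node] = max(partial[node], partial[child])

print("computed max:", partial[root], "  true max:", max(values.values()))
```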

Relevance:

100.00%

Abstract:

A laminated composite plate model based on first-order shear deformation theory is implemented using the finite element method. Matrix cracks are introduced into the finite element model by considering changes in the A, B and D matrices of composites. The effects of different boundary conditions, laminate types and ply angles on the behavior of composite plates with matrix cracks are studied. Finally, the effect of material property uncertainty, which is important for composite materials, on the composite plate is investigated using Monte Carlo simulations. Probabilistic estimates of damage detection reliability in composite plates are made for static and dynamic measurements. It is found that the effect of uncertainty must be considered for accurate damage detection in composite structures. The estimates of variance obtained for observable system properties due to uncertainty can be used for developing more robust damage detection algorithms.
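A minimal sketch of two ingredients mentioned above: the in-plane stiffness (A) matrix of classical lamination theory for a simple cross-ply laminate, and a Monte Carlo loop over an uncertain ply modulus. The material values, the 5% coefficient of variation, and the restriction to 0/90 plies are illustrative assumptions; no cracking model or finite element analysis is included.

```python
import numpy as np

def q_matrix(e1, e2, g12, nu12):
    """Reduced stiffness matrix of a unidirectional ply (plane stress)."""
    nu21 = nu12 * e2 / e1
    d = 1.0 - nu12 * nu21
    return np.array([[e1 / d,        nu12 * e2 / d, 0.0],
                     [nu12 * e2 / d, e2 / d,        0.0],
                     [0.0,           0.0,           g12]])

def a_matrix(q, angles_deg, t_ply):
    """In-plane stiffness A = sum_k Qbar_k * t_k. For 0/90 plies the transformed
    matrix simply swaps the 11 and 22 entries of Q for a 90-degree ply."""
    a = np.zeros((3, 3))
    for ang in angles_deg:
        qbar = q if ang == 0 else q[[1, 0, 2]][:, [1, 0, 2]]
        a += qbar * t_ply
    return a

# Nominal graphite/epoxy-like ply properties (GPa) and ply thickness (m) -- illustrative
e1, e2, g12, nu12, t_ply = 140.0, 10.0, 5.0, 0.3, 0.125e-3
layup = [0, 90, 90, 0]                              # symmetric cross-ply

rng = np.random.default_rng(0)
samples = [a_matrix(q_matrix(rng.normal(e1, 0.05 * e1), e2, g12, nu12),
                    layup, t_ply)[0, 0]
           for _ in range(2000)]                    # 5% scatter on E1 only
print("A11 mean:", np.mean(samples), " A11 std:", np.std(samples))
```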

Relevance:

100.00%

Abstract:

Since the emergence of service marketing, the focus of service research has evolved. Currently, the focus of research is shifting towards value co-created by the customer. Consequently, value creation is increasingly less fixed to a specific time or location controlled by the service provider. However, present service management models, although acknowledging customer participation and accessibility, have not considered the role of the empowered customer who may perform the service at various locations and time frames. The present study expands this scope and provides a framework for exploring customer perceived value from a temporal and spatial perspective. The framework is used to understand and analyse customer perceived value and to explore customer value profiles. It is proposed that customer perceived value can be conceptualised as a function of technical, functional, temporal and spatial value dimensions. These dimensions are suggested to have value-increasing and value-decreasing facets. This conceptualisation is empirically explored in an online banking context and it is shown that time and location are more important value dimensions relative to the technical and functional dimensions. The findings demonstrate that time and location are important not only in terms of having the possibility to choose when and where the service is performed. Customers also value an efficient and optimised use of time and a private and customised service location. The study demonstrates that time and location are not external elements that form the service context, but service value dimensions, in addition to the technical and functional dimensions. This thesis contributes to existing service management research through its framework for understanding temporal and spatial dimensions of perceived value. Practical implications of the study are that time and location need to be considered as service design elements in order to differentiate the service from other services and create additional value for customers. Also, because of increased customer control and the importance of time and location, it is increasingly relevant for service providers to provide a facilitating arena for customers to create value, rather than trying to control the value creation process. Kristina Heinonen is associated with CERS, the Center for Relationship Marketing and Service Management at the Swedish School of Economics and Business Administration.

Relevance:

100.00%

Abstract:

This paper investigates the persistent pattern in the Helsinki Exchanges. The persistent pattern is analyzed using a time and a price approach. It is hypothesized that arrival times are related to movements in prices. Thus, the arrival times are defined as durations and formulated as an Autoregressive Conditional Duration (ACD) model as in Engle and Russell (1998). The prices are defined as price changes and formulated as a GARCH process including duration measures. The research question follows from market microstructure predictions about price intensities defined as the time between price changes. The microstructure theory states that long transaction durations might be associated with both no news and bad news. Accordingly, short durations would be related to high volatility and long durations to low volatility. As a result, the spread will tend to be larger during intensive trading periods. The main findings of this study are (1) arrival times are positively autocorrelated and (2) long durations are associated with low volatility in the market.
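A minimal sketch of the ACD(1,1) recursion of Engle and Russell (1998) referred to above, simulated with unit-exponential innovations; the parameter values are illustrative and are not estimates from the Helsinki Exchanges data.

```python
import numpy as np

def simulate_acd(n, omega=0.1, alpha=0.1, beta=0.8, seed=0):
    """ACD(1,1): x_i = psi_i * eps_i,  psi_i = omega + alpha*x_{i-1} + beta*psi_{i-1},
    with i.i.d. unit-exponential innovations eps_i."""
    rng = np.random.default_rng(seed)
    psi = np.empty(n)
    x = np.empty(n)
    psi[0] = omega / (1.0 - alpha - beta)        # unconditional mean duration
    x[0] = psi[0] * rng.exponential()
    for i in range(1, n):
        psi[i] = omega + alpha * x[i - 1] + beta * psi[i - 1]
        x[i] = psi[i] * rng.exponential()
    return x

durations = simulate_acd(10_000)
# Positive autocorrelation of durations (duration clustering), as reported above
print("lag-1 autocorrelation:", np.corrcoef(durations[:-1], durations[1:])[0, 1])
```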

Relevance:

100.00%

Abstract:

In this thesis, the possibility of extending the Quantization Condition of Dirac for Magnetic Monopoles to noncommutative space-time is investigated. The three publications that this thesis is based on are all directly linked to this investigation. Noncommutative solitons have been found within certain noncommutative field theories, but it is not known whether they possess only topological charge or also magnetic charge. This is a consequence of the fact that the noncommutative topological charge need not coincide with the noncommutative magnetic charge, although the two are equivalent in the commutative context. The aim of this work is to begin to fill this gap in knowledge. The method of investigation is perturbative and leaves open the question of whether a nonperturbative source for the magnetic monopole can be constructed, although some aspects of such a generalization are indicated. The main result is that while the noncommutative Aharonov-Bohm effect can be formulated in a gauge invariant way, the quantization condition of Dirac is not satisfied in the case of a perturbative source for the point-like magnetic monopole.
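For context, the commutative Dirac quantization condition being generalised here relates the electric charge e to the magnetic charge g; the forms below are the textbook statement (the normalisation depends on the unit convention) and are independent of the noncommutative analysis summarised above.

```latex
% Dirac quantization condition in the commutative setting
% (Gaussian units; n an integer)
\frac{e\,g}{\hbar c} = \frac{n}{2}, \qquad n \in \mathbb{Z}

% Equivalent statement in Heaviside-Lorentz units with hbar = c = 1
e\,g = 2\pi n
```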

Relevance:

100.00%

Abstract:

In this thesis the current status and some open problems of noncommutative quantum field theory are reviewed. The introduction aims to put these theories in their proper context as a part of the larger program to model the properties of quantized space-time. Throughout the thesis, special focus is put on the role of noncommutative time and how its nonlocal nature presents us with problems. Applications in scalar field theories as well as in gauge field theories are presented. The infinite nonlocality of space-time introduced by the noncommutative coordinate operators leads to interesting structure and new physics. High energy and low energy scales are mixed, causality and unitarity are threatened and in gauge theory the tools for model building are drastically reduced. As a case study in noncommutative gauge theory, the Dirac quantization condition of magnetic monopoles is examined with the conclusion that, at least in perturbation theory, it cannot be fulfilled in noncommutative space.
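For reference, the mixing of high- and low-energy scales mentioned above is already visible in the Moyal star product of two plane waves, where a momentum-dependent phase couples the two momenta; this is a textbook identity for the star product, not a result of this thesis.

```latex
% Star product of plane waves: the momentum-dependent phase that couples
% UV and IR scales (the source of UV/IR mixing in loop diagrams)
e^{i p \cdot x} \star e^{i q \cdot x}
  = e^{-\frac{i}{2}\, p_{\mu}\,\theta^{\mu\nu} q_{\nu}}\; e^{i (p+q)\cdot x}
```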

Relevance:

100.00%

Abstract:

Drug-drug interactions may cause serious, even fatal, clinical consequences. Therefore, it is important to examine the interaction potential of new chemical entities early in drug development. Mechanism-based inhibition is a pharmacokinetic interaction type, which causes irreversible loss of enzyme activity and can therefore lead to unusually profound and long-lasting consequences. The in vitro–in vivo extrapolation (IVIVE) of drug-drug interactions caused by mechanism-based inhibition is challenging. Consequently, many of these interactions have remained unrecognised for many years. The concomitant use of the fibrate-class lipid-lowering agent gemfibrozil markedly increases the concentrations and effects of some drugs. Even fatal cases of rhabdomyolysis have occurred in patients taking gemfibrozil and cerivastatin concomitantly. One of the main mechanisms behind this effect is the mechanism-based inhibition of the cytochrome P450 (CYP) 2C8 enzyme by a glucuronide metabolite of gemfibrozil, leading to increased cerivastatin concentrations. Although the clinical use of gemfibrozil has clearly decreased during recent years, gemfibrozil is still needed in some special cases. To enable the safe use of gemfibrozil concomitantly with other drugs, information concerning the time and dose relationships of CYP2C8 inhibition by gemfibrozil is needed. This work was carried out as four in vivo clinical drug-drug interaction studies to examine the time and dose relationships of the mechanism-based inhibitory effect of gemfibrozil on CYP2C8. The oral antidiabetic drug repaglinide was used as a probe drug for measuring CYP2C8 activity in healthy volunteers. In this work, mechanism-based inhibition of the CYP2C8 enzyme by gemfibrozil was found to occur rapidly in humans. The inhibitory effect was already maximal when repaglinide was given 1-3 h after gemfibrozil intake. In addition, the inhibition was shown to abate slowly. A full recovery of CYP2C8 activity, as measured by repaglinide metabolism, was achieved 96 h after cessation of gemfibrozil treatment. The dose-dependency of the mechanism-based inhibition of CYP2C8 by gemfibrozil was shown for the first time in this work. CYP2C8 activity was halved by a single 30 mg dose of gemfibrozil or by twice-daily administration of less than 30 mg of gemfibrozil. Furthermore, CYP2C8 activity was decreased by over 90% by a single dose of 900 mg gemfibrozil or twice-daily dosing of approximately 100 mg gemfibrozil. In addition, with the application of physiological models to the data obtained in the dose-dependency studies, the major role of mechanism-based inhibition of CYP2C8 in the interaction between gemfibrozil and repaglinide was confirmed. The results of this work support the proper use of gemfibrozil and the safety of patients. The information on the time-dependency of CYP2C8 inhibition by gemfibrozil may also give new insights for improving the IVIVE of drug-drug interactions of new chemical entities. The information obtained in this work may also be utilised in the design of clinical drug-drug interaction studies in the future.
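As a back-of-envelope illustration of the dose-dependency figures quoted above (roughly 50% loss of CYP2C8 activity after a single 30 mg dose and over 90% after 900 mg), the sketch below anchors a simple Hill-type dose-response on those two points; this is an assumption made purely for illustration and is not the physiological model used in the thesis.

```python
import math

# Anchor points taken from the abstract: ~50% inhibition at 30 mg, ~90% at 900 mg.
ID50 = 30.0                                   # mg, single dose giving 50% inhibition
h = math.log(9) / math.log(900 / 30)          # slope so that a 900 mg dose gives 90%

def inhibition(dose_mg):
    """Illustrative fraction of CYP2C8 activity lost after a single gemfibrozil dose."""
    return dose_mg ** h / (dose_mg ** h + ID50 ** h)

for dose in (10, 30, 100, 300, 600, 900):
    print(f"{dose:>4} mg  ->  {inhibition(dose):.0%} inhibition")
```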

Relevance:

100.00%

Abstract:

Background: Social and material deprivation is associated with poor health, decreased subjective well-being, and limited opportunities for personal development. To date, little is known about the lived experiences of Finnish low-income youths, and the general purpose of this study is to fill this gap. Despite the extensive research on socioeconomic income disparities, only a few scholars have addressed the question of how low socioeconomic position is experienced by disadvantaged people themselves. Little is known about the everyday social processes that lead to decreased well-being of economically and socially disadvantaged citizens. Data: The study is based on data from 65 autobiographical essays written by Finnish low-income youths aged 14-29 (M=23.51, SD=3.95). The research data were originally collected in a Finnish nationwide writing contest, “Arkipäivän kokemuksia köyhyydestä” [Everyday Experiences of Poverty], between June and September of 2006. A total of 850 Finnish writers took part in the contest. Methods and key concepts: Autobiographical narratives (N=65) of low-income youths were analyzed based on grounded theory methodology (GTM). The analysis was not built on specific pre-conceived categorizations; it was guided by the paradigm model and so-called “sensitizing concepts”. The concepts this study utilized were based on the research literature on socioeconomic inequalities, resilience, and coping. Socioeconomic inequalities refer to the unequal distribution of resources, such as income, social status, and health, between social groups. The concept of resilience refers to an individual’s capacity to cope despite existing risk factors and conditions that are harmful to health and well-being. Coping strategies can be understood as ways by which a person tries to cope with psychological stress in a situation where internal or external demands exceed one’s resources. The ways to cope are cognitive or behavioral efforts by which an individual tries to relieve the stress and gain new resources. Lack of material and social resources is associated with increased exposure to health-related stressors during the life-course. Aims: The first aim of this study is to illustrate how youths with low socioeconomic status perceive the causes and consequences of their social and material deprivation. The second aim is to describe what kind of coping strategies youths employ to cope in their everyday life. The third aim is to build an integrative conceptual framework based on the relationships between causes, consequences, and individual coping strategies associated with deprivation. The analysis was carried out through systematic coding and orderly treatment of the data based on the grounded theory methodology. Results: Finnish low-income youths attributed the primary causes of deprivation to their family background, current socioeconomic status, sudden life changes, and contextual factors. Material and social deprivation was associated with various kinds of negative psychological, social, and material consequences. Youths used a variety of coping strategies that were identified as psychological, social, material, and functional-behavioral. Finally, a conceptual framework was formulated to link the findings together. In the discussion, the results were compared and contrasted with the existing research literature. The main references of the study were: Coping: Aldwin (2007); Lazarus & Folkman (1984); Hobfoll (1989, 2001, 2002). Deprivation: Larivaara, Isola, & Mikkonen (2007); Lister (2004); Townsend (1987); Raphael (2007). Health inequalities: Dahlgren & Whitehead (2007); Lynch et al. (2000); Marmot & Wilkinson (2006); WHO (2008). Methods: Charmaz (2006); Flick (2009); Strauss & Corbin (1990). Resilience: Cutuli & Masten (2009); Luthar (2006).