16 results for One-point Quadrature

in Aston University Research Archive


Relevance: 80.00%

Abstract:

The thesis examines and explains the development of occupational exposure limits (OELs) as a means of preventing work-related disease and ill health. The research focuses on the USA and UK and sets the work within a certain historical and social context. A subsidiary aim of the thesis is to identify any shortcomings in OELs and the methods by which they are set, and to suggest alternatives. The research framework uses Thomas Kuhn's idea of science progressing by means of paradigms, which he describes at one point as '... universally recognised scientific achievements that for a time provide model problems and solutions to a community of practitioners' (Kuhn, 1970). Once learned, individuals in the community '... are committed to the same rules and standards for scientific practice' (ibid.). Kuhn's ideas are adapted by combining them with a view of industrial hygiene as an applied science-based profession having many of the qualities of non-scientific professions. The great advantage of this approach to OELs is that it keeps the analysis grounded in the behaviour and priorities of the groups which have forged, propounded, used, benefited from, and defended them. The development and use of OELs on a larger scale is shown to be connected to the growth of a new profession in the USA, industrial hygiene, with the assistance of another new profession, industrial toxicology. The origins of these professions, particularly industrial hygiene, are traced. By examining the growth of the professions and the writings of key individuals it is possible to show how technical, economic and social factors became embedded in the OEL paradigm which industrial hygienists and toxicologists forged. The origin, mission and needs of these professions and their clients made such influences almost inevitable.
The use of the OEL paradigm in practice is examined through an analysis of the proceedings of the American Conference of Governmental Industrial Hygienists (ACGIH) Threshold Limit Value (TLV) Committee, via its minutes from 1962 to 1984. A similar approach is taken with the development of OELs in the UK. Although the form and definition of TLVs have encouraged the belief that they are health-based OELs, the conclusion is that they, and most other OELs, are, and always have been, reasonably practicable limits: the degree of risk posed by a substance is weighed against the feasibility and cost of controlling exposure to that substance. The confusion over the status of TLVs and other OELs is seen to be a confusion at the heart of the OEL paradigm, and the historical perspective explains why this should be. The paradigm has prevented the creation of truly health-based and, conversely, truly reasonably practicable OELs. In the final part of the thesis the analysis of the development of OELs is set in a contemporary context and a proposal for a two-stage, two-committee procedure for producing sets of OELs is put forward. This approach is set within an alternative OEL paradigm. The advantages, benefits and likely obstacles to these proposals are discussed.

Relevance: 80.00%

Abstract:

The study examines the job satisfaction of supervisors and managers in four organisations over time. It also considers the importance which they attached to different facets of their job. The major objectives were: to examine the constituent dimensions of job satisfaction at intervals over one year; to examine reasons for change in the level of job satisfaction at intervals over one year; and to provide information on job satisfaction for those concerned with job satisfaction policies. The sample consisted of one hundred and eight people. Each was interviewed on at least three occasions over the course of a year. Interviews took place at predetermined time intervals. The study shows that job satisfaction is dynamic over a relatively short period of time. The ratings which supervisors and managers gave to aspects of their job did not, however, all change by equal amounts or in the same direction. Changes in job satisfaction were associated with events experienced, but it was the meaning of those events to respondents which appeared to be particularly important. People tended to adopt a localised frame of reference when considering their work situation. Certain job variables, such as variety, were consistently and positively correlated with job satisfaction. With some other variables, the relationship varied across time. Frequently, age and job level moderated the association between independent variables and job satisfaction. Links were found between the quality of life and job satisfaction. There was a consistent positive association between job satisfaction and life satisfaction. However, the job was rarely considered to be the main factor contributing to a person's quality of life. The research highlights the difficulties and desirability of introducing standardised job satisfaction policies in the light of individual differences.
In addition, it demonstrates that merely correlating variables with job satisfaction at one point in time may conceal complex relationships and meanings. A new measure of job satisfaction, in which facets are assessed and rated relative to each other, was also developed as part of this study.

Relevance: 80.00%

Abstract:

This paper extends existing understandings of how actors' constructions of ambiguity shape the emergent process of strategic action. We theoretically elaborate the role of rhetoric in exploiting strategic ambiguity, based on analysis of a longitudinal case study of an internationalization strategy within a business school. Our data show that actors use rhetoric to construct three types of strategic ambiguity: protective ambiguity that appeals to common values in order to protect particular interests, invitational ambiguity that appeals to common values in order to invite participation in particular actions, and adaptive ambiguity that enables the temporary adoption of specific values in order to appeal to a particular audience at one point in time. These rhetorical constructions of ambiguity follow a processual pattern that shapes the emergent process of strategic action. Our findings show that (1) the strategic actions that emerge are shaped by the way actors construct and exploit ambiguity, (2) the ambiguity intrinsic to the action is analytically distinct from ambiguity that is constructed and exploited by actors, and (3) ambiguity construction shifts over time to accommodate the emerging pattern of actions.

Relevance: 80.00%

Abstract:

Purpose: This study aims to build on recent research by investigating and examining how likely it is that Chinese locals (i.e. host country nationals (HCNs)) would offer support to expatriates from India and the USA. Design/methodology/approach: Data were gathered from 222 participants in Chinese organizations, asking them to respond to questions about their willingness to offer support to expatriates. Findings: As predicted, perceived values similarity was significantly related to higher dogmatism, which had a significant positive relationship with ethnocentrism. Further, ethnocentrism had a significant negative relationship with willingness to offer support. Research limitations/implications: All data were collected from the participants at one point in time, so the study's results are subject to common method bias. Also, the study included only India and the USA as the two countries of origin of the expatriates. Practical implications: Given that HCNs do not automatically offer support to all expatriates, organizations might consider sending expatriates who are culturally similar to HCNs, as they are more likely to receive support, which will help their adjustment and thus organizational effectiveness. Originality/value: This study adds to the small, but growing, number of empirical investigations of HCN willingness to support expatriates. © Emerald Group Publishing Limited.

Relevance: 80.00%

Abstract:

Exercises involving the calculation of the derivative of piecewise defined functions are common in calculus, with the aim of consolidating beginners’ knowledge of applying the definition of the derivative. In such exercises, the piecewise function is commonly made up of two smooth pieces joined together at one point. A strategy which avoids using the definition of the derivative is to find the derivative function of each smooth piece and check whether these functions agree at the chosen point. Showing that this strategy works together with investigating discontinuities of the derivative is usually beyond a calculus course. However, we shall show that elementary arguments can be used to clarify the calculation and behaviour of the derivative for piecewise functions.
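The shortcut described above can be sketched in a few lines of pure Python, using a hypothetical example (not taken from the paper): f(x) = x² for x ≤ 1 joined continuously to 2x − 1 for x > 1. The definition-based check computes one-sided difference quotients at the join, while the shortcut differentiates each smooth piece and evaluates at the join; for this example both give f'(1) = 2.

```python
# Hypothetical example: f(x) = x**2 for x <= 1, and 2*x - 1 for x > 1.
# The two pieces join continuously at x = 1, since both equal 1 there.
def f(x):
    return x * x if x <= 1 else 2 * x - 1

def one_sided_derivative(g, a, side, h=1e-6):
    """Numerical one-sided difference quotient, from the definition."""
    step = h if side == "right" else -h
    return (g(a + step) - g(a)) / step

# Definition-based check: both one-sided limits of the difference
# quotient at the join point must exist and agree.
d_minus = one_sided_derivative(f, 1.0, "left")
d_plus = one_sided_derivative(f, 1.0, "right")

# Shortcut: differentiate each smooth piece and evaluate at the join.
# d/dx x**2 = 2x -> 2 at x = 1;  d/dx (2x - 1) = 2.  They agree,
# so (given continuity at the join) f'(1) = 2.
print(abs(d_minus - 2.0) < 1e-4, abs(d_plus - 2.0) < 1e-4)  # True True
```

As the abstract notes, the shortcut also requires continuity at the join and some care about discontinuities of the derivative; the numerical check here is only an illustration of the two routes to the same answer.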

Relevance: 30.00%

Abstract:

‘Not belonging’ is becoming a prevalent theme within accounts of the first-year student experience at university. In this study the notion of not belonging is extended by assuming a more active role for the idea of liminality in a student’s transition into the university environments of academic and student life. In doing so, the article suggests that the transition between one place (home) and another (university) can result in an ‘in-between-ness’ – a betwixt space. Through an interpretative methodology, the study explores how students begin to move from this betwixt space into feeling like fully-fledged members of university life. It is concluded that there is a wide range of turning points associated with the students’ betwixt transition, which shapes, alters or indeed accentuates the ways in which they make meaningful connections with university life. Moreover, transitional turning point experiences reveal a cast of characters and symbolic objects; capture contrasting motivations and evolving relationships; display multiple trajectories of interpersonal tensions and conflicts; highlight discontinuities as well as continuities; and together, simultaneously liberate and constrain the students’ transition into university life.

Relevance: 30.00%

Abstract:

The effectiveness of rapid and controlled heating of intact tissue to inactivate native enzymatic activity and prevent proteome degradation has been evaluated. Mouse brains were bisected immediately following excision, with one hemisphere being heat treated followed by snap freezing in liquid nitrogen while the other hemisphere was snap frozen immediately. Sections were cut by cryostatic microtome and analyzed by MALDI-MS imaging and minimal label 2-D DIGE, to monitor time-dependent relative changes in intensities of protein and peptide signals. Analysis by MALDI-MS imaging demonstrated that the relative intensities of markers varied across a time course (0-5 min) when the tissues were not stabilized by heat treatment. However, the same markers were seen to be stabilized when the tissues were heat treated before snap freezing. Intensity profiles for proteins indicative of both degradation and stabilization were generated when samples of treated and nontreated tissues were analyzed by 2-D DIGE, with protein extracted before and after a 10-min warming of samples. Thus, heat treatment of tissues at the time of excision is shown to prevent subsequent uncontrolled degradation of tissues at the proteomic level before any quantitative analysis, and to be compatible with downstream proteomic analysis.

Relevance: 30.00%

Abstract:

Linear Programming (LP) is a powerful decision-making tool extensively used in various economic and engineering activities. In the early stages the success of LP was mainly due to the efficiency of the simplex method. After the appearance of Karmarkar's paper, the focus of most research shifted to the field of interior point methods. The present work is concerned with investigating and efficiently implementing the latest techniques in this field, taking sparsity into account. The performance of these implementations on different classes of LP problems is reported here. The preconditioned conjugate gradient method is one of the most powerful tools for the solution of the least squares problem present in every iteration of all interior point methods. The effect of using different preconditioners on a range of problems with various condition numbers is presented. Decomposition algorithms have been one of the main fields of research in linear programming over the last few years. After reviewing the latest decomposition techniques, three promising methods were chosen and implemented. Sparsity is again a consideration, and suggestions have been included to allow improvements when solving problems with these methods. Finally, experimental results on randomly generated data are reported and compared with an interior point method. The efficient implementation of the decomposition methods considered in this study requires the solution of quadratic subproblems. A review of recent work on algorithms for convex quadratic programming was performed. The most promising algorithms are discussed and implemented, taking sparsity into account. The related performance of these algorithms on randomly generated separable and non-separable problems is also reported.
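The preconditioned conjugate gradient kernel mentioned above can be sketched as follows: a minimal pure-Python PCG with a Jacobi (diagonal) preconditioner, applied to a small hypothetical symmetric positive-definite system of the kind that arises from the normal equations inside an interior point iteration. This is an illustrative sketch, not the thesis's implementation, which exploits sparsity.

```python
def matvec(A, x):
    """Dense matrix-vector product (a sparse format would be used in practice)."""
    return [sum(aij * xj for aij, xj in zip(row, x)) for row in A]

def pcg(A, b, tol=1e-10, max_iter=100):
    """Conjugate gradient for SPD A x = b with a Jacobi preconditioner M = diag(A)."""
    n = len(b)
    m_inv = [1.0 / A[i][i] for i in range(n)]   # apply M^{-1} cheaply
    x = [0.0] * n
    r = b[:]                                     # residual for x = 0
    z = [m_inv[i] * r[i] for i in range(n)]      # preconditioned residual
    p = z[:]
    rz = sum(ri * zi for ri, zi in zip(r, z))
    for _ in range(max_iter):
        Ap = matvec(A, p)
        alpha = rz / sum(pi * api for pi, api in zip(p, Ap))
        x = [xi + alpha * pi for xi, pi in zip(x, p)]
        r = [ri - alpha * api for ri, api in zip(r, Ap)]
        if sum(ri * ri for ri in r) ** 0.5 < tol:
            break
        z = [m_inv[i] * r[i] for i in range(n)]
        rz_new = sum(ri * zi for ri, zi in zip(r, z))
        p = [zi + (rz_new / rz) * pi for zi, pi in zip(z, p)]
        rz = rz_new
    return x

# Hypothetical SPD test system; the exact solution is [1, 2, 3].
A = [[4.0, 1.0, 0.0], [1.0, 3.0, 1.0], [0.0, 1.0, 2.0]]
b = [6.0, 10.0, 8.0]
x = pcg(A, b)
print([round(v, 6) for v in x])  # [1.0, 2.0, 3.0]
```

The Jacobi preconditioner is only one of the choices whose effect the thesis compares; better preconditioners matter precisely when the condition number of the normal-equations matrix is large.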

Relevance: 30.00%

Abstract:

The paper presents an abbreviated version of the second part of a report on the problems of Europe, prepared by a team of teachers at the University of Information Technology and Management in Rzeszow, Poland. We stress therein that the hotly debated problems of the Eurozone and the global financial crisis and its aftermath are, at best, medium-term ones, while the real issues Europe faces are of a long-term nature and result from policies pursued for decades. Their consequences are also long-term, and increasingly harmful. Our diagnosis is as follows. Long-term problems related to the increasing burden of the welfare state and its side effects, like the slowing economic growth rate, are not subject to serious policy debates. This applies both to traditional elites from parties belonging to the moderate political spectrum, and to anti-elites on both extremes. Both elites and anti-elites reject reality as a starting point for developing corrective policy measures. Our economic analysis has revealed that incentives to create wealth in old Western countries have been weakening for a long time. Yet, without deep cuts in public (especially welfare) expenditures and accompanying institutional reforms, the economic performance of European (and generally Western) economies is going to worsen over time. The chances of continued stagnation in the next 5–10 years are very high. Finally, we look at the socio-psychological behavioral framework of the ever-expanding welfare state. We point to the phenomenon of learned helplessness, which appears when people no longer perceive linkages between their actions and the economic results of those actions. We interpret it as a consequence of the welfare state. It further weakens the prospects for successful reforms and the resultant avoidance of long-term stagnation.

Relevance: 30.00%

Abstract:

We extend a meshless method of fundamental solutions recently proposed by the authors for the one-dimensional two-phase inverse linear Stefan problem, to the nonlinear case. In this latter situation the free surface is also considered unknown which is more realistic from the practical point of view. Building on the earlier work, the solution is approximated in each phase by a linear combination of fundamental solutions to the heat equation. The implementation and analysis are more complicated in the present situation since one needs to deal with a nonlinear minimization problem to identify the free surface. Furthermore, the inverse problem is ill-posed since small errors in the input measured data can cause large deviations in the desired solution. Therefore, regularization needs to be incorporated in the objective function which is minimized in order to obtain a stable solution. Numerical results are presented and discussed. © 2014 IMACS.
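The need for regularization described above can be illustrated with a toy Tikhonov example (hypothetical data, not the Stefan problem itself): for an ill-conditioned least-squares system, tiny noise in the measured right-hand side throws the naive solution far from the truth, while adding a small penalty term stabilizes it. This is a generic sketch of the principle, not the paper's method of fundamental solutions.

```python
def solve2(M, b):
    """Solve a 2x2 linear system by Cramer's rule."""
    det = M[0][0] * M[1][1] - M[0][1] * M[1][0]
    return [(b[0] * M[1][1] - b[1] * M[0][1]) / det,
            (M[0][0] * b[1] - M[1][0] * b[0]) / det]

A = [[1.0, 1.0], [1.0, 1.0001]]   # nearly singular design matrix (hypothetical)
b = [2.0, 2.0002]                 # slightly noisy data; true solution is ~[1, 1]

# Naive (unregularized) solve: the near-singularity amplifies the noise,
# and the solution lands near [0, 2] instead of [1, 1].
x_naive = solve2(A, b)

# Tikhonov-regularized normal equations: (A^T A + lam * I) x = A^T b.
lam = 1e-4
AtA = [[sum(A[k][i] * A[k][j] for k in range(2)) + (lam if i == j else 0.0)
        for j in range(2)] for i in range(2)]
Atb = [sum(A[k][i] * b[k] for k in range(2)) for i in range(2)]
x_reg = solve2(AtA, Atb)          # stable: close to [1, 1]

print(x_naive, x_reg)
```

In the paper the same idea appears inside a nonlinear minimization: the regularization term is added to the objective function so that small errors in the measured data no longer cause large deviations in the reconstructed free surface.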

Relevance: 30.00%

Abstract:

Background: The purpose of this study was to investigate the 12-month outcome of macular edema secondary to both chronic and new central and branch retinal vein occlusions treated with intravitreal bevacizumab in the real-life clinical setting in the UK. Methods: Retrospective case notes analysis of consecutive patients with retinal vein occlusions treated with bevacizumab in 2010 to 2012. Outcome measures were visual acuity (measured with Snellen, converted into logMAR [logarithm of the minimum angle of resolution] for statistical calculation) and central retinal thickness at baseline, 4 weeks post-loading phase, and at 1 year. Results: There were 56 and 100 patients with central and branch retinal vein occlusions, respectively, of whom 62% had chronic edema and received prior therapies and another 32% required additional laser treatments post-baseline bevacizumab. Baseline median visual acuity was 0.78 (interquartile range [IQR] 0.48–1.22) in the central group and 0.6 (IQR 0.3–0.78) in the branch group. In both groups, visual improvement was statistically significant from baseline compared to post-loading (P<0.001 and P=0.03, respectively), but was not significant by month 12 (P=0.058 and P=0.166, respectively); 30% improved by at least three lines and 44% improved by at least one line by month 12. Baseline median central retinal thickness was 449 µm (IQR 388–553) in the central group and 441 µm (IQR 357–501) in the branch group. However, the mean reduction in thickness was statistically significant at post-loading (P<0.001) and at the 12-month time point (P<0.001) for both groups. The average number of injections in 1 year was 4.2 in the central group and 3.3 in the branch group. Conclusion: Our large real-world cohort results indicate that bevacizumab introduced to patients with either new or chronic edema due to retinal vein occlusion can result in resolution of edema and stabilization of vision in the first year.

Relevance: 30.00%

Abstract:

Long-reach passive optical networks (LR-PON) are being proposed as a means of enabling ubiquitous fiber-to-the-home (FTTH) by massive sharing of network resources, thereby reducing per-customer costs to affordable levels. In this paper, we analyze the chain solutions for LR-PON deployment in urban and rural areas at 100-Gb/s point-to-point transmission using dual-polarization quaternary phase-shift keying (DP-QPSK) modulation. The numerical analysis shows that with appropriate finite impulse response (FIR) filter designs, 100-Gb/s transmission can be achieved with at least a 512-way split and up to 160 km total distance, which is sufficient for many of the optical paths in a practical situation, for a point-to-point link from one LR-PON to another through the optical switch at the metro nodes and across a core light path through the core network without regeneration.

Relevance: 30.00%

Abstract:

Peroxiredoxin-2 (PRDX-2) belongs to a family of thiol-containing proteins and is important for antioxidant defense, redox signaling and cell function. This study examined whether lymphocyte PRDX-2 levels are altered over one month following ultra-endurance exercise. Nine middle-aged men participated in a 145-mile ultra-endurance running race. Blood drawing was undertaken immediately before, upon completion/retirement, and at one, seven and twenty-eight days following the race. PRDX-2 levels were examined at each time point for all participants (n=9) by reducing SDS-PAGE and western blotting. Further analysis using non-reducing SDS-PAGE and western blotting was undertaken in a sub-group of men who completed the race (n=4) to investigate PRDX-2 oligomeric state (indicative of oxidation state). Ultra-endurance exercise caused a significant alteration in lymphocyte PRDX-2 levels (F(4,32)=3.409, p=0.020, η²=0.299): seven days after the race PRDX-2 levels fell by 70% (p=0.013), and at twenty-eight days after the race they returned to near-normal levels. PRDX-2 dimers (intracellular reduced PRDX-2 monomers) in three of the four participants who finished the race were increased upon race completion. Furthermore, PRDX-2 monomers (intracellular over-oxidized PRDX-2 monomers) in two of these four participants were present upon race completion, but absent seven days after the race. This study found that PRDX-2 levels in lymphocytes were reduced below normal levels seven days after an ultra-endurance running race. We suggest that excessive reactive oxygen species production induced by ultra-endurance exercise may, in part, explain the depletion of lymphocyte PRDX-2 by triggering its turnover after oxidation.

Relevance: 30.00%

Abstract:

Motivation: In any macromolecular polyprotic system (for example protein, DNA or RNA), the isoelectric point, commonly referred to as the pI, can be defined as the point of singularity in a titration curve, corresponding to the solution pH value at which the net overall surface charge (and thus the electrophoretic mobility) of the ampholyte sums to zero. Different modern analytical biochemistry and proteomics methods depend on the isoelectric point as a principal feature for protein and peptide characterization. Protein separation by isoelectric point is a critical part of 2-D gel electrophoresis, a key precursor of proteomics, where discrete spots can be digested in-gel and proteins subsequently identified by analytical mass spectrometry. Peptide fractionation according to pI is also widely used in current proteomics sample preparation procedures prior to LC-MS/MS analysis. Accurate theoretical prediction of pI would therefore expedite such analyses. While such pI calculation is widely used, it remains largely untested, motivating our efforts to benchmark pI prediction methods. Results: Using data from the database PIP-DB and one publicly available dataset as our reference gold standard, we have undertaken the benchmarking of pI calculation methods. We find that methods vary in their accuracy and are highly sensitive to the choice of basis set. The machine-learning algorithms, especially the SVM-based algorithm, showed superior performance when studying peptide mixtures. In general, learning-based pI prediction methods (such as Cofactor, SVM and Branca) require a large training dataset and their resulting performance will strongly depend on the quality of that data. In contrast to iterative methods, machine-learning algorithms have the advantage of being able to add new features to improve the accuracy of prediction.
Availability and implementation: The software and data are freely available at https://github.com/ypriverol/pIR. Contact: yperez@ebi.ac.uk. Supplementary information: Supplementary data are available at Bioinformatics online.
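The iterative pI methods benchmarked above can be sketched as a bisection on the net-charge curve: since the net charge of an ampholyte decreases monotonically with pH, the pI is the root of that curve. The pKa values below are one illustrative, roughly EMBOSS-like basis set chosen for this sketch (they are assumptions, and as the study shows, predictions are highly sensitive to this choice); the peptide sequence is hypothetical.

```python
# Illustrative pKa basis set (assumed values, not from the paper).
PKA_POS = {'nterm': 8.6, 'K': 10.8, 'R': 12.5, 'H': 6.5}
PKA_NEG = {'cterm': 3.6, 'D': 3.9, 'E': 4.1, 'C': 8.5, 'Y': 10.1}

def net_charge(seq, ph):
    """Net charge of a peptide at a given pH (Henderson-Hasselbalch terms)."""
    charge = 1.0 / (1.0 + 10 ** (ph - PKA_POS['nterm']))   # free N-terminus
    charge -= 1.0 / (1.0 + 10 ** (PKA_NEG['cterm'] - ph))  # free C-terminus
    for aa in seq:
        if aa in PKA_POS:
            charge += 1.0 / (1.0 + 10 ** (ph - PKA_POS[aa]))
        elif aa in PKA_NEG:
            charge -= 1.0 / (1.0 + 10 ** (PKA_NEG[aa] - ph))
    return charge

def isoelectric_point(seq, lo=0.0, hi=14.0, tol=1e-4):
    """Bisection for the pH at which the net charge sums to zero (the pI)."""
    while hi - lo > tol:
        mid = (lo + hi) / 2.0
        if net_charge(seq, mid) > 0:
            lo = mid        # still positively charged: pI lies above mid
        else:
            hi = mid        # negatively charged: pI lies below mid
    return (lo + hi) / 2.0

pI = isoelectric_point("ACDKRH")   # hypothetical peptide
print(round(pI, 2))
```

Swapping in a different pKa basis set changes the answer, which is exactly the sensitivity the benchmark quantifies; learning-based methods instead fit this mapping from training data.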

Relevance: 30.00%

Abstract:

We present a review of the latest developments in one-dimensional (1D) optical wave turbulence (OWT). Based on an original experimental setup that allows for the implementation of 1D OWT, we are able to show that an inverse cascade occurs through the spontaneous evolution of the nonlinear field up to the point when modulational instability leads to soliton formation. After solitons are formed, further interaction of the solitons among themselves and with incoherent waves leads to a final condensate state dominated by a single strong soliton. Motivated by the observations, we develop a theoretical description, showing that the inverse cascade develops through six-wave interaction, and that this is the basic mechanism of nonlinear wave coupling for 1D OWT. We describe theory, numerics and experimental observations while trying to incorporate all the different aspects into a consistent context. The experimental system is described by two coupled nonlinear equations, which we explore within two wave limits allowing for the expression of the evolution of the complex amplitude in a single dynamical equation. The long-wave limit corresponds to waves with wave numbers smaller than the electrical coherence length of the liquid crystal, and the opposite limit, when wave numbers are larger. We show that both of these systems are of a dual cascade type, analogous to two-dimensional (2D) turbulence, which can be described by wave turbulence (WT) theory, and conclude that the cascades are induced by a six-wave resonant interaction process. WT theory predicts several stationary solutions (non-equilibrium and thermodynamic) to both the long- and short-wave systems, and we investigate the necessary conditions required for their realization. 
Interestingly, the long-wave system is close to the integrable 1D nonlinear Schrödinger equation (NLSE) (which contains exact nonlinear soliton solutions), and as a result during the inverse cascade, nonlinearity of the system at low wave numbers becomes strong. Subsequently, due to the focusing nature of the nonlinearity, this leads to modulational instability (MI) of the condensate and the formation of solitons. Finally, with the aid of the probability density function (PDF) description of WT theory, we explain the coexistence and mutual interactions between solitons and the weakly nonlinear random wave background in the form of a wave turbulence life cycle (WTLC).
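The integrable 1D NLSE mentioned above, i ψ_t + ½ ψ_xx + |ψ|² ψ = 0 in the focusing case, can be integrated numerically with a standard split-step (Strang) scheme: a half-step of the nonlinear phase rotation, a full linear step in Fourier space, and another nonlinear half-step. The sketch below uses a pure-Python O(N²) DFT and the textbook bright soliton ψ(x, t) = sech(x) e^{it/2} as a sanity check (grid size, domain and step are illustrative choices, not from the review); the soliton's modulus profile should be preserved by the evolution.

```python
import cmath
import math

def dft(a, inverse=False):
    """Naive O(N^2) discrete Fourier transform (illustrative; use an FFT in practice)."""
    n = len(a)
    sgn = 1 if inverse else -1
    out = [sum(a[j] * cmath.exp(sgn * 2j * cmath.pi * j * k / n)
               for j in range(n)) for k in range(n)]
    return [v / n for v in out] if inverse else out

def nlse_step(psi, dx, dt):
    """One Strang split step for i psi_t + (1/2) psi_xx + |psi|^2 psi = 0."""
    n = len(psi)
    # Half nonlinear step: psi -> psi * exp(i |psi|^2 dt / 2)
    psi = [p * cmath.exp(1j * abs(p) ** 2 * dt / 2) for p in psi]
    # Full linear step in Fourier space: mode k picks up exp(-i k^2 dt / 2)
    ks = [2 * math.pi * (m if m < n // 2 else m - n) / (n * dx) for m in range(n)]
    spec = dft(psi)
    spec = [s * cmath.exp(-1j * k * k * dt / 2) for s, k in zip(spec, ks)]
    psi = dft(spec, inverse=True)
    # Second half nonlinear step
    return [p * cmath.exp(1j * abs(p) ** 2 * dt / 2) for p in psi]

# Bright soliton initial condition psi(x, 0) = sech(x); its modulus is
# time-invariant for the exact solution psi(x, t) = sech(x) e^{i t / 2}.
n, length = 64, 20.0
dx = length / n
xs = [-length / 2 + i * dx for i in range(n)]
psi = [complex(1.0 / math.cosh(x)) for x in xs]
for _ in range(50):                      # evolve to t = 0.5
    psi = nlse_step(psi, dx, dt=0.01)
err = max(abs(abs(p) - 1.0 / math.cosh(x)) for p, x in zip(psi, xs))
print(err < 1e-2)  # modulus profile preserved to splitting accuracy: True
```

The same scheme, started from a weak condensate plus noise, exhibits the modulational instability and soliton formation described above, although the six-wave OWT dynamics of the experiment are beyond this one-equation sketch.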