28 results for practical epistemology analysis

in CentAUR: Central Archive University of Reading - UK


Relevance:

80.00%

Publisher:

Abstract:

We consider the linear equality-constrained least squares problem (LSE) of minimizing $\|c - Gx\|_2$, subject to the constraint $Ex = p$. A preconditioned conjugate gradient method is applied to the Kuhn–Tucker equations associated with the LSE problem. We show that our method is well suited for structural optimization problems in reliability analysis and optimal design. Numerical tests are performed on an Alliant FX/8 multiprocessor and a Cray X-MP using practical structural analysis data.
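As a concrete illustration of the Kuhn–Tucker (KKT) formulation the abstract refers to, the sketch below assembles the KKT system for a small random LSE instance and solves it. This is a minimal dense stand-in, not the paper's method: the authors apply a preconditioned conjugate gradient solver, whereas here SciPy's MINRES is used because it handles the symmetric indefinite KKT matrix without a custom preconditioner; the problem dimensions are arbitrary.

```python
import numpy as np
from scipy.sparse.linalg import minres

rng = np.random.default_rng(0)
m, n, q = 20, 10, 3                     # rows of G, unknowns, constraints
G = rng.standard_normal((m, n))
c = rng.standard_normal(m)
E = rng.standard_normal((q, n))
p = rng.standard_normal(q)

# Kuhn-Tucker system:  [G^T G  E^T] [x  ]   [G^T c]
#                      [E      0  ] [lam] = [p    ]
K = np.block([[G.T @ G, E.T],
              [E, np.zeros((q, q))]])
rhs = np.concatenate([G.T @ c, p])

z, info = minres(K, rhs)                # symmetric indefinite solver; info == 0 on success
x = z[:n]                               # first n entries are the LSE solution
print("constraint residual ||Ex - p||:", np.linalg.norm(E @ x - p))
```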

Relevance:

80.00%

Publisher:

Abstract:

Power delivery for biomedical implants is a major consideration in their design, for both measurement and stimulation. When performed wirelessly, transmission efficiency is critically important, not only because of the costs associated with any losses but also because of the nature of those losses: excessive heat, for example, can be uncomfortable for the individual involved. In this study, a method and means of wireless power transmission suitable for biomedical implants are discussed and experimentally evaluated. The procedure introduced is comparable in size and simplicity to methods already employed; however, some of Tesla's fundamental ideas have been incorporated in order to obtain a significant improvement in efficiency. The study contains a theoretical basis for the approach taken; the emphasis here, however, is on practical experimental analysis.
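The efficiency stakes can be made concrete with a standard figure of merit from the resonant inductive-link literature; this formula is general background, not taken from the study, and the coupling and Q values below are illustrative only.

```python
import numpy as np

def max_link_efficiency(k, q1, q2):
    """Classic upper bound on the efficiency of a resonant inductive
    link with coupling coefficient k and coil quality factors q1, q2."""
    kq = k**2 * q1 * q2
    return kq / (1.0 + np.sqrt(1.0 + kq))**2

# Illustrative values only (not measurements from the study):
for k in (0.01, 0.05, 0.2):
    print(f"k = {k}: eta_max = {max_link_efficiency(k, 100, 100):.3f}")
```

Even with high-Q resonant coils, the weak coupling typical of a through-tissue link dominates the achievable efficiency, which is why the losses themselves matter so much in the implant setting.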

Relevance:

40.00%

Publisher:

Abstract:

This article reflects on key methodological issues emerging from children and young people's involvement in data analysis processes. We outline a pragmatic framework illustrating different approaches to engaging children, using two case studies of children's experiences of participating in data analysis. The article highlights methods of engagement and important issues such as the balance of power between adults and children, training, support, ethical considerations, time and resources. We argue that involving children in data analysis processes can have several benefits, including enabling a greater understanding of children's perspectives and helping to prioritise children's agendas in policy and practice.

Relevance:

40.00%

Publisher:

Abstract:

Objectives: To assess the potential source of variation that the surgeon may add to patient outcome in a clinical trial of surgical procedures. Methods: Two large (n = 1380) parallel multicentre randomized surgical trials, involving 43 surgeons, were undertaken to compare laparoscopically assisted hysterectomy with conventional methods of abdominal and vaginal hysterectomy. The primary end point of the trial was the occurrence of at least one major complication. Patients were nested within surgeons, giving the data set a hierarchical structure. A total of 10% of patients had at least one major complication, giving a sparse binary outcome variable. A linear mixed logistic regression model (with logit link function) was used to model the probability of a major complication, with surgeon fitted as a random effect. Models were fitted by maximum likelihood in SAS. Results: There were many convergence problems. These were resolved using a variety of approaches, including treating all effects as fixed for the initial model building, modelling the variance of a parameter on a logarithmic scale, and centring continuous covariates. The initial model building indicated no significant 'type of operation' by surgeon interaction in either trial; the 'type of operation' term was highly significant in the abdominal trial, and the 'surgeon' term was not significant in either trial. Conclusions: The analysis did not find a surgeon effect, but it is difficult to conclude that there was no difference between surgeons. The statistical test may have lacked sufficient power; the variance estimates were small with large standard errors, indicating that their precision may be questionable.
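To show the shape of the model described above, here is a minimal sketch that simulates surgeon-nested binary outcomes and fits a logistic regression with a random surgeon effect. It uses Python's statsmodels with a variational Bayes fit rather than the maximum-likelihood SAS fit used in the trial, and all numbers are simulated, so it illustrates only the model structure, not the trial's results.

```python
import numpy as np
import pandas as pd
from statsmodels.genmod.bayes_mixed_glm import BinomialBayesMixedGLM

rng = np.random.default_rng(1)
n_surgeons, per_surgeon = 43, 32
surgeon = np.repeat(np.arange(n_surgeons), per_surgeon)   # patients nested in surgeons
op_type = rng.integers(0, 2, surgeon.size)                # 0 = conventional, 1 = laparoscopic
u = rng.normal(0.0, 0.3, n_surgeons)                      # random surgeon effects (assumed SD)
logit = -2.2 + 0.5 * op_type + u[surgeon]                 # roughly a 10% event rate
y = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit)))

df = pd.DataFrame({"complication": y, "op_type": op_type, "surgeon": surgeon})
model = BinomialBayesMixedGLM.from_formula(
    "complication ~ op_type", {"surgeon": "0 + C(surgeon)"}, df)
result = model.fit_vb()                                   # variational Bayes fit
print(result.summary())
```

With only ~10% events per surgeon, the between-surgeon variance is weakly identified, which is consistent with the convergence difficulties and imprecise variance estimates the abstract reports.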

Relevance:

30.00%

Publisher:

Abstract:

Standard form contracts are typically developed through a negotiated consensus, unless they are proffered by one specific interest group. Previously published plans of work and other descriptions of the processes in construction projects tend to focus on operational issues, or they tend to be prepared from the point of view of one or other of the dominant interest groups. Legal practice in the UK permits those who draft contracts to define their terms as they choose. There are no definitive rulings from the courts that give an indication as to the detailed responsibilities of project participants. The science of terminology offers useful guidance for discovering and describing terms and their meanings in their practical context, but has never been used for defining terms for responsibilities of participants in the construction project management process. Organizational analysis enables the management task to be deconstructed into its elemental parts in order that effective organizational structures can be developed. Organizational mapping offers a useful technique for reducing text-based descriptions of project management roles and responsibilities to a comparable basis. Research was carried out by means of a desk study, detailed analysis of nine plans of work and focus groups representing all aspects of the construction industry. No published plan of work offers definitive guidance. There is an enormous amount of variety in the way that terms are used for identifying responsibilities of project participants. A catalogue of concepts and terms (a “Terminology”) has been compiled and indexed to enable those who draft contracts to choose the most appropriate titles for project participants. The purpose of this terminology is to enable the selection and justification of appropriate terms in order to help define roles. The terminology brings an unprecedented clarity to the description of roles and responsibilities in construction projects and, as such, will be helpful for anyone seeking to assemble a team and specify roles for project participants.

Relevance:

30.00%

Publisher:

Abstract:

A simple and practical technique is described for assessing the risks, that is, the potential for error and consequent loss, that a software system development project acquires during the requirements engineering phase. The technique uses a goal-based requirements analysis as a framework to identify and rate a set of key issues, arriving at estimates of the feasibility and adequacy of the requirements. The technique is illustrated by its application to a real systems development project, showing how problems in that project could have been identified earlier, thereby avoiding costly additional work and unhappy users.
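To make the rating idea concrete, the hypothetical sketch below scores a handful of key issues on a 1-5 scale and collapses them into a single feasibility estimate. The issue names, scale, and averaging rule are all illustrative assumptions; the paper's actual goal-based checklist is not reproduced here.

```python
# Hypothetical sketch of goal-based requirements risk rating.
# Issue names and the 1-5 scale are illustrative, not from the paper.
issues = {
    "goals traceable to stakeholders":  4,
    "requirements testable":            2,
    "assumptions documented":           3,
    "conflicts between goals resolved": 1,
}

def feasibility_score(ratings, worst=1, best=5):
    """Collapse per-issue ratings into a single 0-1 feasibility estimate."""
    mean = sum(ratings.values()) / len(ratings)
    return (mean - worst) / (best - worst)

score = feasibility_score(issues)
print(f"feasibility estimate: {score:.2f}")   # low scores flag risky requirements
```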

Relevance:

30.00%

Publisher:

Abstract:

The 'direct costs' attributable to 30 different endemic diseases of farm animals in Great Britain are estimated using a standardised method to construct a simple model for each disease that includes consideration of disease prevention and treatment costs. The models so far developed provide a basis for further analyses, including cost-benefit analyses for the economic assessment of disease control options. The approach used reflects the inherent livestock disease information constraints, which limit the application of other economic analytical methods. It is a practical and transparent approach that is relatively easily communicated to veterinary scientists and policy makers. The next step is to develop the approach by incorporating wider economic considerations into the analyses in a way that will demonstrate to policy makers and others the importance of an economic perspective to livestock disease issues.
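A minimal sketch of the kind of simple per-disease model described, assuming the direct cost decomposes into output losses plus prevention and treatment expenditure; the disease names and figures below are hypothetical, not values from the study.

```python
# Hypothetical per-disease direct-cost model: output losses plus
# prevention and treatment expenditure (all figures illustrative).
def direct_cost(output_loss, prevention, treatment):
    """Annual direct cost of one endemic disease (GBP)."""
    return output_loss + prevention + treatment

diseases = {
    "disease_A": direct_cost(output_loss=2.0e6, prevention=0.5e6, treatment=0.8e6),
    "disease_B": direct_cost(output_loss=7.5e6, prevention=1.2e6, treatment=0.3e6),
}
for name, cost in diseases.items():
    print(f"{name}: GBP {cost:,.0f}")
```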

Relevance:

30.00%

Publisher:

Abstract:

As the ideal method of assessing the nutritive value of a feedstuff, namely offering it to the appropriate class of animal and recording the production response obtained, is neither practical nor cost effective, a range of feed evaluation techniques has been developed. Each balances some degree of compromise with the practical situation against data generation. However, owing to the impact of animal-feed interactions over and above that of feed composition, the target animal remains the ultimate arbiter of nutritional value. In this review, current in vitro feed evaluation techniques are examined according to the degree of animal-feed interaction. Chemical analysis provides absolute values and therefore differs from the majority of in vitro methods, which simply rank feeds. However, with no host animal involvement, estimates of nutritional value are inferred by statistical association. In addition, given the costs involved, the practical value of many of the analyses conducted should be reviewed. The in sacco technique has made a substantial contribution both to understanding rumen microbial degradative processes and to the rapid evaluation of feeds, especially in developing countries. However, the technique's numerous shortfalls (common to many in vitro methods), the desire to eliminate surgically modified animals from routine feed evaluation, and parallel improvements in in vitro techniques mean that it will increasingly be replaced. The majority of in vitro systems use substrate disappearance to assess degradation; however, this provides no information on the quantity of derived end-products available to the host animal. As measurement of volatile fatty acids or microbial biomass production greatly increases analytical costs, fermentation gas release, a simple and non-destructive measurement, has been used as an alternative. Gas release alone is of little use, so gas-based systems in which degradation and fermentation gas release are measured simultaneously are attracting considerable interest. Alternative microbial inocula are being considered, as is the potential of multi-enzyme systems for examining degradation dynamics. It is concluded that while chemical analysis will continue to form an indispensable part of feed evaluation, enhanced use will be made of increasingly complex in vitro systems. It is vital, however, that the function and limitations of each methodology are fully understood, and that the temptation to over-interpret the data is avoided, so that appropriate conclusions are drawn. With careful selection and correct application, in vitro systems offer powerful research tools with which to evaluate feedstuffs.
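Gas-based systems typically summarize fermentation kinetics by fitting a simple curve to cumulative gas release. The sketch below fits the widely used exponential model y(t) = a + b(1 - exp(-ct)) to illustrative data; the model is standard in this literature, but neither it nor the data points come from this review.

```python
import numpy as np
from scipy.optimize import curve_fit

# Exponential gas-production model: a + b is the asymptotic gas pool,
# c the fractional fermentation rate per hour.
def gas_model(t, a, b, c):
    return a + b * (1.0 - np.exp(-c * t))

t = np.array([2, 4, 8, 12, 24, 48, 72], dtype=float)      # incubation time (h)
gas = np.array([3.0, 8.5, 18.0, 26.0, 41.0, 52.0, 55.0])  # ml gas per g DM (illustrative)

(a, b, c), _ = curve_fit(gas_model, t, gas, p0=(0.0, 50.0, 0.05))
print(f"asymptotic gas pool a+b = {a + b:.1f} ml, rate c = {c:.3f} /h")
```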

Relevance:

30.00%

Publisher:

Abstract:

Sequential methods provide a formal framework by which clinical trial data can be monitored as they accumulate. The results from interim analyses can be used either to modify the design of the remainder of the trial or to stop the trial as soon as sufficient evidence of either the presence or absence of a treatment effect is available. The circumstances under which the trial will be stopped with a claim of superiority for the experimental treatment must, however, be determined in advance so as to control the overall type I error rate. One approach to calculating the stopping rule is the group-sequential method. A relatively recent alternative is the adaptive design method, which provides considerable flexibility to change the design of a clinical trial at an interim point. A criticism, however, is that the way evidence from different parts of the trial is combined means that the final comparison of treatments is not based on a sufficient statistic for the treatment difference, suggesting that the method may lack power. The aim of this paper is to compare two adaptive design approaches with the group-sequential approach. We first compare the form of the stopping boundaries obtained using the different methods, and then focus on the power of the different trials when they are designed to be as similar as possible. We conclude that all methods acceptably control the type I error rate and power when the sample size is modified based on a variance estimate, provided no interim analysis is so small that the asymptotic properties of the test statistic no longer hold; in that case, the group-sequential approach is to be preferred. Provided that the asymptotic assumptions hold, the adaptive design approaches control the type I error rate even if the sample size is adjusted on the basis of an estimate of the treatment effect, showing that adaptive designs allow more modifications than the group-sequential method.
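One common adaptive-design device for combining evidence across stages is the weighted inverse-normal combination test, sketched below. Whether this is one of the two adaptive approaches compared in the paper is an assumption, and the weights and stage statistics are illustrative. The key point is that the weights are fixed in advance, which is what preserves the type I error rate even when the stage-two sample size is adapted at the interim analysis.

```python
import numpy as np
from scipy.stats import norm

def combination_z(z1, z2, w1=0.5, w2=0.5):
    """Weighted inverse-normal combination of independent stage-wise
    z-statistics; with w1 + w2 = 1 the result is standard normal under H0,
    provided the weights were fixed before the interim analysis."""
    return (np.sqrt(w1) * z1 + np.sqrt(w2) * z2) / np.sqrt(w1 + w2)

z1, z2 = 1.2, 2.1                        # illustrative stage-wise z-statistics
z = combination_z(z1, z2)
print(f"combined z = {z:.3f}, one-sided p = {norm.sf(z):.4f}")
```

Because the combined statistic weights the stages by the planned, not the realized, sample sizes, it is generally not a sufficient statistic for the treatment difference, which is exactly the power criticism the abstract raises.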

Relevance:

30.00%

Publisher:

Abstract:

1. Population growth rate (PGR) is central to the theory of population ecology and is crucial for projecting population trends in conservation biology, pest management and wildlife harvesting. Furthermore, PGR is increasingly used to assess the effects of stressors. Image analysis that can automatically count and measure photographed individuals offers a potential methodology for estimating PGR. 2. This study evaluated two ways in which the PGR of Daphnia magna, exposed to different stressors, can be estimated using an image analysis system. The first method estimated PGR as the ratio of counts of individuals obtained at two different times, while the second method estimated PGR as the ratio of population sizes at two different times, where size is measured by the sum of the individuals' surface areas, i.e. total population surface area. This method is attractive if surface area is correlated with reproductive value (RV), as it is for D. magna, because of the theoretical result that PGR is the rate at which the population RV increases. 3. The image analysis system proved reliable and reproducible in counting populations of up to 440 individuals in 5 L of water. Image counts correlated well with manual counts but with a systematic underestimate of about 30%. This does not affect accuracy when estimating PGR as the ratio of two counts. Area estimates of PGR correlated well with count estimates, but were systematically higher, possibly reflecting their greater accuracy in the study situation. 4. Analysis of relevant scenarios suggested the correlation between RV and body size will generally be good for organisms in which fecundity correlates with body size. In these circumstances, area estimation of PGR is theoretically better than count estimation. 5. Synthesis and applications. There are both theoretical and practical advantages to area estimation of population growth rate when individuals' reproductive values are consistently well correlated with their surface areas. Because stressors may affect both the number and quality of individuals, area estimation of population growth rate should improve the accuracy of predicting stress impacts at the population level.
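Both estimators described above reduce to the same calculation applied to different measures of population size. A minimal sketch, assuming PGR is expressed as the log ratio of sizes per unit time (the study's exact formula is not reproduced here, and all values are illustrative):

```python
import numpy as np

def pgr(size_t1, size_t2, dt):
    """Per-unit-time population growth rate from two population sizes."""
    return np.log(size_t2 / size_t1) / dt

counts = (120, 310)            # image counts at t1 and t2 (illustrative)
areas = (95.0, 260.0)          # summed surface areas (mm^2) at t1 and t2 (illustrative)
dt = 14.0                      # days between the two images

print("count-based PGR:", pgr(*counts, dt))
print("area-based PGR: ", pgr(*areas, dt))
```

Note that a constant systematic undercount, such as the roughly 30% reported for the image counts, cancels in the ratio and so does not bias the count-based estimate.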

Relevance:

30.00%

Publisher:

Abstract:

This paper analyzes the convergence behavior of the least mean square (LMS) filter when used in an adaptive code division multiple access (CDMA) detector consisting of a tapped delay line with adjustable tap weights. The sampling rate may be equal to or higher than the chip rate, corresponding to chip-spaced (CS) and fractionally spaced (FS) detection, respectively. It is shown that CS and FS detectors with the same time-span exhibit identical convergence behavior if the baseband received signal is strictly bandlimited to half the chip rate. Even in the practical case when this condition is not met, deviations from this observation are imperceptible unless the initial tap-weight vector gives an extremely large mean squared error (MSE). This phenomenon is carefully explained with reference to the eigenvalues of the correlation matrix when the input signal is not perfectly bandlimited. The inadequacy of the eigenvalue spread of the tap-input correlation matrix as an indicator of transient behavior, and the influence of the initial tap-weight vector on convergence speed, are highlighted. Specifically, initialization within the signal subspace or at the origin leads to much faster convergence compared with initialization in the noise subspace.
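For readers unfamiliar with the adaptation being analyzed, here is a generic LMS tap-weight update loop in a simple system-identification setting. It is a minimal sketch of the algorithm only; the CDMA signal model, chip versus fractional spacing, and subspace initializations studied in the paper are not reproduced.

```python
import numpy as np

rng = np.random.default_rng(2)
n_taps, n_iter, mu = 8, 2000, 0.01       # tapped-delay-line length, iterations, step size

w_true = rng.standard_normal(n_taps)     # unknown system to identify (illustrative)
w = np.zeros(n_taps)                     # initial tap-weight vector (the origin)
x_buf = np.zeros(n_taps)                 # tapped delay line
mse = np.empty(n_iter)

for k in range(n_iter):
    x_buf = np.roll(x_buf, 1)
    x_buf[0] = rng.standard_normal()     # new input sample enters the delay line
    d = w_true @ x_buf + 0.01 * rng.standard_normal()   # desired (reference) signal
    e = d - w @ x_buf                    # a-priori error
    w += mu * e * x_buf                  # LMS tap-weight update
    mse[k] = e**2

print("mean squared error over last 500 iterations:", mse[-500:].mean())
```

The convergence speed of this loop is governed by the eigenstructure of the tap-input correlation matrix, which is the quantity the paper examines when the bandlimiting condition fails.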