899 results for design methods


Relevance:

30.00%

Publisher:

Abstract:

To validate a model for investigating the effects of analgesic drugs using mechanical, thermal and electrical stimulation testing, and to investigate the repeatability, sensitivity and specificity of the nociceptive tests. Randomised experiment with 2 observers in 2 phases. Mechanical (M), thermal (TL) and electrical (E) stimuli were applied to the dorsal metacarpus (M, left; TL, right) and the coronary band of the left thoracic limb (E), and a thoracic thermal stimulus (TT) was applied caudal to the withers in 8 horses (405 ± 43 kg). Stimulus intensities were increased until a clear avoidance response was detected, without exceeding 20 N (M), 60°C (TL and TT) or 15 V (E). For each set of tests, 3 real stimuli and one sham stimulus were applied (32 per animal) using a blinded, randomised, crossover design repeated after 6 months. Frequency distributions were calculated and, for each stimulus, Chi-square and McNemar tests compared the proportion of positive responses detected by the 2 observers and between the 2 study phases. κ coefficients estimated interobserver agreement in determining endpoints. Sensitivity (384 tests) and specificity (128 tests) were evaluated for each nociceptive stimulus to assess the evaluators' accuracy in detecting real and sham stimuli. Nociceptive thresholds were 3.1 ± 2 N (M), 8.1 ± 3.8 V (E), 51.4 ± 5.5°C (TL) and 55.2 ± 5.3°C (TT). The level of agreement for all tests combined and for M, E, TL and TT was 90, 100, 84, 98 and 75%, respectively. Sensitivity was 89, 100, 89, 98 and 70% and specificity 92, 97, 88, 91 and 94%, respectively. The high interobserver agreement, sensitivity and specificity suggest that the M, E and TL tests are valid for pain studies in horses and are suitable tools for investigating the antinociceptive effects of analgesics.
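
As a rough illustration of the accuracy measures reported above, the sketch below shows how sensitivity, specificity and the κ coefficient are conventionally computed from binary detection outcomes; the function names and the counts used are hypothetical and are not data from the study.

# Minimal sketch (not the authors' analysis code): sensitivity, specificity and
# Cohen's kappa from paired binary observer judgements. Counts are placeholders.

def sensitivity(true_pos, false_neg):
    """Proportion of real stimuli correctly detected."""
    return true_pos / (true_pos + false_neg)

def specificity(true_neg, false_pos):
    """Proportion of sham stimuli correctly identified as negative."""
    return true_neg / (true_neg + false_pos)

def cohens_kappa(a, b):
    """Interobserver agreement for two binary rating lists (1 = response detected)."""
    n = len(a)
    p_observed = sum(x == y for x, y in zip(a, b)) / n
    p_yes = (sum(a) / n) * (sum(b) / n)
    p_no = (1 - sum(a) / n) * (1 - sum(b) / n)
    p_expected = p_yes + p_no
    return (p_observed - p_expected) / (1 - p_expected)

if __name__ == "__main__":
    print(sensitivity(true_pos=89, false_neg=11))    # hypothetical counts
    print(specificity(true_neg=92, false_pos=8))     # hypothetical counts
    print(cohens_kappa([1, 1, 0, 1, 0, 1], [1, 1, 0, 0, 0, 1]))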

Relevance:

30.00%

Publisher:

Abstract:

A model for the joint economic design of X̄ and R control charts is developed. The model assumes that the process is subject to two assignable causes: one shifts the process mean, the other shifts the process variance. The occurrence of an assignable cause of one kind does not block the occurrence of an assignable cause of the other kind; consequently, a second process parameter can go out of control after the first process parameter has gone out of control. A numerical study of the cost surface of the model revealed that it is convex, at least in the region of interest.
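
For background on the charts being designed, the sketch below computes conventional (statistically based, not economically optimized) control limits for a joint X̄/R scheme from rational subgroups; it does not implement the cost model of the abstract, and the subgroup size n = 5 with its tabulated constants is an assumption.

# Background sketch only: conventional control limits for a joint X-bar/R scheme.
# This does NOT implement the economic design (cost) model described above.
# A2, D3, D4 are the standard tabulated Shewhart constants for subgroup size n = 5.

A2, D3, D4 = 0.577, 0.0, 2.114

def xbar_r_limits(subgroups):
    """Return (X-bar chart limits, R chart limits) from a list of rational subgroups."""
    xbars = [sum(g) / len(g) for g in subgroups]
    ranges = [max(g) - min(g) for g in subgroups]
    xbarbar = sum(xbars) / len(xbars)      # grand mean
    rbar = sum(ranges) / len(ranges)       # mean range
    xbar_limits = (xbarbar - A2 * rbar, xbarbar, xbarbar + A2 * rbar)
    r_limits = (D3 * rbar, rbar, D4 * rbar)
    return xbar_limits, r_limits

if __name__ == "__main__":
    data = [[10.1, 9.8, 10.0, 10.2, 9.9],   # illustrative subgroups of size 5
            [10.0, 10.3, 9.7, 10.1, 10.0]]
    print(xbar_r_limits(data))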

Relevance:

30.00%

Publisher:

Abstract:

Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)

Relevance:

30.00%

Publisher:

Abstract:

Wood is a renewable material with unique characteristics that stem from its orthotropic properties. The objective of this study was to develop and implement an informational tool for the Selection of Materials and Manufacturing Processes (SMPF) appropriate to product design activity with woods, composed of a distributed digital information system and an ordered collection of physical samples. The design of a product carries with it the choice of a material and the choice of a manufacturing process. Information on materials and manufacturing processes is available in different contents, media and interfaces; however, it is not systematized so that it can be retrieved according to the designer's needs, especially for wood. This set of methods is called the Selection of Materials and Manufacturing Processes. It is hoped that the SMPF methodologies developed in this study will be employed by product designers, architects and engineers in Brazil.
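
At its core, a digital information system of this kind answers structured queries over material and process records; the minimal sketch below is a hypothetical illustration of such a query, not the SMPF system itself, and the species names, property fields and process labels are placeholders.

# Hypothetical sketch of the kind of query a materials-and-processes selection
# system supports. Records and field names are illustrative, not SMPF data.

CATALOG = [
    {"species": "Wood A", "density_kg_m3": 550, "processes": {"turning", "laminating"}},
    {"species": "Wood B", "density_kg_m3": 720, "processes": {"machining", "steam bending"}},
]

def select(catalog, max_density, required_process):
    """Return woods under the density limit that support the required process."""
    return [w["species"] for w in catalog
            if w["density_kg_m3"] <= max_density and required_process in w["processes"]]

print(select(CATALOG, max_density=600, required_process="turning"))  # ['Wood A']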

Relevance:

30.00%

Publisher:

Abstract:

This paper summarizes research developed from a concern with a company's strategic planning, specifically regarding the product development process and its methodological guidelines. Our goal was to check whether there is an ideal methodology, peculiar to graphic design, that could be applied to the conditions of a printed newspaper, and then to assess its relevance in light of design practice in newspapers. We did not intend to create a specific graphic design methodology for daily newspapers, nor to analyse existing methods; rather, we wanted to emphasize that familiarity with acknowledged methods in the field of visual communication, acquired during professional formation, can ease good choices in design practice. Understanding the gradual introduction of graphic design into printed daily newspapers thus not only brings about their visual improvement but also allows this kind of journalism, which depends on a graphic interface to realise its product, to reconsider the newspaper as a whole.

Relevance:

30.00%

Publisher:

Abstract:

Rationalism appears as a philosophical current in which mental process and logic are emphasized, with method as the main characteristic of modern thought; for design, the Bauhaus represents the way industry stripped away ornament in pursuit of the ideal of form and function. Contemporary design, in turn, recovers the emotional: beyond the aesthetic, practical and symbolic functions an object can have, emotional identification also matters, prompting a review of design methods. This article relates this context to the brilliance with which the couple Charles and Ray Eames created a markedly emotional method for design: starting with their work with moulded plywood and then with access to various technologies resulting from the post-war period, they built a very interesting design process combining art and technique, producing chairs that are ageless and easily found in the contemporary world.

Relevance:

30.00%

Publisher:

Abstract:

Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)

Relevance:

30.00%

Publisher:

Abstract:

Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)

Relevance:

30.00%

Publisher:

Abstract:

Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)

Relevance:

30.00%

Publisher:

Abstract:

1. Distance sampling is a widely used technique for estimating the size or density of biological populations. Many distance sampling designs and most analyses use the software Distance. 2. We briefly review distance sampling and its assumptions, outline the history, structure and capabilities of Distance, and provide hints on its use. 3. Good survey design is a crucial prerequisite for obtaining reliable results. Distance has a survey design engine, with a built-in geographic information system, that allows properties of different proposed designs to be examined via simulation, and survey plans to be generated. 4. A first step in analysis of distance sampling data is modeling the probability of detection. Distance contains three increasingly sophisticated analysis engines for this: conventional distance sampling, which models detection probability as a function of distance from the transect and assumes all objects at zero distance are detected; multiple-covariate distance sampling, which allows covariates in addition to distance; and mark–recapture distance sampling, which relaxes the assumption of certain detection at zero distance. 5. All three engines allow estimation of density or abundance, stratified if required, with associated measures of precision calculated either analytically or via the bootstrap. 6. Advanced analysis topics covered include the use of multipliers to allow analysis of indirect surveys (such as dung or nest surveys), the density surface modeling analysis engine for spatial and habitat modeling, and information about accessing the analysis engines directly from other software. 7. Synthesis and applications. Distance sampling is a key method for producing abundance and density estimates in challenging field conditions. The theory underlying the methods continues to expand to cope with realistic estimation situations. In step with theoretical developments, state-of-the-art software that implements these methods is described that makes the methods accessible to practicing ecologists.
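
As a textbook illustration of the conventional distance sampling engine described above (not code from the Distance software), the sketch below evaluates a half-normal detection function, its effective strip half-width, and the resulting line-transect density estimate; the scale parameter, truncation distance and survey numbers are made up.

# Textbook sketch of conventional distance sampling (not the Distance software):
# half-normal detection function g(x) = exp(-x^2 / (2*sigma^2)), its effective
# strip half-width mu, and a line-transect density estimate D = n / (2 * mu * L).
import math

def g(x, sigma):
    """Half-normal detection probability at perpendicular distance x."""
    return math.exp(-x * x / (2.0 * sigma * sigma))

def effective_half_width(sigma, w, steps=10_000):
    """mu = integral of g(x) from 0 to truncation distance w (trapezoid rule)."""
    h = w / steps
    total = 0.5 * (g(0.0, sigma) + g(w, sigma))
    total += sum(g(i * h, sigma) for i in range(1, steps))
    return total * h

def density(n_detections, sigma, w, line_length):
    """Objects per unit area along transects of total length line_length."""
    mu = effective_half_width(sigma, w)
    return n_detections / (2.0 * mu * line_length)

if __name__ == "__main__":
    # 96 detections, sigma = 25 m, truncation 100 m, 10 km of transect (all illustrative)
    print(density(n_detections=96, sigma=25.0, w=100.0, line_length=10_000.0))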

Relevance:

30.00%

Publisher:

Abstract:

During autumn 2003, several thousand European starlings (Sturnus vulgaris) began roosting on exposed I-beams in a newly constructed, decorative glass canopy that covered the passenger pick-up area at the terminal building for Cleveland Hopkins International Airport, Ohio. The use of lethal control or conventional dispersal techniques, such as pyrotechnics and fire hoses, was not feasible in the airport terminal area. The design and aesthetics of the structure precluded the use of netting and other exclusion materials. In January 2004, an attempt was made to disperse the birds using recorded predator and distress calls broadcast from speakers installed in the structure. This technique failed to disperse the birds. In February 2004, we developed a technique using compressed air to physically and audibly harass the birds. We used a trailer-mounted commercial air compressor producing 185 cubic feet per minute of air at 100 pounds per square inch of pressure and a 20-foot long, 1-inch diameter PVC pipe attached to the outlet hose. One person slowly (< 5 mph) drove a pick-up truck through the airport terminal at dusk while a second person sat on a bench in the truck bed and directed the compressed air from the pipe into the canopy to harass starlings attempting to enter the roost site. After 5 consecutive nights of compressed-air harassment, virtually no starlings attempted to roost in the canopy. Once familiar with the physical effects of the compressed air, the birds dispersed at the sound of the air alone. Only occasional harassment at dusk was needed through the remainder of the winter to keep the canopy free of starlings. Similar harassment with the compressor was conducted successfully in autumn 2004 with the addition of a modified leaf blower, wooden clappers, and a laser. In conclusion, we found compressed air to be a safe, unobtrusive, and effective method for dispersing starlings from an urban roost site. This technique would likely be applicable to other urban-roosting species such as crows, house sparrows, and blackbirds.

Relevance:

30.00%

Publisher:

Abstract:

This mixed methods concurrent triangulation design study was predicated upon two models that advocate a connection between teaching presence and perceived learning: the Community of Inquiry Model of Online Learning developed by Garrison, Anderson, and Archer (2000), and the Online Interaction Learning Model by Benbunan-Fich, Hiltz, and Harasim (2005). The objective was to learn how teaching presence impacted students' perceptions of learning and sense of community in intensive online distance education courses developed and taught by instructors at a regional comprehensive university. In the quantitative phase, online surveys collected relevant data from participating students (N = 397) and selected instructional faculty (N = 32) during the second week of a three-week Winter Term. Student information included demographics such as age, gender, employment status, and distance from campus; perceptions of teaching presence; sense of community; perceived learning; course length; and course type. The student data showed positive relationships between teaching presence, perceived learning, and sense of community. The instructor data showed similar positive relationships, with no significant differences when the student and instructor data were compared. The qualitative phase consisted of interviews with 12 instructors who had completed the online survey and replied to all of the open-response questions. The two phases were integrated using matrix generation, and the analysis allowed for conclusions regarding teaching presence, perceived learning, and sense of community. The findings were equivocal with regard to satisfaction with course length and the relative importance of the teaching presence components. A model was provided depicting relationships between and among teaching presence components, perceived learning, and sense of community in intensive online courses.

Relevance:

30.00%

Publisher:

Abstract:

Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)

Relevance:

30.00%

Publisher:

Abstract:

Major depressive disorder (MDD) trials - investigating either non-pharmacological or pharmacological interventions - have shown mixed results. Many reasons explain this heterogeneity, but one that stands out is trial design, owing to specific challenges in the field. We therefore aimed to review the methodology of non-invasive brain stimulation (NIBS) trials and provide a framework to improve clinical trial design. We performed a systematic review of randomized, controlled MDD trials whose intervention was repetitive transcranial magnetic stimulation (rTMS) or transcranial direct current stimulation (tDCS), searching MEDLINE and other databases from April 2002 to April 2008. We created an unstructured checklist based on the CONSORT guidelines to extract items such as power analysis, sham method, blinding assessment, allocation concealment, operational criteria used for MDD, definition of refractory depression and primary study hypotheses. Thirty-one studies were included. We found that the main methodological issues can be divided into three groups: (1) issues related to phase II/small trials; (2) issues related to MDD trials; and (3) issues specific to NIBS studies. Taken together, they can threaten study validity and lead to inconclusive results. Feasible solutions include: estimating the sample size a priori; measuring the degree of refractoriness of the subjects; specifying the primary hypothesis and statistical tests; controlling predictor variables through stratified randomization or strict eligibility criteria; adjusting the study design to the target population; using adaptive designs; and exploring NIBS efficacy employing biological markers. In conclusion, our study summarizes the main methodological issues of NIBS trials and proposes a number of alternatives to manage them. Copyright (C) 2011 John Wiley & Sons, Ltd.
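
One of the remedies listed above is estimating the sample size a priori; the sketch below shows a standard a priori calculation for a two-arm parallel trial comparing means (active vs. sham) under a normal approximation. The effect size, alpha and power used here are illustrative assumptions, not values from the review.

# Hedged sketch of an a priori sample-size calculation for a two-arm parallel
# trial comparing means; inputs below are illustrative, not from the review.
from statistics import NormalDist
import math

def n_per_arm(effect_size, alpha=0.05, power=0.80):
    """Sample size per arm for a two-sided z-test on the standardized mean difference."""
    z = NormalDist()
    z_alpha = z.inv_cdf(1 - alpha / 2)
    z_beta = z.inv_cdf(power)
    return math.ceil(2 * ((z_alpha + z_beta) / effect_size) ** 2)

if __name__ == "__main__":
    # Assuming a moderate standardized effect (Cohen's d = 0.5), 5% alpha, 80% power
    print(n_per_arm(0.5))   # about 63 participants per arm under the normal approximation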

Relevance:

30.00%

Publisher:

Abstract:

The design of a network is a solution to several engineering and science problems. Several network design problems are known to be NP-hard, and population-based metaheuristics like evolutionary algorithms (EAs) have been largely investigated for such problems. Such optimization methods simultaneously generate a large number of potential solutions to investigate the search space in breadth and, consequently, to avoid local optima. Obtaining a potential solution usually involves the construction and maintenance of several spanning trees, or more generally, spanning forests. To efficiently explore the search space, special data structures have been developed to provide operations that manipulate a set of spanning trees (population). For a tree with n nodes, the most efficient data structures available in the literature require time O(n) to generate a new spanning tree that modifies an existing one and to store the new solution. We propose a new data structure, called node-depth-degree representation (NDDR), and we demonstrate that using this encoding, generating a new spanning forest requires average time O(√n). Experiments with an EA based on NDDR applied to large-scale instances of the degree-constrained minimum spanning tree problem have shown that the implementation adds small constants and lower order terms to the theoretical bound.
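
To make the encoding idea concrete, the sketch below implements a simplified node-depth style representation in which a subtree is a contiguous slice of a preorder array; it is an illustration of the general approach only, not the full NDDR of the paper, which also stores node degrees and achieves the O(√n) average bound through more elaborate forest operations. The example graph is arbitrary.

# Simplified sketch of a node-depth encoding for a spanning tree (not the full
# NDDR): nodes are stored in preorder together with their depth, so a subtree
# is a contiguous slice of the array and can be detached or transferred by
# array manipulation.

def node_depth(tree_adj, root):
    """Preorder (node, depth) list for a tree given as an adjacency dict."""
    out, stack, seen = [], [(root, 0)], {root}
    while stack:
        node, depth = stack.pop()
        out.append((node, depth))
        for nb in reversed(tree_adj[node]):
            if nb not in seen:
                seen.add(nb)
                stack.append((nb, depth + 1))
    return out

def subtree_slice(nd, i):
    """Contiguous slice of the encoding corresponding to the subtree rooted at position i."""
    root_depth = nd[i][1]
    j = i + 1
    while j < len(nd) and nd[j][1] > root_depth:
        j += 1
    return nd[i:j]

if __name__ == "__main__":
    tree = {0: [1, 4], 1: [0, 2, 3], 2: [1], 3: [1], 4: [0]}
    nd = node_depth(tree, root=0)
    print(nd)                    # [(0, 0), (1, 1), (2, 2), (3, 2), (4, 1)]
    print(subtree_slice(nd, 1))  # subtree rooted at node 1: [(1, 1), (2, 2), (3, 2)]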