964 results for Probable Number Technique
Abstract:
Introduction: This article presents an alternative intervention for the prevention and control of back pain among workers at a plant producing geotextiles for the construction industry, who are exposed to manual handling and awkward postures, through the implementation of a Back School based on the CORE technique. This technique trains the musculature that stabilizes the spine; its benefits are to give the muscular complex of the back stability, to prevent musculoskeletal injuries, and to improve posture. Objective: To present the results of implementing the Back School with the CORE technique for the prevention of back pain in a population of forty-eight male workers. Materials and methods: The Back School began with awareness talks by the occupational health physician, who explained its objectives and benefits to all participants. All plant employees were then evaluated to establish their health status through the PAR-Q questionnaire; their perception of pain was surveyed using the visual analog scale (VAS), and spinal stability was determined through the CORE assessment in order to define the training plan. Re-evaluations were then carried out every six months, together with a perception survey of the participants, to identify the impact of the Back School on the two variables of interest (pain perception and spinal stability). Results: According to the VAS, the number of asymptomatic workers increased by 12%; in the satisfaction survey, 94% of the population reported that the technique decreased muscle fatigue at the lumbar level, and 96% reported an improvement in the performance of their work activities. Discussion: The analysis of the results indicates that a Back School practiced through the CORE technique contributes to the prevention and/or control of symptoms at the lumbar level in production-sector populations exposed to risks derived from physical load, provided that its development is continuous and supervised by a competent professional.
Abstract:
A parallel hardware random number generator for use with a VLSI genetic algorithm processing device is proposed. The design uses a systolic array of mixed congruential random number generators. The generators are constantly reseeded with the outputs of the preceding generators to avoid significant biasing of the randomness of the array, which would result in longer times for the algorithm to converge to a solution.

1 Introduction

In recent years there has been a growing interest in developing hardware genetic algorithm devices [1, 2, 3]. A genetic algorithm (GA) is a stochastic search and optimization technique which attempts to capture the power of natural selection by evolving a population of candidate solutions through a process of selection and reproduction [4]. In keeping with the evolutionary analogy, the solutions are called chromosomes, with each chromosome containing a number of genes. Chromosomes are commonly simple binary strings, the bits being the genes.
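As an illustration of the reseeding scheme described above, here is a minimal Python sketch of a ring of mixed (linear) congruential generators, each reseeded with the output of the preceding generator after every step. The constants and the ring topology are illustrative assumptions, not the paper's VLSI design.

# Sketch: a ring of mixed (linear) congruential generators in which each
# generator is reseeded with the output of the preceding one after every
# step. Constants are a common 32-bit LCG choice, not the paper's.
M = 2**32           # modulus
A = 1664525         # multiplier
C = 1013904223      # increment

class MixedCongruentialGenerator:
    def __init__(self, seed):
        self.state = seed % M

    def next_value(self):
        self.state = (A * self.state + C) % M
        return self.state

def step_array(generators):
    # Advance every generator once, then reseed each one with the output
    # of its predecessor (generator 0 is reseeded by the last one).
    outputs = [g.next_value() for g in generators]
    for i, g in enumerate(generators):
        g.state = outputs[i - 1]
    return outputs

generators = [MixedCongruentialGenerator(s) for s in (1, 7, 42, 1234)]
for _ in range(3):
    print(step_array(generators))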
Abstract:
A method is proposed to determine the extent of degradation in the rumen using a two-stage mathematical modeling process. In the first stage, a statistical model shifts (or maps) the gas accumulation profile obtained with a fecal inoculum onto a ruminal gas profile. A kinetic model then determines the extent of degradation in the rumen from the shifted profile. The kinetic model is presented as a generalized mathematical function, allowing any one of a number of alternative equation forms to be selected. This method might allow the gas production technique to become an approach for determining the extent of degradation in the rumen, decreasing the need for surgically modified animals while still maintaining the link with the animal. Further research is needed before the proposed methodology can be used as a standard method across a range of feeds.
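The abstract leaves the equation form of the kinetic model open. As one concrete instance, a first-order gas accumulation model G(t) = A(1 − e^(−k(t − L))) could be fitted to the shifted profile and an extent-of-degradation proxy read off it; the following Python sketch assumes that form, with illustrative data.

# Sketch: fit a first-order gas accumulation curve to a (shifted) gas
# profile. The form G(t) = A*(1 - exp(-k*(t - L))) is only one of the
# alternative equations the generalized function could take.
import numpy as np
from scipy.optimize import curve_fit

def gas_profile(t, A, k, L):
    return A * (1.0 - np.exp(-k * np.clip(t - L, 0.0, None)))

t = np.array([2, 4, 8, 12, 24, 48, 72], dtype=float)    # incubation times (h)
G = np.array([5, 12, 25, 34, 48, 58, 61], dtype=float)  # gas volumes (mL)

(A_fit, k_fit, L_fit), _ = curve_fit(gas_profile, t, G, p0=(60.0, 0.05, 1.0))

# Illustrative proxy for extent of degradation: the fraction of the
# asymptotic gas pool produced by a nominal rumen residence time.
t_rumen = 48.0
extent = gas_profile(t_rumen, A_fit, k_fit, L_fit) / A_fit
print(f"A={A_fit:.1f} mL, k={k_fit:.3f}/h, lag={L_fit:.1f} h, extent={extent:.2f}")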
Abstract:
Very large scale scheduling and planning tasks cannot be effectively addressed by fully automated schedule optimisation systems, since many key factors which govern 'fitness' in such cases are unformalisable. This raises the question of an interactive (or collaborative) approach, where fitness is assigned by the expert user. Though well researched in the domains of interactively evolved art and music, this method is as yet rarely used in logistics. This paper concerns a difficulty shared by all interactive evolutionary systems (IESs), but especially those used for logistics or design problems: objective evaluation of IESs is severely hampered by the need for expert humans in the loop. This makes it effectively impossible, for example, to determine with statistical confidence any ranking among a decent number of configurations for the parameters and strategy choices. We make headway on this difficulty with an Automated Tester (AT) for such systems. The AT replaces the human in experiments, and has parameters controlling its decision-making accuracy (modelling human error) and a built-in notion of a target solution, which may typically be at odds with the solution that is optimal in terms of formalisable fitness. Using the AT, plausible evaluations of alternative designs for the IES can be done, allowing for (and examining the effects of) different levels of user error. We describe such an AT for evaluating an IES for very large scale planning.
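A minimal sketch of such an Automated Tester, under the assumptions that candidates can be scored by distance to the AT's built-in target and that decision error is modelled as a probability of choosing the worse of two candidates; all names and parameters here are illustrative.

# Sketch: an Automated Tester that stands in for the human evaluator of an
# interactive evolutionary system. It prefers candidates closer to its own
# target solution, but with probability error_rate picks the worse one,
# modelling imperfect human judgement.
import random

class AutomatedTester:
    def __init__(self, target, error_rate=0.1):
        self.target = target          # the AT's notion of the ideal solution
        self.error_rate = error_rate  # probability of a wrong decision

    def distance(self, candidate):
        return sum((a - b) ** 2 for a, b in zip(candidate, self.target))

    def choose(self, cand_a, cand_b):
        # Return the preferred of two candidates, as a human rater would.
        better, worse = sorted((cand_a, cand_b), key=self.distance)
        return worse if random.random() < self.error_rate else better

at = AutomatedTester(target=(3.0, 7.0, 1.0), error_rate=0.15)
print(at.choose((2.0, 6.0, 1.0), (9.0, 0.0, 4.0)))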
Abstract:
This paper describes a method for reconstructing 3D frontier points, contour generators, and surfaces of anatomical objects or smooth surfaces from a small number (e.g., 10) of conventional 2D X-ray images. The X-ray images are taken from different viewing directions with full prior knowledge of the X-ray source and sensor configurations. Unlike previous works, we empirically demonstrate that if the viewing directions are uniformly distributed around the object's viewing sphere, the reconstructed 3D points automatically cluster closely on highly curved parts of the surface and are widely spread on smooth or flat parts. The advantage of this property is that the reconstructed points along a surface or a contour generator are not under-sampled or under-represented, because surfaces and contours should be sampled more densely where their curvature is high. The more complex the contour's shape, the greater the number of points required, and the greater the number of points automatically generated by the proposed method. Given that the number of viewing directions is fixed and the viewing directions are uniformly distributed, the number and distribution of the reconstructed points depend on the shape or curvature of the surface, regardless of the size of the surface or of the object. The technique may be used not only in medicine but also in industrial applications.
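The abstract requires uniformly distributed viewing directions but does not prescribe how to generate them; a Fibonacci lattice, sketched below in Python, is one common construction for approximately uniform directions on the viewing sphere.

# Sketch: N approximately uniform viewing directions on the unit sphere via
# a Fibonacci lattice. This particular construction is an assumption; the
# paper only requires that the directions be uniformly distributed.
import math

def fibonacci_sphere(n):
    golden = math.pi * (3.0 - math.sqrt(5.0))   # golden angle (radians)
    directions = []
    for i in range(n):
        z = 1.0 - 2.0 * (i + 0.5) / n           # uniform spacing in z
        r = math.sqrt(1.0 - z * z)
        theta = golden * i
        directions.append((r * math.cos(theta), r * math.sin(theta), z))
    return directions

for d in fibonacci_sphere(10):                   # e.g., 10 X-ray views
    print("viewing direction: (%.3f, %.3f, %.3f)" % d)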
Abstract:
Genetic algorithms (GAs) have been introduced into site layout planning as reported in a number of studies. In these studies, the objective functions were defined so as to employ the GAs in searching for the optimal site layout. However, few studies have been carried out to investigate the actual closeness of relationships between site facilities; it is these relationships that ultimately govern the site layout. This study has determined that the underlying factors of site layout planning for medium-size projects include work flow, personnel flow, safety and environment, and personal preferences. By finding the weightings on these factors and the corresponding closeness indices between each facility, a closeness relationship has been deduced. Two contemporary mathematical approaches - fuzzy logic theory and an entropy measure - were adopted in finding these results in order to minimize the uncertainty and vagueness of the collected data and improve the quality of the information. GAs were then applied to searching for the optimal site layout in a medium-size government project using the GeneHunter software. The objective function involved minimizing the total travel distance. An optimal layout was obtained within a short time. This reveals that the application of GA to site layout planning is highly promising and efficient.
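As an illustration of the kind of objective function such a GA minimizes, the following Python sketch scores a candidate layout by its total travel distance weighted by the closeness indices between facilities; the data and names are illustrative, and the GA loop itself (selection, crossover, mutation) is omitted.

# Sketch: fitness evaluation for GA-based site layout planning. A candidate
# layout assigns each facility to a location; its cost is the sum of
# inter-facility distances weighted by the deduced closeness indices.
import math

locations = [(0, 0), (30, 0), (0, 40), (30, 40)]  # candidate positions (m)
closeness = {                                      # closeness index per pair
    ("office", "store"): 0.8,
    ("office", "plant"): 0.5,
    ("store", "plant"): 0.9,
}

def total_weighted_distance(assignment):
    # assignment maps facility name -> location index; lower is better
    cost = 0.0
    for (fa, fb), w in closeness.items():
        xa, ya = locations[assignment[fa]]
        xb, yb = locations[assignment[fb]]
        cost += w * math.hypot(xa - xb, ya - yb)
    return cost

print(total_weighted_distance({"office": 0, "store": 1, "plant": 3}))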
Abstract:
We present a new technique for obtaining model fittings to very long baseline interferometric images of astrophysical jets. The method minimizes a performance function proportional to the sum of the squared differences between the model and observed images. The model image is constructed by summing N_s elliptical Gaussian sources, each characterized by six parameters: two-dimensional peak position, peak intensity, eccentricity, amplitude, and orientation angle of the major axis. We present results for the fitting of two main benchmark jets: the first constructed from three individual Gaussian sources, the second formed by five Gaussian sources. Both jets were analyzed by our cross-entropy technique in finite and infinite signal-to-noise regimes, with the background noise chosen to mimic that found in interferometric radio maps. These images were constructed to simulate most of the conditions encountered in interferometric images of active galactic nuclei. We show that the cross-entropy technique is capable of recovering the parameters of the sources with an accuracy similar to that obtained from the very traditional Astronomical Image Processing System task IMFIT when the image is relatively simple (e.g., few components). For more complex interferometric maps, our method displays superior performance in recovering the parameters of the jet components. Our methodology is also able to show quantitatively the number of individual components present in an image. An additional application of the cross-entropy technique to a real image of a BL Lac object is shown and discussed. Our results indicate that our cross-entropy model-fitting technique must be used in situations involving the analysis of complex emission regions having more than three sources, even though it is substantially slower than current model-fitting tasks (at least 10,000 times slower on a single processor, depending on the number of sources to be optimized). As in the case of any model fitting performed in the image plane, caution is required in analyzing images constructed from a poorly sampled (u, v) plane.
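A minimal Python sketch of the model-image construction and squared-difference performance function described above; the six-parameter parametrization used here (peak position, peak intensity, major-axis width, eccentricity, orientation angle) is an assumption, and the cross-entropy optimization loop itself is not shown.

# Sketch: model image as a sum of elliptical Gaussian components, scored
# against the observed image by a sum of squared differences.
import numpy as np

def elliptical_gaussian(shape, x0, y0, peak, a, ecc, phi):
    # One component: a is the major-axis width, ecc the eccentricity,
    # phi the orientation angle of the major axis (radians).
    b = a * np.sqrt(1.0 - ecc ** 2)               # minor-axis width
    y, x = np.indices(shape, dtype=float)
    xr = (x - x0) * np.cos(phi) + (y - y0) * np.sin(phi)
    yr = -(x - x0) * np.sin(phi) + (y - y0) * np.cos(phi)
    return peak * np.exp(-0.5 * ((xr / a) ** 2 + (yr / b) ** 2))

def model_image(shape, components):
    return sum(elliptical_gaussian(shape, *c) for c in components)

def performance(observed, components):
    return np.sum((model_image(observed.shape, components) - observed) ** 2)

# Two illustrative components: (x0, y0, peak, a, ecc, phi)
comps = [(20, 20, 1.0, 3.0, 0.5, 0.3), (40, 28, 0.6, 5.0, 0.8, 1.2)]
observed = model_image((64, 64), comps) + 0.01 * np.random.randn(64, 64)
print(performance(observed, comps))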
Abstract:
The problem of projecting multidimensional data into lower dimensions has been pursued by many researchers due to its potential application to data analyses of various kinds. This paper presents a novel multidimensional projection technique based on least square approximations. The approximations compute the coordinates of a set of projected points based on the coordinates of a reduced number of control points with defined geometry. We name the technique Least Square Projections (LSP). From an initial projection of the control points, LSP defines the positioning of their neighboring points through a numerical solution that aims at preserving a similarity relationship between the points, given by a metric in mD. In order to perform the projection, a small number of distance calculations are necessary, and no repositioning of the points is required to obtain a final solution with satisfactory precision. The results show the capability of the technique to form groups of points by degree of similarity in 2D. We illustrate that capability through its application to mapping collections of textual documents from varied sources, a strategic yet difficult application. LSP is faster and more accurate than other existing high-quality methods, particularly where it was mostly tested, that is, for mapping text sets.
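A hedged Python sketch of the least-squares placement step of an LSP-like projection: the control points receive fixed 2D coordinates, every other point is asked to sit at the centroid of its neighbors in mD, and the resulting overdetermined linear system is solved in the least-squares sense. The neighborhoods and data are illustrative rather than the authors' exact formulation.

# Sketch: place points in 2D so that each sits near the centroid of its
# neighbors, subject to fixed control-point positions, via least squares.
import numpy as np

n = 5
neighbors = {0: [1, 2], 1: [0, 2], 2: [0, 1, 3], 3: [2, 4], 4: [3]}
controls = {0: (0.0, 0.0), 4: (1.0, 1.0)}   # pre-projected control points

rows, rhs = [], []
for i in range(n):                           # Laplacian-style equations
    row = np.zeros(n)
    row[i] = 1.0
    for j in neighbors[i]:
        row[j] = -1.0 / len(neighbors[i])
    rows.append(row)
    rhs.append((0.0, 0.0))
for i, xy in controls.items():               # control-point constraints
    row = np.zeros(n)
    row[i] = 1.0
    rows.append(row)
    rhs.append(xy)

P, *_ = np.linalg.lstsq(np.vstack(rows), np.array(rhs), rcond=None)
print(P)                                     # 2D coordinates of all points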
Abstract:
Public genealogical databases are becoming increasingly populated with historical data and records of the current population's ancestors. As this increasing amount of available information is used to link individuals to their ancestors, the resulting trees become deeper and denser, which justifies the need for organized, space-efficient layouts to display the data. Existing layouts are often only able to show a small subset of the data at a time. As a result, it is easy to become lost when navigating through the data or to lose sight of the overall tree structure. Conversely, leaving space for unknown ancestors allows one to better understand the tree's structure, but this space is expensive and allows fewer generations to be displayed at a time. In this work, we propose that an H-tree based layout be used in genealogical software to display ancestral trees. We show that this layout increases the number of displayable generations, provides a nicely arranged, symmetrical, intuitive, and organized fractal structure, improves the user's ability to understand and navigate through the data, and accounts for the visualization requirements necessary for displaying such trees. Finally, user-study results indicate potential for user acceptance of the new layout.
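A minimal Python sketch of an H-tree placement for an ancestral binary tree: each individual's two parents sit at the ends of alternating horizontal and vertical segments whose length halves with depth, giving the compact fractal structure described above. This illustrates the layout geometry only, not the paper's full algorithm.

# Sketch: recursive H-tree coordinates for an ancestral binary tree.
def h_tree_positions(x, y, length, depth, horizontal=True, positions=None):
    if positions is None:
        positions = [(x, y)]                 # the root individual
    if depth == 0:
        return positions
    if horizontal:
        parents = [(x - length, y), (x + length, y)]
    else:
        parents = [(x, y - length), (x, y + length)]
    for px, py in parents:
        positions.append((px, py))
        h_tree_positions(px, py, length / 2.0, depth - 1,
                         not horizontal, positions)
    return positions

# Four generations of ancestors placed around the root at the origin:
for p in h_tree_positions(0.0, 0.0, 8.0, 4):
    print(p)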
Abstract:
A new approach based on microextraction by packed sorbent (MEPS) combined with a reversed-phase high-throughput ultra-high pressure liquid chromatography (UHPLC) method, using gradient elution and diode array detection, is described for quantitating three biologically active flavonols in wines: myricetin, quercetin, and kaempferol. In addition to routine experiments establishing the validity of the assay against internationally accepted criteria (selectivity, linearity, sensitivity, precision, accuracy), experiments are included to assess the effect on MEPS performance of the important experimental parameters, such as the type of sorbent material (C2, C8, C18, SIL, and C8/SCX), the number of extraction cycles (extract-discard), elution volume, sample volume, and ethanol content. The optimal MEPS extraction conditions used the C8 sorbent and small sample volumes (250 μL) in five extraction cycles, within a short time period (about 5 min for the entire sample preparation step). Under optimized conditions, excellent linearity (R² values > 0.9963), limits of detection from 0.006 μg mL−1 (quercetin) to 0.013 μg mL−1 (myricetin), and precision within 0.5–3.1% were observed for the target flavonols. The average recoveries of myricetin, quercetin, and kaempferol from real samples were 83.0–97.7%, with relative standard deviations (RSD, %) lower than 1.6%. The results showed that the most abundant flavonol in the analyzed samples was myricetin (5.8 ± 3.7 μg mL−1); quercetin (0.97 ± 0.41 μg mL−1) and kaempferol (0.66 ± 0.24 μg mL−1) were found at lower concentrations. The optimized MEPSC8 method was compared with a reference reverse-phase solid-phase extraction (SPE) procedure whose sorbent is a macroporous copolymer made from a balanced ratio of two monomers, the lipophilic divinylbenzene and the hydrophilic N-vinylpyrrolidone (Oasis HLB). The MEPSC8 approach offers an attractive alternative for the analysis of flavonols in wines, providing a number of advantages: the highest extraction efficiency (from 85.9 ± 0.9% to 92.1 ± 0.5%) in the shortest extraction time, low solvent consumption, fast sample throughput, and a procedure that is more environmentally friendly and easier to perform.
Abstract:
A novel analytical approach, based on a miniaturized extraction technique, microextraction by packed sorbent (MEPS), followed by ultra-high pressure liquid chromatography (UHPLC) separation combined with photodiode array (PDA) detection, has been developed and validated for the quantitative determination of sixteen biologically active phenolic constituents of wine. In addition to routine experiments establishing the validity of the assay against internationally accepted criteria (linearity, sensitivity, selectivity, precision, accuracy), experiments are included to assess the effect on MEPS performance of the important experimental parameters, such as the type of sorbent material (C2, C8, C18, SIL, and M1), the number of extraction cycles (extract-discard), elution volume, sample volume, and ethanol content. The optimal MEPS extraction conditions used the C8 sorbent and small sample volumes (250 μL) in five extraction cycles, within a short time period (about 5 min for the entire sample preparation step). The wine bioactive phenolics were eluted with 250 μL of a mixture containing 95% methanol and 5% water, and the separation was carried out on an HSS T3 analytical column (100 mm × 2.1 mm, 1.8 μm particle size) using a binary mobile phase composed of aqueous 0.1% formic acid (eluent A) and methanol (eluent B) in gradient elution mode (10 min of total analysis). The method gave satisfactory results in terms of linearity, with r² values > 0.9986 within the established concentration range. The LOD varied from 85 ng mL−1 (ferulic acid) to 0.32 μg mL−1 ((+)-catechin), whereas the LOQ values ranged from 0.028 μg mL−1 (ferulic acid) to 1.08 μg mL−1 ((+)-catechin). Typical recoveries ranged between 81.1 and 99.6% for red wines and between 77.1 and 99.3% for white wines, with relative standard deviations (RSD) no larger than 10%. The extraction yields of the MEPSC8/UHPLC–PDA methodology were between 78.1% (syringic acid) and 99.6% (o-coumaric acid) for red wines and between 76.2 and 99.1% for white wines. The inter-day precision, expressed as the relative standard deviation (RSD%), varied between 0.2% (p-coumaric and o-coumaric acids) and 7.5% (gentisic acid), while the intra-day precision varied between 0.2% (o-coumaric and cinnamic acids) and 4.7% (gallic acid and (−)-epicatechin). On the basis of the analytical validation, the MEPSC8/UHPLC–PDA methodology proves to be an improved, reliable, and ultra-fast approach for wine bioactive phenolics analysis, owing to its capability to determine several bioactive metabolites simultaneously in a single chromatographic run with high sensitivity, selectivity, and resolving power within only 10 min. Preliminary studies have been carried out on 34 real whole wine samples in order to assess the performance of the described procedure. The new approach offers decreased sample preparation and analysis time, and is moreover cheaper, more environmentally friendly, and easier to perform compared with traditional methodologies.
Abstract:
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)
Abstract:
Gelatin microparticles containing propolis extractive solution (PES) were prepared by the spray-drying technique. The optimization of the spray-drying operating conditions and of the proportions of gelatin and mannitol was investigated. Regular particle morphology was obtained when mannitol was used, whereas its absence produced a substantial number of coalesced and agglomerated microparticles. Microparticles had a mean diameter of 2.70 μm without mannitol and 2.50 μm with mannitol. The entrapment efficiency for propolis was up to 41% without mannitol and 39% with mannitol. Microencapsulation by the spray-drying technique maintained the activity of propolis against Staphylococcus aureus. These gelatin microparticles containing propolis would be useful for developing intermediate or final propolis dosage forms without the strong and unpleasant taste, aromatic odour, and ethanol content of the PES.
Abstract:
Introduction: So far, the only endovascular option to treat patients with thoracoabdominal aortic aneurysms is the deployment of branched grafts. We describe a technique consisting of the deployment of standard off-the-shelf grafts to treat urgent cases. Material and Methods: The sandwich technique consists of the deployment of ViaBahn chimney grafts in combination with standard thoracic and abdominal aortic stent grafts. The chimney grafts are deployed using a transbrachial and transaxillary access. These coaxial grafts are placed inside the thoracic tube graft. After deployment of the infrarenal bifurcated abdominal graft, a bridging stent (a short tube graft) is positioned inside the thoracic graft, further stabilizing the chimney grafts. Results: Five patients with symptomatic thoracoabdominal aneurysms were treated. There was one Type I endoleak, which resolved after 2 months. In all patients, 3 stent grafts had to be used. When possible, all visceral and renal branches were revascularized. A total of 17 arteries were reconnected with covered branches. During follow-up we lost one target vessel, the right renal artery. Conclusion: The sandwich technique in combination with chimney grafts permits a total endovascular exclusion of thoracoabdominal aortic aneurysms. In all cases, off-the-shelf products and grafts could be used. The number of patients treated so far is still too small to draw more robust conclusions with regard to long-term performance and durability.
Abstract:
This clinical report describes a method to reduce the number of clinical sessions required for rehabilitation with implant-supported fixed dentures, through a simplified and versatile procedure indicated mainly for immediate loading. With this method, immediate implant-supported fixed dentures for edentulous patients can be safely fabricated within 2 days. In this technique, the wax teeth are prepared on a base of light-polymerized resin, and both the wax-teeth and metallic-superstructure try-ins are accomplished in the same session.