923 results for Direct method


Relevance:

40.00%

Publisher:

Abstract:

The tremendous application potential of nanosized materials stands in sharp contrast to a growing number of critical reports on their potential toxicity. Application of in vitro methods to assess nanoparticles is severely limited by the difficulty of exposing cells of the respiratory tract directly to airborne engineered nanoparticles. We present a completely new approach in which lung cells are exposed to particles generated in situ by flame spray synthesis. Cerium oxide nanoparticles from a single production run were deposited directly onto the surface of cultured lung cells inside a glovebox. Separately collected samples were used to measure the hydrodynamic particle size distribution, shape, and agglomerate morphology. Cell viability was not impaired by the conditions of the glovebox exposure. The tightness of the lung cell monolayer, the mean total lamellar body volume, and the generation of oxidative DNA damage revealed a dose-dependent cellular response to the airborne engineered nanoparticles. The direct combination of production and exposure allows particle toxicity to be studied in a simple and reproducible way under environmental conditions.

Relevance:

40.00%

Publisher:

Abstract:

OBJECTIVE: To develop a behavioural observation method to simultaneously assess distractors and communication/teamwork during surgical procedures through direct, on-site observations; to establish the reliability of the method for long (>3 h) procedures. METHODS: Observational categories for an event-based coding system were developed based on expert interviews, observations and a literature review. Using Cohen's κ and the intraclass correlation coefficient, interobserver agreement was assessed for 29 procedures. Agreement was calculated for the entire surgery, and for the 1st hour. In addition, interobserver agreement was assessed between two tired observers and between a tired and a non-tired observer after 3 h of surgery. RESULTS: The observational system has five codes for distractors (door openings, noise distractors, technical distractors, side conversations and interruptions), eight codes for communication/teamwork (case-relevant communication, teaching, leadership, problem solving, case-irrelevant communication, laughter, tension and communication with external visitors) and five contextual codes (incision, last stitch, personnel changes in the sterile team, location changes around the table and incidents). Based on 5-min intervals, Cohen's κ was good to excellent for distractors (0.74-0.98) and for communication/teamwork (0.70-1). Based on frequency counts, intraclass correlation coefficient was excellent for distractors (0.86-0.99) and good to excellent for communication/teamwork (0.45-0.99). After 3 h of surgery, Cohen's κ was 0.78-0.93 for distractors, and 0.79-1 for communication/teamwork. DISCUSSION: The observational method developed allows a single observer to simultaneously assess distractors and communication/teamwork. Even for long procedures, high interobserver agreement can be achieved. Data collected with this method allow for investigating separate or combined effects of distractions and communication/teamwork on surgical performance and patient outcomes.
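
As a rough illustration of the interval-based agreement statistic reported above, the sketch below computes Cohen's κ for a single observational code from two observers' 5-minute interval codings. The data and variable names are hypothetical; only scikit-learn's cohen_kappa_score is assumed.

# Hypothetical sketch: per-interval agreement for one observational code
# (e.g. "door openings" marked present/absent in each 5-min interval).
from sklearn.metrics import cohen_kappa_score

observer_a = [1, 0, 1, 1, 0, 0, 1, 0, 1, 1]  # observer A, one value per interval
observer_b = [1, 0, 1, 0, 0, 0, 1, 0, 1, 1]  # observer B, same intervals

kappa = cohen_kappa_score(observer_a, observer_b)
print(f"Cohen's kappa: {kappa:.2f}")

Frequency-count agreement (the intraclass correlation reported in the abstract) would be computed analogously from per-procedure event counts.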

Relevance:

40.00%

Publisher:

Abstract:

The boundary element method is especially well suited for the analysis of the seismic response of valleys with complicated topography and stratigraphy. In this paper the method's capabilities are illustrated using as an example an irregularly stratified sedimentary basin (a test site) that has been modelled using a 2D discretization and the Direct Boundary Element Method (DBEM). Site models displaying different levels of complexity are used in practice. The multi-layered model's seismic response shows generally good agreement with observed data in terms of amplification levels, fundamental frequencies and high spatial variability. Still, important features such as the location of high-frequency peaks are missing. Even simplified 2D models reveal important characteristics of the wave field that 1D modelling does not capture.

Relevance:

40.00%

Publisher:

Abstract:

The Direct Boundary Element Method (DBEM) is presented to solve the elastodynamic field equations in 2D, and a comprehensive implementation is given. The DBEM is a useful approach for obtaining reliable numerical estimates of site effects on seismic ground motion due to irregular geological configurations, both of layering and of topography. The method is based on the discretization of the classical Somigliana elastodynamic representation equation, which stems from the reciprocity theorem. This equation is written in terms of the Green's function, which is the full-space harmonic steady-state fundamental solution. The formulation permits the treatment of viscoelastic media, so site models with intrinsic attenuation can be examined. By means of this approach, the 2D scattering of seismic waves due to the incidence of P and SV waves on irregular topographical profiles is calculated. Sites such as canyons, mountains and valleys in irregular multilayered media are computed to test the technique. The obtained transfer functions show excellent agreement with already published results.
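
For reference, a standard frequency-domain form of the Somigliana representation on which such direct BEM discretizations are built (written here in common textbook notation, not necessarily the paper's) is

c_{ij}(\xi)\,u_j(\xi) = \int_{\Gamma} U^{*}_{ij}(\xi, x; \omega)\, t_j(x)\, d\Gamma(x) \;-\; \int_{\Gamma} T^{*}_{ij}(\xi, x; \omega)\, u_j(x)\, d\Gamma(x),

where u_j and t_j are the boundary displacements and tractions, U*_{ij} and T*_{ij} are the full-space displacement and traction fundamental solutions (Green's functions), and c_{ij} is a free term depending on the local boundary geometry. Discretizing the boundary Γ into elements turns this identity into the linear system solved by the DBEM.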

Relevance:

40.00%

Publisher:

Abstract:

Erosion potential and the effects of tillage can be evaluated from quantitative descriptions of soil surface roughness. The present study therefore aimed to fill the need for a reliable, low-cost and convenient method to measure that parameter. Based on the interpretation of micro-topographic shadows, this new procedure is primarily designed for use in the field after tillage. The principle underlying shadow analysis is the direct relationship between soil surface roughness and the shadows cast by soil structures under fixed sunlight conditions. The results obtained with this method were compared to the statistical indexes used to interpret field readings recorded by a pin meter. The tests were conducted on 4-m² sandy loam and sandy clay loam plots divided into 1-m² subplots tilled with three different tools: chisel, tiller and roller. The highly significant correlation between the statistical indexes and the shadow analysis results, obtained in the laboratory as well as in the field for all the soil-tool combinations, proved that both variability (CV) and dispersion (SD) are captured by the new method. This procedure simplifies the interpretation of soil surface roughness and shortens the time involved in field operations by a factor of 12 to 20.
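
The comparison described above can be sketched as follows. The sketch assumes pin-meter heights and a binarised shadow photograph are already available as NumPy arrays; all values and names are illustrative, not the paper's data or code.

# Illustrative sketch: pin-meter roughness indexes vs. a simple shadow measure.
import numpy as np

heights_mm = np.array([12.0, 18.5, 9.3, 22.1, 15.7, 11.2, 19.8, 14.4])  # pin-meter profile
sd = heights_mm.std(ddof=1)          # dispersion (SD)
cv = sd / heights_mm.mean()          # variability (CV)

shadow_image = np.random.rand(200, 200) < 0.35   # stand-in for a binarised shadow photo
shadow_fraction = shadow_image.mean()            # proportion of shadowed pixels

print(f"SD = {sd:.2f} mm, CV = {cv:.2f}, shadow fraction = {shadow_fraction:.2f}")

Across a set of plots, correlating shadow_fraction with SD and CV (e.g. with np.corrcoef) reproduces the kind of comparison reported in the study.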

Relevance:

40.00%

Publisher:

Abstract:

We have developed a technique, named RLGS spot-bombing (RLGS-SB), for isolating DNA markers tightly linked to a target region; it is based on restriction landmark genomic scanning (RLGS). RLGS-SB allows the genome of higher organisms to be scanned quickly and efficiently to identify loci linked to a target region or gene of interest. The method was initially tested by analyzing a C57BL/6-GusS mouse congenic strain. We identified 33 variant markers out of 10,565 total loci in a 4.2-centimorgan (cM) interval surrounding the Gus locus in 4 days of laboratory work. The validity of RLGS-SB for finding DNA markers linked to a target locus was also tested on pooled DNA from segregating backcross progeny by analyzing the spot intensity of already mapped RLGS loci. Finally, we used RLGS-SB to identify DNA markers closely linked to the mouse reeler (rl) locus on chromosome 5 by phenotypic pooling. A total of 31 RLGS loci were identified and mapped to the target region after screening 8856 loci. These 31 loci were mapped within 11.7 cM surrounding rl. The average density of RLGS loci located in the rl region was 0.38 cM. Three loci were closely linked to rl, showing a recombination frequency of 0/340, which corresponds to < 1 cM from rl. Thus, RLGS-SB provides an efficient and rapid method for the detection and isolation of polymorphic DNA markers linked to a trait or gene of interest.
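
The arithmetic behind the "< 1 cM" statement is simple: for small intervals, map distance in centimorgans is roughly 100 times the recombination fraction, and with zero recombinants a crude 95% upper bound on that fraction is about 3/n (the "rule of three"). The short sketch below uses only the counts quoted in the abstract.

# Rough illustration of the mapping arithmetic; not code from the study.
recombinants = 0
meioses = 340
point_estimate_cM = 100 * recombinants / meioses     # 0.00 cM
upper_bound_cM = 100 * 3 / meioses                   # ~0.88 cM ("rule of three")
print(point_estimate_cM, round(upper_bound_cM, 2))   # both below 1 cM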

Relevance:

40.00%

Publisher:

Abstract:

In this paper, we assess the relative performance of the direct valuation method and industry multiplier models using 41,435 firm-quarter Value Line observations over an 11-year period (1990–2000). Results from both pricing-error and return-prediction analyses indicate that direct valuation yields lower percentage pricing errors and greater return prediction ability than the forward price to aggregated forecasted earnings multiplier model. However, a simple hybrid combination of these two methods leads to more accurate intrinsic value estimates than either method used in isolation. It would appear that fundamental analysis could benefit from using one approach as a check on the other.
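
As a concrete reading of "percentage pricing error" and of the hybrid combination, the sketch below computes both for a handful of made-up firms. The equal 50/50 weighting of the two estimates is an illustrative assumption, not the weighting used in the paper.

# Illustrative sketch: percentage pricing errors and a naive hybrid estimate.
import numpy as np

price = np.array([25.0, 40.0, 18.0])        # observed market prices (hypothetical)
v_direct = np.array([23.0, 44.0, 16.5])     # direct-valuation estimates (hypothetical)
v_multiple = np.array([27.5, 37.0, 20.0])   # industry-multiplier estimates (hypothetical)

def mean_pct_pricing_error(estimate, price):
    return float(np.mean(np.abs(estimate - price) / price))

v_hybrid = 0.5 * v_direct + 0.5 * v_multiple   # assumed equal weighting
for name, v in (("direct", v_direct), ("multiplier", v_multiple), ("hybrid", v_hybrid)):
    print(name, round(mean_pct_pricing_error(v, price), 3))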

Relevance:

40.00%

Publisher:

Abstract:

The aim of this study was to develop a multiplex loop-mediated isothermal amplification (LAMP) method capable of detecting Escherichia coli generally and verocytotoxigenic E. coli (VTEC) specifically in beef and bovine faeces. The LAMP assay developed was highly specific (100%) and able to distinguish between E. coli and VTEC based on the amplification of the phoA, and stx1 and/or stx2 genes, respectively. In the absence of an enrichment step, the 50% limit of detection (LOD50) of the LAMP assay was determined to be 2.83, 3.17 and 2.83-3.17 log CFU/g for E. coli carrying the phoA, stx1 and stx2 genes, respectively, when artificially inoculated minced beef and bovine faeces were tested. The LAMP calibration curves generated with pure cultures, and with spiked beef and faeces, suggested that the assay had good quantification capability. Validation of the assay, performed using retail beef and bovine faeces samples, demonstrated good correlation between counts obtained by the LAMP assay and by a conventional culture method, but suggested the possibility of false negative LAMP results for 12.5-14.7% of samples tested. The multiplex LAMP assay developed potentially represents a rapid alternative to culture for monitoring E. coli levels in beef or faeces, and it would provide additional information on the presence of VTEC. However, some further optimisation is needed to improve detection sensitivity.
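
The quantification capability mentioned above rests on calibration curves relating the LAMP read-out to known inoculation levels. The sketch below shows that idea in its simplest linear form; the read-out chosen (time to positivity), the numbers and the model are assumptions for illustration, not values or procedures from the study.

# Illustrative only: a linear LAMP calibration curve and its use on an unknown.
import numpy as np

log_cfu = np.array([3.0, 4.0, 5.0, 6.0, 7.0])                 # spiked levels, log CFU/g (hypothetical)
time_to_positive = np.array([38.0, 33.5, 29.0, 24.8, 20.5])   # minutes (hypothetical)

slope, intercept = np.polyfit(time_to_positive, log_cfu, 1)   # fit log CFU/g against read-out

unknown_time = 27.0   # read-out of a test sample (minutes)
print(f"Estimated level: {slope * unknown_time + intercept:.2f} log CFU/g")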

Relevance:

30.00%

Publisher:

Abstract:

Inverse dynamics is the most comprehensive method giving access to the net joint forces and moments during walking. However, it is based on assumptions (i.e., rigid segments linked by ideal joints) and is known to be sensitive to the input data (e.g., kinematic derivatives, positions of joint centres and of the centre of pressure, inertial parameters). Alternatively, transducers can be used to directly measure the load applied on the residuum of transfemoral amputees. The purpose of this study was therefore to compare the forces and moments applied on a prosthetic knee, measured directly, with those calculated by three inverse dynamics computations (3-segment, 2-segment, and the "ground reaction vector" technique) during the gait of one patient. The maximum RMSEs between the estimated and directly measured forces (i.e., 56 N) and moment (i.e., 5 N·m) were relatively small. However, the dynamic outcomes of the prosthetic components (i.e., absorption of the foot, friction and limit stop of the knee) were only partially assessed with the inverse dynamics methods.
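
The RMSE comparison quoted above is straightforward to reproduce once the directly measured and inverse-dynamics load curves are time-aligned. A minimal sketch, with hypothetical values in place of the study's transducer and model outputs:

# Minimal RMSE sketch; the arrays stand in for time-aligned knee load curves.
import numpy as np

measured_force = np.array([620.0, 655.0, 700.0, 640.0, 605.0])   # N (hypothetical)
estimated_force = np.array([600.0, 670.0, 690.0, 660.0, 590.0])  # N (hypothetical)

rmse = np.sqrt(np.mean((estimated_force - measured_force) ** 2))
print(f"RMSE = {rmse:.1f} N")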

Relevance:

30.00%

Publisher:

Abstract:

Lithium niobate powders are directly synthesized from raw Li2CO3 and Nb2O5 powders by a combustion method with urea as the fuel. The synthesis parameters (e.g. the calcination temperature, the calcination time, and the urea-to-(Li2CO3 + Nb2O5) quantity ratio) are studied to determine the optimum conditions for preparing high-quality lithium niobate powders. In the present work, it is found that a urea-to-(Li2CO3 + Nb2O5) ratio close to 3, a calcination temperature of 550-600 °C and a reaction time of around 2.5 h lead to high-quality lithium niobate powders. The microstructure of the synthesized powders is further studied, and a possible mechanism for the reactions involved is also proposed.

Relevance:

30.00%

Publisher:

Abstract:

While the importance of literature studies in the IS discipline is well recognized, little attention has been paid to the underlying structure and method of conducting effective literature reviews. Despite the fact that literature is often used to refine the research context and direct the pathways towards successful research outcomes, there is very little evidence of the use of resource management tools to support the literature review process. In this paper we aim to contribute to advancing the way in which literature studies in Information Systems are conducted, by proposing a systematic, pre-defined and tool-supported method to extract, analyse and report literature. The paper presents how to identify relevant IS papers to review within a feasible and justifiable scope, how to extract relevant content from the identified papers, how to synthesise and analyse the findings of a literature review, and how to effectively write up and present its results. The paper is specifically targeted at novice IS researchers who seek to conduct a systematic, detailed literature review in a focused domain. Specific contributions of our method are extensive tool support, the identification of appropriate papers (including primary and secondary paper sets) and a pre-codification scheme. We use a literature study on shared services as an illustrative example to present the proposed approach.

Relevance:

30.00%

Publisher:

Abstract:

This paper presents an approach to predicting the operating conditions of machines based on classification and regression trees (CART) and an adaptive neuro-fuzzy inference system (ANFIS), in association with a direct prediction strategy for multi-step-ahead time series prediction. In this study, the number of available observations and the number of predicted steps are first determined using the false nearest neighbour method and the auto mutual information technique, respectively. These values are subsequently used as inputs to prediction models that forecast the future values of the machines' operating conditions. The performance of the proposed approach is then evaluated using real trending data from a low methane compressor. A comparative study of the predicted results obtained from the CART and ANFIS models is also carried out to appraise the prediction capability of these models. The results show that the ANFIS prediction model can track changes in machine condition and has potential for use as a tool for machine fault prognosis.
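
In the direct strategy referred to above, one model is trained per forecast horizon, each mapping the same window of lagged observations to a different future step. A minimal sketch with a regression tree standing in for the CART model (window length, tree depth and the synthetic series are assumptions for illustration):

# Direct multi-step-ahead strategy: one tree per horizon step.
import numpy as np
from sklearn.tree import DecisionTreeRegressor

series = np.sin(np.linspace(0, 20, 300)) + 0.05 * np.random.randn(300)  # stand-in condition data
n_lags, horizon = 5, 3

X = np.array([series[i:i + n_lags] for i in range(len(series) - n_lags - horizon)])
models = []
for h in range(1, horizon + 1):
    y_h = series[n_lags + h - 1 : n_lags + h - 1 + len(X)]   # target shifted h steps ahead
    models.append(DecisionTreeRegressor(max_depth=4).fit(X, y_h))

last_window = series[-n_lags:].reshape(1, -1)
print([float(m.predict(last_window)[0]) for m in models])    # forecasts for t+1 ... t+horizon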

Relevance:

30.00%

Publisher:

Abstract:

A series of layered double hydroxide (LDH) based composites were synthesized using the induced hydrolysis silylation (IHS) method, a surfactant precursor method, an in-situ coprecipitation method, and a direct silylation method. Their structures, morphologies, bonding modes and thermal stabilities can be readily adjusted by changing the parameters of the preparation and drying of the LDHs. The characterization results show that the direct silylation reaction cannot occur between dried LDHs and 3-aminopropyltriethoxysilane (APS) in an ethanol medium; however, the condensation reaction between adsorbed APS and the LDH plates can proceed on heating. When wet-state substrates are used, with or without surfactant and with ethanol as the solvent, silylation can be induced by hydrolysis of APS on the surface of the LDH plates. Surfactants improve the hydrophobicity of the LDHs during nucleation and crystallization, resulting in fluffy-shaped crystals; at the same time, they occupy the surface –OH positions and leave less "free –OH" available for the silylation reaction, favoring formation of silylated products with a higher population of the hydrolyzed bidentate (T2) and tridentate (T3) bonding forms. These bonding characteristics lead to spherical aggregates and tightly bonded particles. All silylated products show higher thermal stability than the pristine LDHs.