33 results for "Operation based method"

at Université de Lausanne, Switzerland


Relevance: 100.00%

Abstract:

The estimation of muscle forces in musculoskeletal shoulder models is still controversial. Two different methods are widely used to solve the indeterminacy of the system: electromyography (EMG)-based methods and stress-based methods. The goal of this work was to evaluate the influence of these two methods on the prediction of muscle forces, glenohumeral load and joint stability after total shoulder arthroplasty. An EMG-based and a stress-based method were implemented into the same musculoskeletal shoulder model. The model replicated the glenohumeral joint after total shoulder arthroplasty. It contained the scapula, the humerus, the joint prosthesis, the rotator cuff muscles (supraspinatus, subscapularis and infraspinatus) and the middle, anterior and posterior parts of the deltoid muscle. A movement of abduction was simulated in the plane of the scapula. The EMG-based method replicated the muscular activity measured experimentally by EMG. The stress-based method minimised a cost function based on muscle stresses. We compared muscle forces, joint reaction force, articular contact pressure and translation of the humeral head. The stress-based method predicted a lower force of the rotator cuff muscles. This was partly counterbalanced by a higher force of the middle part of the deltoid muscle. As a consequence, the stress-based method predicted a lower joint load (reduced by 16%) and a higher superior-inferior translation of the humeral head (increased by 1.2 mm). The EMG-based method has the advantage of replicating the observed co-contraction of the stabilising rotator cuff muscles, but it is limited to movements for which EMG measurements are available. The stress-based method thus has the advantage of flexibility, but may overestimate glenohumeral subluxation.
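The stress-based idea (minimising a cost built from muscle stresses subject to joint equilibrium) can be illustrated with a minimal static-optimisation sketch. All numbers below (moment arms, cross-sectional areas, target moment) are hypothetical illustrations, not values from the study's shoulder model.

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical moment arms (m) and physiological cross-sectional
# areas (cm^2) for three illustrative shoulder muscles.
r = np.array([0.02, 0.015, 0.03])      # moment arms about the abduction axis
pcsa = np.array([6.0, 8.0, 14.0])      # PCSAs
target_moment = 10.0                   # required net joint moment (N m)

# Stress-based cost: sum of squared muscle stresses (force / area).
def cost(f):
    return np.sum((f / pcsa) ** 2)

# Equality constraint: muscle forces must balance the joint moment.
cons = {"type": "eq", "fun": lambda f: r @ f - target_moment}
bounds = [(0.0, None)] * 3             # muscles can only pull

res = minimize(cost, x0=np.full(3, 100.0), bounds=bounds, constraints=cons)
forces = res.x                         # optimal muscle forces (N)
```

The quadratic stress cost tends to spread load toward muscles with large cross-sections, which is one reason such criteria can under-recruit the smaller rotator cuff muscles relative to EMG-driven solutions.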

Relevance: 100.00%

Abstract:

Following their detection and seizure by police and border guard authorities, false identity and travel documents are usually scanned, producing digital images. This research investigates the potential of these images to classify false identity documents, to highlight links between documents produced by the same modus operandi or coming from the same source, and thus to support forensic intelligence efforts. Inspired by previous research on digital images of Ecstasy tablets, a systematic and complete method has been developed to acquire, collect, process and compare images of false identity documents. This first part of the article highlights the critical steps of the method and the development of a prototype that processes regions of interest extracted from the images. Acquisition conditions were fine-tuned to optimise the reproducibility and comparability of images. Different filters and comparison metrics were evaluated, and the performance of the method was assessed using two calibration and validation sets of documents, made up of 101 Italian driving licenses and 96 Portuguese passports seized in Switzerland, some of which were known to come from common sources. Results indicate that using Hue and Edge filters, or their combination, to extract profiles from the images, and then comparing the profiles with a Canberra distance-based metric, provides the most accurate classification of documents. The method also proved quick, efficient and inexpensive. It can easily be operated from remote locations and shared amongst different organisations, which makes it very convenient for future operational applications. The method could serve as a fast first triage step that helps target more resource-intensive profiling methods (based, for instance, on a visual, physical or chemical examination of the documents). Its contribution to forensic intelligence and its application to several sets of false identity documents seized by police and border guards will be developed in a forthcoming article (Part II).
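The profile-comparison step rests on the Canberra distance, which weights each coordinate difference by the magnitudes involved, so it is sensitive to small differences near zero. A minimal self-contained sketch (the profile values below are invented, not extracted from any document image):

```python
def canberra(p, q):
    """Canberra distance between two intensity profiles.

    Terms where both entries are zero contribute nothing,
    following the usual convention for this metric."""
    total = 0.0
    for a, b in zip(p, q):
        denom = abs(a) + abs(b)
        if denom > 0:
            total += abs(a - b) / denom
    return total

# Hypothetical Hue-filter profiles from three scanned documents.
profile_a = [0.12, 0.40, 0.33, 0.08]
profile_b = [0.10, 0.42, 0.35, 0.07]   # similar to a (same source?)
profile_c = [0.60, 0.05, 0.10, 0.90]   # dissimilar

same_source_dist = canberra(profile_a, profile_b)
diff_source_dist = canberra(profile_a, profile_c)
```

Under a metric like this, documents produced by a common modus operandi should cluster at small pairwise distances, which is what the classification step exploits.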

Relevance: 90.00%

Abstract:

Functional connectivity in the human brain can be represented as a network using electroencephalography (EEG) signals. These networks, whose node counts can vary from tens to hundreds, are characterized by neurobiologically meaningful graph-theory metrics. This study investigates the degree to which various graph metrics depend upon network size. To this end, EEGs from 32 normal subjects were recorded and functional networks of three different sizes were extracted. A state-space based method was used to calculate cross-correlation matrices between different brain regions. These correlation matrices were used to construct binary adjacency connectomes, which were assessed with regard to a number of graph metrics such as clustering coefficient, modularity, efficiency, economic efficiency, and assortativity. We showed that the estimates of these metrics differ significantly depending on network size. Larger networks had higher efficiency, higher assortativity and lower modularity than smaller networks of the same density. These findings indicate that network size should be considered in any comparison of networks across studies.
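Several of these metrics are averages over node pairs, which is one route by which network size enters the estimate. As a minimal sketch (not the study's connectome pipeline), global efficiency of a small binary network can be computed directly with breadth-first search:

```python
from collections import deque

def global_efficiency(n, edges):
    """Global efficiency: mean over ordered node pairs of
    1 / shortest-path-length, for an unweighted undirected graph."""
    adj = {i: set() for i in range(n)}
    for u, v in edges:
        adj[u].add(v)
        adj[v].add(u)

    def bfs_dist(src):
        # Hop distances from src to every reachable node.
        dist = {src: 0}
        q = deque([src])
        while q:
            u = q.popleft()
            for w in adj[u]:
                if w not in dist:
                    dist[w] = dist[u] + 1
                    q.append(w)
        return dist

    total = 0.0
    for u in range(n):
        for v, d in bfs_dist(u).items():
            if v != u:
                total += 1.0 / d   # unreachable pairs contribute 0
    return total / (n * (n - 1))

# A 5-node ring: every pair is at distance 1 or 2.
ring = [(0, 1), (1, 2), (2, 3), (3, 4), (4, 0)]
eff_ring = global_efficiency(5, ring)  # (5*1 + 5*0.5) / 10 = 0.75
```

Because the metric is a pairwise average, adding nodes changes the distribution of path lengths entering that average, which is the size dependence the study quantifies.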

Relevance: 90.00%

Abstract:

The integration of geophysical data into the subsurface characterization problem has been shown in many cases to significantly improve hydrological knowledge by providing information at spatial scales and locations that is unattainable using conventional hydrological measurement techniques. The investigation of exactly how much benefit can be brought by geophysical data in terms of its effect on hydrological predictions, however, has received considerably less attention in the literature. Here, we examine the potential hydrological benefits brought by a recently introduced simulated annealing (SA) conditional stochastic simulation method designed for the assimilation of diverse hydrogeophysical data sets. We consider the specific case of integrating crosshole ground-penetrating radar (GPR) and borehole porosity log data to characterize the porosity distribution in saturated heterogeneous aquifers. In many cases, porosity is linked to hydraulic conductivity and thus to flow and transport behavior. To perform our evaluation, we first generate a number of synthetic porosity fields exhibiting varying degrees of spatial continuity and structural complexity. Next, we simulate the collection of crosshole GPR data between several boreholes in these fields, and the collection of porosity log data at the borehole locations. The inverted GPR data, together with the porosity logs, are then used to reconstruct the porosity field using the SA-based method, along with a number of other more elementary approaches. Assuming that the grid-cell-scale relationship between porosity and hydraulic conductivity is unique and known, the porosity realizations are then used in groundwater flow and contaminant transport simulations to assess the benefits and limitations of the different approaches.
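The core of a simulated annealing conditional simulation is a Metropolis-style accept/reject loop over local perturbations of the field. The sketch below is a generic SA skeleton with a toy data-mismatch objective, not the authors' hydrogeophysical implementation; the target profile and perturbation scale are invented for illustration.

```python
import math
import random

def simulated_annealing(objective, state, perturb, steps=2000,
                        t0=0.01, cooling=0.995, seed=0):
    """Generic simulated annealing with a Metropolis acceptance rule:
    worse candidates are accepted with probability exp(-delta/T),
    and the temperature T decays geometrically each step."""
    rng = random.Random(seed)
    energy = objective(state)
    best_state, best_energy = state, energy
    temp = t0
    for _ in range(steps):
        cand = perturb(state, rng)
        e = objective(cand)
        if e < energy or rng.random() < math.exp(-(e - energy) / temp):
            state, energy = cand, e
        if energy < best_energy:
            best_state, best_energy = state, energy
        temp *= cooling
    return best_state, best_energy

# Toy stand-in for conditioning a porosity field to data: match a
# reference profile (playing the role of GPR and borehole constraints).
target = [0.25, 0.30, 0.20, 0.35, 0.28]

def mismatch(field):
    return sum((a - b) ** 2 for a, b in zip(field, target))

def perturb(field, rng):
    new = list(field)
    i = rng.randrange(len(new))
    new[i] += rng.gauss(0.0, 0.02)   # small local change to one cell
    return new

start = [0.3] * 5
best, best_energy = simulated_annealing(mismatch, start, perturb)
```

In the real conditional-simulation setting the objective would combine geophysical data misfit with a measure of deviation from the target spatial statistics, and perturbations would preserve conditioning data at the boreholes.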

Relevance: 90.00%

Abstract:

Aim: Recently developed parametric methods in historical biogeography allow researchers to integrate temporal and palaeogeographical information into the reconstruction of biogeographical scenarios, thus overcoming a known bias of parsimony-based approaches. Here, we compare a parametric method, dispersal-extinction-cladogenesis (DEC), against a parsimony-based method, dispersal-vicariance analysis (DIVA), which does not incorporate branch lengths but accounts for phylogenetic uncertainty through a Bayesian empirical approach (Bayes-DIVA). We analyse the benefits and limitations of each method using the cosmopolitan plant family Sapindaceae as a case study.

Location: World-wide.

Methods: Phylogenetic relationships were estimated by Bayesian inference on a large dataset representing generic diversity within Sapindaceae. Lineage divergence times were estimated by penalized likelihood over a sample of trees from the posterior distribution of the phylogeny to account for dating uncertainty in biogeographical reconstructions. We compared biogeographical scenarios between Bayes-DIVA and two different DEC models: one with no geological constraints and another that employed a stratified palaeogeographical model in which dispersal rates were scaled according to area connectivity across four time slices, reflecting the changing continental configuration over the last 110 million years.

Results: Despite differences in the underlying biogeographical model, Bayes-DIVA and DEC inferred similar biogeographical scenarios. The main differences were (1) the timing of dispersal events, which in Bayes-DIVA sometimes conflicts with palaeogeographical information, and (2) the lower frequency of terminal dispersal events inferred by DEC. Uncertainty in divergence time estimations influenced both the inference of ancestral ranges and the decisiveness with which an area can be assigned to a node.

Main conclusions: By considering lineage divergence times, the DEC method gives more accurate reconstructions that are in agreement with palaeogeographical evidence. In contrast, Bayes-DIVA showed the highest decisiveness in unequivocally reconstructing ancestral ranges, probably reflecting its ability to integrate phylogenetic uncertainty. Care should be taken in defining the palaeogeographical model in DEC because of the possibility of overestimating the frequency of extinction events, or of inferring ancestral ranges that lie outside the extant species ranges, owing to the dispersal constraints enforced by the model. The wide-spanning spatial and temporal model proposed here could prove useful for testing large-scale biogeographical patterns in plants.

Relevance: 90.00%

Abstract:

Commentary on: Gaziano TA, Young CR, Fitzmaurice G, Atwood S, Gaziano JM. Laboratory-based versus non-laboratory-based method for assessment of cardiovascular disease risk: the NHANES I Follow-up Study cohort. Lancet. 2008;371(9616):923-31. PMID: 18342687

Relevance: 90.00%

Abstract:

The high complexity of cortical convolutions in humans is very challenging, both for engineers, who must measure and compare it, and for biologists and physicians, who must understand it. In this paper, we propose a surface-based method for the quantification of cortical gyrification. Our method uses an accurate 3-D cortical reconstruction and computes local measurements of gyrification at thousands of points over the whole cortical surface. The potential of our method to identify and precisely localize gyral abnormalities is illustrated by a clinical study on a group of children affected by 22q11 Deletion Syndrome, compared to control individuals.

Relevance: 90.00%

Abstract:

BACKGROUND: Cytomegalovirus (CMV) infection is associated with significant morbidity and mortality in transplant recipients. Resistance against ganciclovir is increasingly observed. According to current guidelines, direct drug resistance testing is not always performed, even when resistance is suspected, due to high costs and work effort. OBJECTIVES: To develop a more sensitive, easily applicable and cost-effective assay as a proof of concept for direct drug resistance testing in CMV surveillance of post-transplant patients. STUDY DESIGN: Five consecutive plasma samples from a heart transplant patient with a primary CMV infection were analyzed by quantitative real-time polymerase chain reaction (rtPCR), as a surrogate marker for therapy failure, and by direct drug resistance detection assays: Sanger sequencing and the novel primer extension (PEX) reaction followed by matrix-assisted laser desorption/ionization time-of-flight (MALDI-TOF) analysis. RESULTS: This report demonstrates that the PEX reaction followed by MALDI-TOF analysis detects the A594V mutation, which encodes ganciclovir resistance, ten days earlier than Sanger sequencing and more than 30 days prior to an increase in viral load. CONCLUSION: The greatly increased sensitivity and rapid turnaround time, combined with easy handling and moderate costs, indicate that this procedure could make a major contribution to improving transplantation outcomes.

Relevance: 90.00%

Abstract:

The function of DNA-binding proteins is controlled not just by their abundance, but mainly at the level of their activity in terms of their interactions with DNA and protein targets. Moreover, the affinity of such transcription factors to their target sequences is often controlled by co-factors and/or modifications that are not easily assessed from biological samples. Here, we describe a scalable method for monitoring protein-DNA interactions on a microarray surface. This approach was designed to determine the DNA-binding activity of proteins in crude cell extracts, complementing conventional expression profiling arrays. Enzymatic labeling of DNA enables direct normalization of the protein binding to the microarray, allowing the estimation of relative binding affinities. Using DNA sequences covering a range of affinities, we show that the new microarray-based method yields binding strength estimates similar to low-throughput gel mobility-shift assays. The microarray is also of high sensitivity, as it allows the detection of a rare DNA-binding protein from breast cancer cells, the human tumor suppressor AP-2. This approach thus mediates precise and robust assessment of the activity of DNA-binding proteins and takes present DNA-binding assays to a high throughput level.

Relevance: 90.00%

Abstract:

Health assessment and medical surveillance of workers exposed to combustion nanoparticles are challenging. The aim was to evaluate the feasibility of using exhaled breath condensate (EBC) from healthy volunteers for (1) assessing the lung-deposited dose of combustion nanoparticles and (2) determining the resulting oxidative stress by measuring hydrogen peroxide (H2O2) and malondialdehyde (MDA). Methods: Fifteen healthy non-smoking volunteers were exposed to three different levels of sidestream cigarette smoke under controlled conditions. EBC was repeatedly collected before, during, and 1 and 2 hr after exposure. Exposure variables were measured by direct-reading instruments and by active sampling. The EBC samples were analyzed for particle number concentration (light-scattering-based method) and for selected compounds considered oxidative stress markers. Results: Subjects were exposed to an average airborne concentration of up to 4.3×10^5 particles/cm^3 (average geometric size ∼60-80 nm). Up to 10×10^8 particles/mL could be measured in the collected EBC, with a broad size distribution (50th percentile ∼160 nm), but these biological concentrations were not related to the exposure level of cigarette smoke particles. Although H2O2 and MDA concentrations in EBC increased during exposure, only H2O2 showed a transient normalization 1 hr after exposure and increased afterward. In contrast, MDA levels stayed elevated during the 2 hr post-exposure. Conclusions: The use of diffusion light scattering for particle counting proved sufficiently sensitive to detect objects in EBC, but lacked specificity for carbonaceous tobacco smoke particles. Our results suggest two phases of oxidation markers in EBC: first, the initial deposition of particles and gases in the lung lining liquid, and later the onset of oxidative stress with associated cell membrane damage. Future studies should extend the follow-up time and should remove gases or particles from the air to allow differentiation between the different sources of H2O2 and MDA.

Relevance: 80.00%

Abstract:

Difficult tracheal intubation assessment is an important research topic in anesthesia, as failed intubations are an important cause of mortality in anesthetic practice. The modified Mallampati score is widely used, alone or in conjunction with other criteria, to predict the difficulty of intubation. This work presents an automatic method to assess the modified Mallampati score from an image of a patient with the mouth wide open. For this purpose we propose an active appearance model (AAM) based method and use linear support vector machines (SVM) to select a subset of relevant features obtained using the AAM. This feature selection step proves to be essential, as it drastically improves the performance of classification, which is obtained using an SVM with an RBF kernel and majority voting. We test our method on images of 100 patients undergoing elective surgery, achieve 97.9% accuracy in the leave-one-out cross-validation test, and provide a key element of an automatic difficult intubation assessment system.
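The two-stage pattern described here, linear-SVM feature selection followed by an RBF-kernel SVM evaluated with leave-one-out cross-validation, can be sketched on synthetic data. The feature matrix below merely stands in for AAM features; all sizes, parameters, and the selection rule (top coefficients by magnitude) are illustrative assumptions, not the paper's exact pipeline.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.svm import LinearSVC, SVC
from sklearn.model_selection import LeaveOneOut, cross_val_score

# Synthetic stand-in for AAM shape/appearance features.
X, y = make_classification(n_samples=60, n_features=30, n_informative=5,
                           random_state=0)

# Stage 1: rank features by |weight| of a linear SVM and keep the top 5.
lin = LinearSVC(C=1.0, dual=False).fit(X, y)
top = np.argsort(np.abs(lin.coef_[0]))[-5:]

# Stage 2: RBF-kernel SVM scored with leave-one-out cross-validation.
clf = SVC(kernel="rbf", gamma="scale")
scores = cross_val_score(clf, X[:, top], y, cv=LeaveOneOut())
loo_accuracy = scores.mean()
```

Leave-one-out is a natural choice at this sample size (100 patients in the study): each image is scored by a model trained on all the others, so no data is wasted on a held-out split.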

Relevance: 80.00%

Abstract:

State-of-the-art production technologies for conjugate vaccines are complex, multi-step processes. An alternative approach to producing glycoconjugates is based on the bacterial N-linked protein glycosylation system first described in Campylobacter jejuni. The C. jejuni N-glycosylation system has been successfully transferred into Escherichia coli, enabling in vivo production of customized recombinant glycoproteins. However, some antigenic bacterial cell-surface polysaccharides, like the Vi antigen of Salmonella enterica serovar Typhi, have not been reported to be accessible to the bacterial oligosaccharyltransferase PglB, which hampers the development of novel conjugate vaccines against typhoid fever. In this report, Vi-like polysaccharide structures that can be transferred by PglB were evaluated as typhoid vaccine components. A polysaccharide fulfilling these requirements was found in Escherichia coli serovar O121. Inactivation of wbqG, a gene encoded in the E. coli O121 O-antigen cluster, resulted in expression of O polysaccharides reactive with antibodies raised against the Vi antigen. The structure of the recombinantly expressed mutant O polysaccharide was elucidated using a novel HPLC- and mass spectrometry-based method for purified undecaprenyl pyrophosphate (Und-PP)-linked glycans, and the presence of epitopes also found in the Vi antigen was confirmed. The mutant O-antigen structure was transferred to acceptor proteins using the bacterial N-glycosylation system, and the immunogenicity of the resulting conjugates was evaluated in mice. The conjugate-induced antibodies reacted in an enzyme-linked immunosorbent assay with E. coli O121 LPS. One animal developed a significant rise in serum immunoglobulin anti-Vi titer upon immunization.

Relevance: 80.00%

Abstract:

Medical expenditure risk can pose a major threat to living standards. We derive decomposable measures of catastrophic medical expenditure risk from reference-dependent utility with loss aversion. We propose a quantile regression based method of estimating risk exposure from cross-section data containing information on the means of financing health payments. We estimate medical expenditure risk in seven Asian countries and find it is highest in Laos and China, and is lowest in Malaysia. Exposure to risk is generally higher for households that have less recourse to self-insurance, lower incomes, wealth and education, and suffer from chronic illness.
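The quantile-regression idea underlying this risk estimation rests on the pinball (check) loss, whose minimiser is the tau-th quantile of the outcome. The following self-contained sketch uses toy payment figures with no covariates (nothing here comes from the paper's data) to show the loss and the fact that it is minimised at the sample quantile.

```python
def pinball_loss(tau, y, pred):
    """Average pinball (check) loss at quantile level tau for a
    constant prediction `pred`."""
    total = 0.0
    for yi in y:
        u = yi - pred
        total += tau * u if u >= 0 else (tau - 1) * u
    return total / len(y)

# Toy monthly out-of-pocket health payments.
payments = [10, 20, 30, 40, 50, 60, 70, 80, 90, 100]
tau = 0.9  # upper tail: large, potentially catastrophic payments

# The constant that minimises the pinball loss is the tau-th quantile.
best = min(payments, key=lambda c: pinball_loss(tau, payments, c))  # 90
```

Quantile regression generalises this by letting the prediction depend on covariates (income, insurance status, chronic illness), which is how exposure to upper-tail expenditure risk can be estimated from cross-section data.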

Relevance: 80.00%

Abstract:

BACKGROUND: Measuring syringe availability and coverage is essential in the assessment of HIV/AIDS risk reduction policies. Estimates of syringe availability and coverage were produced for the years 1996 and 2006, based on all relevant available national-level aggregated data from published sources. METHODS: We defined availability as the total monthly number of syringes provided by the harm reduction system divided by the estimated number of injecting drug users (IDU), and defined coverage as the proportion of injections performed with a new syringe at national level (total supply over total demand). Estimates of the supply of syringes were derived from the national monitoring system, including needle and syringe programmes (NSP), pharmacies, and medically prescribed heroin programmes. Estimates of syringe demand were based on the number of injections performed by IDU, derived from surveys of low-threshold facilities for drug users (LTF) with NSP, combined with the number of IDU. This number was estimated by two methods, both combining estimates of heroin users (multiple estimation method) with either (a) the number of IDU in methadone treatment (MT) (non-injectors) or (b) the proportion of injectors amongst LTF attendees. Central estimates and ranges were obtained for availability and coverage. RESULTS: The estimated number of IDU decreased markedly according to both methods. The MT-based method (from 14,818 to 4809) showed a much greater decrease and a smaller IDU population than the LTF-based method (from 24,510 to 12,320); availability and coverage estimates are therefore higher with the MT-based method. For 1996, central estimates of syringe availability were 30.5 and 18.4 syringes per IDU per month; for 2006, they were 76.5 and 29.9. There were four central estimates of coverage: for 1996 they ranged from 24.3% to 43.3%, and for 2006 from 50.5% to 134.3%. CONCLUSION: Although the 2006 estimates overlap the 1996 estimates, the results suggest a shift towards improved syringe availability and coverage over time.
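The two definitions in the Methods section are simple ratios, and a minimal sketch with hypothetical monthly figures (invented round numbers, not the study's monitoring data) makes the arithmetic explicit:

```python
def availability(syringes_per_month, n_idu):
    """Syringes supplied per injecting drug user (IDU) per month."""
    return syringes_per_month / n_idu

def coverage(syringes_per_month, n_idu, injections_per_idu_month):
    """Percentage of injections that can be performed with a new
    syringe: total supply over total demand, capped at 100%."""
    demand = n_idu * injections_per_idu_month
    return min(syringes_per_month / demand, 1.0) * 100

# Hypothetical national-level inputs.
supply = 360_000   # syringes distributed per month (NSP + pharmacies + ...)
n_idu = 12_000     # estimated injector population
inj_rate = 60      # injections per IDU per month

avail = availability(supply, n_idu)      # 30.0 syringes/IDU/month
cov = coverage(supply, n_idu, inj_rate)  # 50.0 %
```

Because the IDU estimate sits in the denominator of both ratios, the smaller MT-based population estimate mechanically yields the higher availability and coverage figures reported in the Results.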