940 results for rule-based
Abstract:
STUDY DESIGN: The biomechanics of vertebral bodies augmented with real distributions of cement were investigated using nonlinear finite element (FE) analysis. OBJECTIVES: To compare the stiffness, strength, and stress transfer of augmented versus nonaugmented osteoporotic vertebral bodies under compressive loading. Specifically, to examine how cement distribution, volume, and compliance affect these biomechanical variables. SUMMARY OF BACKGROUND DATA: Previous FE studies suggested that vertebroplasty might alter vertebral stress transfer, leading to adjacent vertebral failure. However, no FE study so far has accounted for real cement distributions and bone damage accumulation. METHODS: Twelve vertebral bodies scanned with high-resolution pQCT and tested in compression were augmented with various volumes of cement and scanned again. Nonaugmented and augmented pQCT datasets were converted to FE models, with bone properties modeled by an elastic, plastic, and damage constitutive law that had previously been calibrated for the nonaugmented models. The cement-bone composite was modeled with a rule of mixtures. The nonaugmented and augmented FE models were subjected to compression, and their stiffness, strength, and stress maps were calculated for different cement compliances. RESULTS: Cement distribution dominated the stiffening and strengthening effects of augmentation. Models with cement connecting either the superior or the inferior endplate (S/I fillings) were only up to 2 times stiffer than the nonaugmented models, with minimal strengthening, whereas those with cement connecting both endplates (S + I fillings) were 1 to 8 times stiffer and 1 to 12 times stronger. Stress increased above and below the cement; this increase was higher for the S + I cases and was significantly reduced by increasing cement compliance. CONCLUSION: The developed FE approach, which accounts for real cement distributions and bone damage accumulation, provides a refined insight into the mechanics of augmented vertebral bodies. In particular, augmentation with a compliant cement bridging both endplates would reduce stress transfer while providing sufficient strengthening.
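For context, the rule of mixtures mentioned for the cement-bone composite is commonly written as a volume-fraction-weighted blend of the constituent stiffnesses; a minimal Voigt-type form (the abstract does not specify the exact variant used) is

$$E_{\text{composite}} = f\, E_{\text{cement}} + (1 - f)\, E_{\text{bone}},$$

where $f$ is the local cement volume fraction.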
Abstract:
In this paper, we are concerned with the short-term scheduling of industrial make-and-pack production processes. The planning problem consists of minimizing the production makespan while meeting given end-product demands. Sequence-dependent changeover times, multi-purpose storage units with finite capacities, quarantine times, batch splitting, partial equipment connectivity, material transfer times, and a large number of operations contribute to the complexity of the problem. Known MILP formulations cover all technological constraints of such production processes, but only small problem instances can be solved in reasonable CPU times. In this paper, we develop a heuristic to tackle large instances. Under this heuristic, groups of batches are scheduled iteratively using a novel MILP formulation; the assignment of the batches to the groups and the scheduling sequence of the groups are determined using a priority rule. We demonstrate the applicability of the heuristic by means of a real-world production process.
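As a rough illustration of the group-wise idea (the function names and the concrete priority rule below are hypothetical; the paper's MILP formulation itself is not reproduced):

```python
# Sketch of an iterative group-scheduling heuristic: batches are assigned
# to groups by a priority rule, and each group is then scheduled by an
# (abstracted) MILP solve against the already-fixed partial schedule.

def priority(batch):
    # Assumed priority rule for illustration: earliest due date first.
    return batch["due_date"]

def schedule_in_groups(batches, group_size, solve_group_milp):
    ordered = sorted(batches, key=priority)
    groups = [ordered[i:i + group_size] for i in range(0, len(ordered), group_size)]
    partial_schedule = []
    for group in groups:
        # Each solve decides only the operations of the current group,
        # keeping the previously scheduled operations fixed.
        partial_schedule = solve_group_milp(group, fixed=partial_schedule)
    return partial_schedule
```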
Abstract:
AIMS A non-invasive gene-expression profiling (GEP) test for rejection surveillance of heart transplant recipients originated in the USA. A European-based study, the Cardiac Allograft Rejection Gene Expression Observational II Study (CARGO II), was conducted to further clinically validate the GEP test performance. METHODS AND RESULTS Blood samples for GEP testing (AlloMap®, CareDx, Brisbane, CA, USA) were collected during post-transplant surveillance. The reference standard for rejection status was based on histopathology grading of tissue from endomyocardial biopsy. The area under the receiver operating characteristic curve (AUC-ROC) and the negative (NPV) and positive (PPV) predictive values for the GEP scores (range 0-39) were computed. Taking a GEP score of 34 as the cut-off (>6 months post-transplantation), 95.5% (381/399) of GEP tests were true negatives, 4.5% (18/399) were false negatives, 10.2% (6/59) were true positives, and 89.8% (53/59) were false positives. Based on 938 paired biopsies, the GEP test score AUC-ROC for distinguishing ≥3A rejection was 0.70 and 0.69 for the ≥2-6 and >6 months post-transplantation periods, respectively. Depending on the chosen threshold score, the NPV and PPV range from 98.1 to 100% and from 2.0 to 4.7%, respectively. CONCLUSION For ≥2-6 and >6 months post-transplantation, the CARGO II GEP score performance (AUC-ROC = 0.70 and 0.69) is similar to the CARGO study results (AUC-ROC = 0.71 and 0.67). The low prevalence of ACR contributes to the high NPV and limited PPV of GEP testing. The choice of threshold score for practical use of GEP testing should consider the overall clinical assessment of the patient's baseline risk for rejection.
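The prevalence effect noted in the conclusion follows directly from Bayes' rule; here is a minimal sketch with illustrative sensitivity, specificity, and prevalence values (not figures from the study):

```python
def predictive_values(sensitivity, specificity, prevalence):
    """PPV and NPV from test characteristics and disease prevalence (Bayes' rule)."""
    tp = sensitivity * prevalence
    fp = (1 - specificity) * (1 - prevalence)
    tn = specificity * (1 - prevalence)
    fn = (1 - sensitivity) * prevalence
    return tp / (tp + fp), tn / (tn + fn)

# Illustrative numbers only: at a low rejection prevalence, NPV is high and
# PPV is limited even for a test with reasonable discrimination.
ppv, npv = predictive_values(sensitivity=0.6, specificity=0.8, prevalence=0.02)
print(f"PPV = {ppv:.1%}, NPV = {npv:.1%}")  # PPV ≈ 5.8%, NPV ≈ 99.0%
```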
Abstract:
OBJECTIVES To investigate the frequency of interim analyses, stopping rules, and data safety and monitoring boards (DSMBs) in protocols of randomized controlled trials (RCTs); to examine these features across different reasons for trial discontinuation; and to identify discrepancies in reporting between protocols and publications. STUDY DESIGN AND SETTING We used data from a cohort of RCT protocols approved between 2000 and 2003 by six research ethics committees in Switzerland, Germany, and Canada. RESULTS Of 894 RCT protocols, 289 (32.3%) prespecified interim analyses, 153 (17.1%) stopping rules, and 257 (28.7%) DSMBs. Overall, 249 of 894 RCTs (27.9%) were prematurely discontinued, mostly for reasons such as poor recruitment, administrative reasons, or unexpected harm. Forty-six of 249 RCTs (18.4%) were discontinued due to early benefit or futility; of those, 37 (80.4%) were stopped outside a formal interim analysis or stopping rule. Among 515 published RCTs, there were discrepancies between protocols and publications for interim analyses (21.1%), stopping rules (14.4%), and DSMBs (19.6%). CONCLUSION Two-thirds of RCT protocols did not consider interim analyses, stopping rules, or DSMBs. Most RCTs discontinued for early benefit or futility were stopped without a prespecified mechanism. When assessing trial manuscripts, journals should require access to the protocol.
Abstract:
The main objective of this study was to determine the external validity of a clinical prediction rule developed by the European Multicenter Study on Human Spinal Cord Injury (EM-SCI) to predict ambulation outcomes 12 months after traumatic spinal cord injury. Data from the North American Clinical Trials Network (NACTN) registry, with approximately 500 SCI cases, were used for this validity study. The predictive accuracy of the EM-SCI prognostic model was evaluated using calibration and discrimination based on 231 NACTN cases. The area under the receiver operating characteristic (ROC) curve was 0.927 (95% CI 0.894-0.959) for the EM-SCI model when applied to the NACTN population. This is lower than the AUC of 0.956 (95% CI 0.936-0.976) reported for the EM-SCI population, but it suggests that the EM-SCI clinical prediction rule distinguished well between those patients in the NACTN population who achieved independent ambulation and those who did not. The calibration curve suggests that the higher the prediction score, the higher the probability of walking, with the best predictions for AIS D patients. In conclusion, the EM-SCI clinical prediction rule was determined to be generalizable to the adult NACTN SCI population.
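For readers unfamiliar with the two validation criteria, discrimination is typically quantified with an AUC and calibration by comparing predicted probabilities with observed outcome rates; a minimal sketch on synthetic stand-in data (not the EM-SCI or NACTN data):

```python
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
scores = rng.normal(size=231)                 # stand-in prediction-rule scores
walked = (scores + rng.normal(size=231)) > 0  # noisy ambulation outcome

# Discrimination: 0.5 = chance, 1.0 = perfect separation of walkers/non-walkers.
print(f"AUC = {roc_auc_score(walked, scores):.3f}")
```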
Abstract:
We show a procedure for constructing a probabilistic atlas based on affine moment descriptors. It uses a normalization procedure over the labeled atlas. The proposed linear registration is defined by closed-form expressions involving only geometric moments. This procedure applies both to atlas construction and to atlas-based segmentation. We model the likelihood term for each voxel and each label using parametric or nonparametric distributions, and the prior term is determined by applying the vote rule. The probabilistic atlas is built from the variability of our linear registration. We consider two segmentation strategies: (a) applying the proposed affine registration to bring the target image into the coordinate frame of the atlas, or (b) non-rigidly aligning the probabilistic atlas with the target image, after the probabilistic atlas has first been aligned to the target image with our affine registration. Finally, we adopt a graph-cut Bayesian framework for implementing the atlas-based segmentation.
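As a minimal sketch of the vote-rule prior over registered atlas label maps (array shapes and names are assumptions; the likelihood modeling and graph-cut step are not shown):

```python
import numpy as np

def vote_rule_prior(label_maps, num_labels):
    """Per-voxel prior: the fraction of registered atlases voting for each label.

    label_maps: (n_atlases, X, Y, Z) integer label volumes, already brought
    into a common coordinate frame by the moment-based affine registration.
    """
    n_atlases = label_maps.shape[0]
    prior = np.empty((num_labels,) + label_maps.shape[1:])
    for lbl in range(num_labels):
        prior[lbl] = (label_maps == lbl).sum(axis=0) / n_atlases
    return prior
```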
Abstract:
Conventional dual-rail precharge logic suffers from the difficulty of implementing a dual-rail structure that achieves strict compensation between the counterpart rails. As a light-weight and high-speed dual-rail style, balanced cell-based dual-rail logic (BCDL) uses synchronised compound gates with a global precharge signal to provide high resistance against differential power or electromagnetic analyses. BCDL can be realised from generic field programmable gate array (FPGA) design flows with constraints. However, routing remains a concern because of the limited flexibility of routing control, which unfavourably results in bias between complementary nets in security-sensitive parts. In this article, based on a routing repair technique, novel verifications of the routing effect are presented. An 8-bit simplified Advanced Encryption Standard (AES) co-processor is implemented, constructed on block random access memory (RAM)-based BCDL in Xilinx Virtex-5 FPGAs. Since imbalanced routings are the major defect in BCDL, the authors can rule out other influences and fairly quantify the security variants. A series of asymptotic correlation electromagnetic (EM) analyses are launched against a group of circuits with consecutive routing schemes in order to verify the routing impact on side-channel analyses. After repairing the non-identical routings, mutual information analyses are executed to further validate the concrete security increase obtained from identical routing pairs in BCDL.
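A generic sketch of the correlation-style analysis used to compare routing variants (the trace arrays and Hamming-weight leakage model here are standard assumptions, not the authors' exact setup):

```python
import numpy as np

def correlation_trace(traces, hypotheses):
    """Pearson correlation between measured EM traces and a leakage hypothesis.

    traces:     (n_measurements, n_samples) EM/power traces
    hypotheses: (n_measurements,) predicted leakage, e.g. the Hamming weight
                of an intermediate AES value under a key guess
    Returns the correlation at each time sample; peaks indicate leakage.
    """
    t = traces - traces.mean(axis=0)
    h = hypotheses - hypotheses.mean()
    return (h @ t) / np.sqrt((h @ h) * (t * t).sum(axis=0))
```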
Abstract:
The class I myosins play important roles in controlling many different types of actin-based cell movements. Dictyostelium cells either lacking or overexpressing amoeboid myosin Is have significant defects in cortical activities such as pseudopod extension, cell migration, and macropinocytosis. The existence of Dictyostelium null mutants with strong phenotypic defects permits complementation analysis as a means of exploring important functional features of the myosin I heavy chain. Mutant Dictyostelium cells lacking two myosin Is exhibit profound defects in growth, endocytosis, and rearrangement of F-actin. Expression of the full-length myoB heavy chain in these cells fully rescues the double mutant defects. However, mutant forms of the myoB heavy chain in which a serine at the consensus phosphorylation site has been altered to an alanine or in which the C-terminal SH3 domain has been removed fail to complement the null phenotype. The wild-type and mutant forms of the myoB heavy chain appeared to be properly localized when they were expressed in the myosin I null mutants. These results suggest that the amoeboid myosin I consensus phosphorylation site and SH3 domains do not play a role in the localization of myosin I, but are absolutely required for in vivo function.
Abstract:
Transformed-rule up and down psychophysical methods have gained great popularity, mainly because they combine criterion-free responses with an adaptive procedure allowing rapid determination of an average stimulus threshold at various criterion levels of correct responses. The statistical theory underlying the methods now in routine use is based on sets of consecutive responses with assumed constant probabilities of occurrence. The response rules requiring consecutive responses prevent the possibility of using the most desirable response criterion, that of 75% correct responses. The earliest transformed-rule up and down method, whose rules included nonconsecutive responses, did not contain this limitation but failed to become generally accepted, lacking a published theoretical foundation. Such a foundation is provided in this article and is validated empirically with the help of experiments on human subjects and a computer simulation. In addition to allowing the criterion of 75% correct responses, the method is more efficient than the methods excluding nonconsecutive responses in their rules.
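For orientation, a consecutive-response rule such as 2-down/1-up converges to the level yielding sqrt(0.5) ≈ 70.7% correct rather than 75%, which is the limitation the abstract refers to. A minimal simulation of such a staircase (the logistic psychometric function is an arbitrary stand-in):

```python
import math
import random

def simulate_2down_1up(p_correct, level=10.0, step=1.0, n_trials=400):
    """2-down/1-up staircase: two consecutive correct responses make the task
    harder (level down); any error makes it easier (level up).
    Converges to ~70.7% correct."""
    streak, levels = 0, []
    for _ in range(n_trials):
        levels.append(level)
        if random.random() < p_correct(level):
            streak += 1
            if streak == 2:
                level, streak = level - step, 0
        else:
            level, streak = level + step, 0
    return sum(levels[n_trials // 2:]) / (n_trials // 2)  # mean of later trials

# Arbitrary psychometric function: chance = 50%, asymptote = 100%.
estimate = simulate_2down_1up(lambda x: 0.5 + 0.5 / (1 + math.exp(-(x - 5.0))))
```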
Abstract:
The rule that eukaryotic ribosomes initiate translation exclusively at the 5' proximal AUG codon is abrogated under rare conditions. One circumstance that has been suggested to allow dual initiation is close apposition of a second AUG codon. A possible mechanism might be that the scanning 40S ribosomal subunit flutters back and forth instead of stopping cleanly at the first AUG. This hypothesis seems to be ruled out by evidence presented herein that in certain mRNAs, the first of two close AUG codons is recognized uniquely. To achieve this, the 5' proximal AUG has to be provided with the full consensus sequence; even small departures allow a second nearby AUG codon to be reached by leaky scanning. This context-dependent leaky scanning unexpectedly fails when the second AUG codon is moved some distance from the first. A likely explanation, based on analyzing the accessibility of a far-downstream AUG codon under conditions of initiation versus elongation, is that 80S elongating ribosomes advancing from the 5' proximal start site can mask potential downstream start sites.
Abstract:
This paper proposes a new feature representation method based on the construction of a Confidence Matrix (CM). This representation consists of posterior probability values provided by several weak classifiers, each one trained and used on a different set of features extracted from the original sample. The CM allows the final classifier to abstract away from discovering underlying groups of features. In this work the CM is applied to isolated character image recognition, for which several sets of features can be extracted from each sample. Experimentation has shown that the use of the CM yields a significant improvement in accuracy in most cases, while in the remaining cases accuracy stays the same. The results were obtained after experimenting with four well-known corpora, using evolved meta-classifiers with the k-Nearest Neighbor rule as a weak classifier and by applying statistical significance tests.
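A schematic of the CM construction with kNN weak classifiers over feature subsets (sklearn-based; the subset choice and the final evolved meta-classifier are placeholders):

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

def confidence_matrix(X_train, y_train, X, feature_subsets, k=5):
    """Stack the posterior probabilities of one kNN per feature subset.

    Returns an (n_samples, n_subsets * n_classes) representation that the
    final classifier consumes instead of the raw features.
    """
    columns = []
    for subset in feature_subsets:
        knn = KNeighborsClassifier(n_neighbors=k)
        knn.fit(X_train[:, subset], y_train)
        columns.append(knn.predict_proba(X[:, subset]))
    return np.hstack(columns)
```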
Abstract:
This study examines the protection of fundamental rights, democracy and the rule of law in the European Union, and the challenges that arise in reflecting on ways to strengthen EU competences in these contested terrains. It provides a ‘state of play’ and critical account of EU-level policy and legal mechanisms, assessing the relationship between the rule of law, democracy and fundamental rights in the member states of the Union. The cross-cutting challenges affecting their uses, effective implementation and practical operability constitute a central point of the analysis. The study argues that the relationship between the rule of law, democracy and fundamental rights is co-constitutive. Any future rule of law-related policy discussion in the EU should start from an understanding of the triangular relationship between these dimensions from the perspective of ‘democratic rule of law with fundamental rights’, i.e. the legally based rule of a democratic state that delivers fundamental rights. The three criteria are inherently and indivisibly interconnected and interdependent, and they cannot be separated without inflicting profound damage on the whole and changing its essential shape and configuration.
Abstract:
When they look at Internet policy, EU policymakers seem mesmerised, if not bewitched, by the word ‘neutrality’. Originally confined to the infrastructure layer, today the neutrality rhetoric is being expanded to multi-sided platforms such as search engines and more generally online intermediaries. Policies for search neutrality and platform neutrality are invoked to pursue a variety of policy objectives, encompassing competition, consumer protection, privacy and media pluralism. This paper analyses this emerging debate and comes to a number of conclusions. First, mandating net neutrality at the infrastructure layer might have some merit, but it certainly would not make the Internet neutral. Second, since most of the objectives initially associated with network neutrality cannot be realistically achieved by such a rule, the case for network neutrality legislation would have to stand on different grounds. Third, the fact that the Internet is not neutral is mostly a good thing for end users, who benefit from intermediaries that provide them with a selection of the over-abundant information available on the Web. Fourth, search neutrality and platform neutrality are fundamentally flawed principles that contradict the economics of the Internet. Fifth, neutrality is a very poor and ineffective recipe for media pluralism, and as such should not be invoked as the basis of future media policy. All these conclusions have important consequences for the debate on the future EU policy for the Digital Single Market.
Abstract:
Thesis (Ph.D.)--University of Washington, 2016-06
Abstract:
The effect of an organically surface-modified layered silicate on the viscosity of various epoxy resins of different structures and different functionalities was investigated. Steady and dynamic shear viscosities of the epoxy resins containing 0-10 wt% of the organoclay were determined using parallel-plate rheology. Viscosity results were compared with those achieved through the addition of a commonly used micron-sized CaCO3 filler. It was found that the changes in viscosity due to the different fillers were of the same order, since the layered silicate was only dispersed on a micron-sized scale in the monomer (prior to reaction), as indicated by X-ray diffraction measurements. Flow activation energies at a low frequency were determined and did not show any significant changes due to the addition of organoclay or CaCO3. Comparison between dynamic and steady shear experiments showed good agreement for layered silicate concentrations below 7.5 wt%, i.e. the Cox-Merz rule can be applied. Deviations from the Cox-Merz rule appeared at and above 10 wt%, although such deviations were only slightly above experimental error. Most resin-organoclay blends were well described by the Power Law model, with only concentrations of 10 wt% and above requiring the Herschel-Bulkley (yield stress) model to achieve better fits. Wide-angle X-ray measurements have shown that the epoxy resin swells the layered silicate with an increase in the interlayer distance of approximately 15 Å, and that the rheological behavior is due to the lateral, micron-scale size of these swollen tactoids.
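For reference, the Cox-Merz rule equates the steady shear viscosity with the magnitude of the complex viscosity at matched shear rate and angular frequency, while the Herschel-Bulkley model extends the Power Law form with a yield stress:

$$\eta(\dot{\gamma}) = \left|\eta^*(\omega)\right|_{\omega = \dot{\gamma}}, \qquad \tau = \tau_y + K\,\dot{\gamma}^{\,n}.$$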