928 results for law of large numbers


Relevance:

100.00%

Publisher:

Abstract:

The advent and application of high-resolution array-based comparative genomic hybridization (array CGH) have led to the detection of large numbers of copy number variants (CNVs) in patients with developmental delay and/or multiple congenital anomalies, as well as in healthy individuals. The notion that CNVs are also abundantly present in the normal population complicates the interpretation of the clinical significance of CNVs detected in patients. In this review we illustrate a general clinical workflow, based on our own experience, that can be used in routine diagnostics for the interpretation of CNVs.

Relevance:

100.00%

Publisher:

Abstract:

The symptomatic phases of many inflammatory diseases are characterized by the migration of large numbers of neutrophils (PMN) across a polarized epithelium and their accumulation within a lumen. For example, acute PMN influx is common in diseases of the gastrointestinal system (ulcerative colitis, Crohn's disease, bacterial enterocolitis, gastritis), the hepatobiliary system (cholangitis, acute cholecystitis), the respiratory tract (bronchial pneumonia, bronchitis, cystic fibrosis, bronchiectasis), and the urinary tract (pyelonephritis, cystitis). Despite these observations, the molecular basis of leukocyte interactions with epithelial cells is incompletely understood. In vitro models of PMN transepithelial migration typically use N-formylated bacterial peptides such as fMLP in isolation to drive human PMNs across epithelial monolayers. However, other microbial products such as lipopolysaccharide (LPS) are major constituents of the intestinal lumen and have potent effects on the immune system. In the absence of LPS, we have shown that transepithelial migration requires sequential adhesive interactions between the PMN beta2 integrin CD11b/CD18 and JAM protein family members. Other epithelial ligands appear to be abundantly represented as fucosylated proteoglycans. Further studies indicate that the rate of PMN migration across mucosal surfaces can be regulated by the ubiquitously expressed transmembrane protein CD47 and by microbial-derived factors, although many of the details remain unclear. Current data suggest that Toll-like receptors (TLR), which recognize specific pathogen-associated molecular patterns (PAMPs), are differentially expressed on both leukocytes and mucosal epithelial cells and serve to modulate leukocyte-epithelial interactions. Exposure of epithelial TLRs to microbial ligands has been shown to result in transcriptional upregulation of inflammatory mediators, whereas ligation of leukocyte TLRs modulates specific antimicrobial responses. A better understanding of these events should provide new insights into the mechanisms of epithelial responses to microorganisms, as well as ideas for therapies aimed at inhibiting the deleterious consequences of mucosal inflammation.

Relevance:

100.00%

Publisher:

Abstract:

Improving the binding affinity and/or stability of peptide ligands often requires testing of large numbers of variants to identify beneficial mutations. Herein we propose a type of mutation that promises a high success rate. In a bicyclic peptide inhibitor of the cancer-related protease urokinase-type plasminogen activator (uPA), we observed a glycine residue that has a positive ϕ dihedral angle when bound to the target. We hypothesized that replacing it with a D-amino acid, which favors positive ϕ angles, could enhance the binding affinity and/or proteolytic resistance. Mutation of this specific glycine to D-serine in the bicyclic peptide indeed improved inhibitory activity (1.75-fold) and stability (fourfold). X-ray-structure analysis of the inhibitors in complex with uPA showed that the peptide backbone conformation was conserved. Analysis of known cyclic peptide ligands showed that glycine is one of the most frequent amino acids, and that glycines with positive ϕ angles are found in many protein-bound peptides. These results suggest that the glycine-to-D-amino acid mutagenesis strategy could be broadly applied.
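
As a rough illustration of the screening step this strategy implies, the sketch below computes a backbone ϕ dihedral angle from four atom positions and flags a positive value as a candidate for D-amino acid substitution. This is a minimal sketch, not the authors' code: the coordinates are hypothetical, and in practice they would be read from a PDB structure.

```python
import numpy as np

def dihedral(p0, p1, p2, p3):
    """Signed dihedral angle (degrees) defined by four consecutive 3-D points."""
    b0, b1, b2 = p1 - p0, p2 - p1, p3 - p2
    b1 = b1 / np.linalg.norm(b1)
    v = b0 - np.dot(b0, b1) * b1  # b0 projected onto the plane normal to b1
    w = b2 - np.dot(b2, b1) * b1  # b2 projected onto the plane normal to b1
    return np.degrees(np.arctan2(np.dot(np.cross(b1, v), w), np.dot(v, w)))

# phi(i) is the dihedral C(i-1) - N(i) - CA(i) - C(i).
# Hypothetical coordinates for a single residue:
c_prev = np.array([1.0, 0.0, 0.0])
n_i = np.array([2.4, 0.3, 0.1])
ca_i = np.array([3.1, 1.6, 0.4])
c_i = np.array([4.5, 1.5, 1.0])

phi = dihedral(c_prev, n_i, ca_i, c_i)
print(f"phi = {phi:.1f} deg; D-substitution candidate: {phi > 0}")
```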

Relevance:

100.00%

Publisher:

Abstract:

BACKGROUND: Prospective assessment of pedicled extrathoracic muscle flaps for the closure of large intrathoracic airway defects after noncircumferential resection in situations where an end-to-end reconstruction seemed risky (defects > 4 cm in length, desmoplastic reactions after previous infection or radiochemotherapy). METHODS: From 1996 to 2001, 13 intrathoracic muscle transpositions (6 latissimus dorsi and 7 serratus anterior muscle flaps) were performed to close defects of the intrathoracic airways after noncircumferential resection for tumor (n = 5), large tracheoesophageal fistula (n = 2), delayed tracheal injury (n = 1), and bronchopleural fistula (n = 5). In 2 patients, the extent of the tracheal defect required reinforcement of the reconstruction with a rib segment embedded in the muscle flap, followed by temporary tracheal stenting. Patient follow-up consisted of clinical examination, bronchoscopy and biopsy, pulmonary function tests, and dynamic virtual bronchoscopy by computed tomographic (CT) scanning during inspiration and expiration. RESULTS: The airway defects ranged from 2 × 1 cm to 8 × 4 cm and involved up to 50% of the airway circumference. All were successfully closed using muscle flaps, with no mortality, and all patients were extubated within 24 hours. Bronchoscopy revealed epithelialization of the reconstructions without dehiscence, stenosis, or recurrence of fistulas. The flow-volume loop was preserved in all patients, and dynamic virtual bronchoscopy revealed no significant difference in the endoluminal cross-sectional area of the airway between inspiration and expiration above (45 ± 21 mm²), at (76 ± 23 mm²), and below (65 ± 40 mm²) the reconstruction. CONCLUSIONS: Intrathoracic airway defects of up to 50% of the circumference may be repaired using extrathoracic muscle flaps when an end-to-end reconstruction is not feasible.

Relevance:

100.00%

Publisher:

Abstract:

Teaching and research are organised differently across subject domains. Attempts to construct typologies of higher education institutions, however, often lack quantitative indicators of subject mix that would allow systematic comparisons of large numbers of higher education institutions across countries, because the data available for such indicators are limited. In this paper, we present an exploratory approach to the construction of such indicators. The database constructed in the AQUAMETH project, which also includes data disaggregated at the disciplinary level, is explored with the aim of understanding patterns of subject mix. For six European countries, an exploratory and descriptive analysis of staff composition across four broad domains (medical sciences; engineering and technology; natural sciences; and social sciences and humanities) is performed, leading to a classification that distinguishes between specialist and generalist institutions. Among the latter, a further distinction is made based on the presence or absence of a medical department. Preliminary exploration of this classification and its comparison with other indicators show the influence of long-term dynamics on the subject mix of individual higher education institutions, but also underline disciplinary differences, for example in student-to-staff ratios, as well as national patterns, for example in the number of PhD degrees per 100 undergraduate students. Despite its many limitations, this exploratory approach yields a classification of higher education institutions that accounts for a large share of the differences between the analysed institutions.
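
A minimal sketch of the classification rule described above, assuming staff counts per domain are available; the 75% specialist threshold is our own assumption, not a figure from the paper.

```python
from typing import Dict

DOMAINS = ("medical", "engineering", "natural", "social_humanities")

def classify(staff: Dict[str, float], specialist_share: float = 0.75) -> str:
    """Classify an institution from its staff counts per broad domain.

    `specialist_share` is an assumed cut-off: if a single domain holds at
    least this share of total staff, the institution counts as a specialist.
    Generalists are further split by the presence of a medical department.
    """
    total = sum(staff.get(d, 0.0) for d in DOMAINS)
    if total <= 0:
        raise ValueError("no staff data")
    shares = {d: staff.get(d, 0.0) / total for d in DOMAINS}
    if max(shares.values()) >= specialist_share:
        return "specialist"
    has_medical = staff.get("medical", 0.0) > 0
    return "generalist (with medical)" if has_medical else "generalist (no medical)"

# Example: a balanced institution with a medical department
print(classify({"medical": 120, "engineering": 300,
                "natural": 280, "social_humanities": 310}))
```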

Relevance:

100.00%

Publisher:

Abstract:

BACKGROUND: Functional brain images such as Single-Photon Emission Computed Tomography (SPECT) and Positron Emission Tomography (PET) have been widely used to guide clinicians in the diagnosis of Alzheimer's Disease (AD). However, the subjectivity involved in their evaluation has favoured the development of Computer Aided Diagnosis (CAD) systems. METHODS: We propose a novel combination of feature extraction techniques to improve the diagnosis of AD. First, Regions of Interest (ROIs) are selected by means of a t-test carried out on 3D Normalised Mean Square Error (NMSE) features restricted to lie within a predefined brain activation mask. To address the small-sample-size problem, the dimension of the feature space was further reduced by Large Margin Nearest Neighbours using a rectangular matrix (LMNN-RECT), Principal Component Analysis (PCA), or Partial Least Squares (PLS) (the latter two also analysed with an LMNN transformation). As classifiers, kernel Support Vector Machines (SVMs) and LMNN using Euclidean, Mahalanobis, and energy-based metrics were compared. RESULTS: Several experiments were conducted to evaluate the proposed LMNN-based feature extraction algorithms and their benefits as: i) a linear transformation of the PLS- or PCA-reduced data, ii) a feature reduction technique, and iii) a classifier (with Euclidean, Mahalanobis, or energy-based metrics). The system was evaluated by k-fold cross-validation, yielding accuracy, sensitivity and specificity values of 92.78%, 91.07% and 95.12% (for SPECT) and 90.67%, 88% and 93.33% (for PET), respectively, when the NMSE-PLS-LMNN feature extraction method was used in combination with an SVM classifier, thus outperforming recently reported baseline methods. CONCLUSIONS: All the proposed methods proved to be valid solutions for the presented problem. One advance is the robustness of the LMNN algorithm, which not only provides a higher separation rate between the classes but also (in combination with NMSE and PLS) makes this rate more stable. Generalization ability is another, since the experiments were performed on two image modalities (SPECT and PET).
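
The sketch below illustrates the general shape of such a pipeline (feature reduction followed by an SVM, evaluated with k-fold cross-validation) using scikit-learn. It uses PCA as a stand-in for the paper's PCA/PLS/LMNN variants and random placeholder data instead of real NMSE features; it is not the authors' implementation.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.model_selection import StratifiedKFold, cross_val_score
from sklearn.pipeline import Pipeline
from sklearn.svm import SVC

# X: one row of masked NMSE features per subject (placeholder random data
# here); y: diagnostic labels (0 = control, 1 = AD), also placeholders.
rng = np.random.default_rng(0)
X = rng.normal(size=(90, 500))
y = rng.integers(0, 2, size=90)

pipe = Pipeline([
    ("reduce", PCA(n_components=20)),  # stand-in for PCA/PLS/LMNN reduction
    ("clf", SVC(kernel="linear")),     # kernel SVM classifier
])

cv = StratifiedKFold(n_splits=10, shuffle=True, random_state=0)
scores = cross_val_score(pipe, X, y, cv=cv, scoring="accuracy")
print(f"accuracy: {scores.mean():.3f} +/- {scores.std():.3f}")
```

With real features, the same cross-validation loop would also report sensitivity and specificity, e.g. via `scoring="recall"` on each class.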

Relevance:

100.00%

Publisher:

Abstract:

The evaluation of large projects raises well-known difficulties because, by definition, such projects modify the prevailing price system; their public evaluation presents additional difficulties because they also modify the shadow prices that would exist without the project. This paper first analyzes the basic methodologies applied until the late 1980s, based on integrating projects into optimization models or, alternatively, on iterative procedures with information exchange between two organizational levels. The methodologies applied since then are based on variational inequalities, bilevel programming, and linear or nonlinear complementarity. Their foundations and their various applications to project evaluation are explored. These new tools are in fact closely related to one another and can treat more complex cases involving, for example, the reaction of agents to policies or the presence of multiple agents in an environment characterized by common functions representing demands or constraints on polluting emissions.
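
To make the tool class concrete: a variational inequality VI(F, K) asks for x in K with F(x)·(y − x) ≥ 0 for all y in K, and when K is the nonnegative orthant and F is affine this reduces to a linear complementarity problem. The sketch below implements the basic projection method on a toy instance; the data are made up, and this is only one of the several methods the paper surveys.

```python
import numpy as np

def solve_vi_projection(F, project, x0, step=0.05, tol=1e-8, max_iter=10_000):
    """Basic projection method for VI(F, K): find x in K with
    F(x).(y - x) >= 0 for all y in K.  Iterates x <- P_K(x - step*F(x)),
    which converges for strongly monotone, Lipschitz F and small steps."""
    x = x0.copy()
    for _ in range(max_iter):
        x_new = project(x - step * F(x))
        if np.linalg.norm(x_new - x) < tol:
            return x_new
        x = x_new
    return x

# Toy affine operator F(x) = A x + b with A positive definite and
# K = nonnegative orthant, i.e. a linear complementarity problem.
A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([-1.0, -2.0])
F = lambda x: A @ x + b
project = lambda x: np.maximum(x, 0.0)  # Euclidean projection onto K

x_star = solve_vi_projection(F, project, x0=np.zeros(2))
print(x_star)  # satisfies x >= 0, F(x) >= 0, x . F(x) = 0
```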

Relevance:

100.00%

Publisher:

Abstract:

Initial topography and inherited structural discontinuities are known to play a dominant role in rock slope stability. Previous 2-D physical modeling results demonstrated that, even though only a few preexisting fractures are activated or propagated during gravitational failure, these heterogeneities strongly influence the mobilized volume and its kinematics. The question we address in the present study is whether this result also holds in 3-D. As in the previous 2-D models, we examine a geologically stable model configuration based upon the well-documented landslide at Randa, Switzerland. The 3-D models consisted of a homogeneous material in which several fracture zones were introduced in order to study simplified but realistic configurations of discontinuities (based on a natural example rather than a parametric study). Results showed that the type of gravitational failure (deep-seated landslide or sequential failure) and the resulting evolution of slope morphology arise from the interplay of the initial topography and the inherited preexisting fractures (their orientation and density). The three main results are: i) the initial topography exerts a strong control on gravitational slope failure; indeed, in each tested configuration (even the isotropic one without fractures) the model is affected by a rock slide; ii) the number of simulated fracture sets greatly influences the volume mobilized and its kinematics; and iii) the failure zone involved in the 1991 event is smaller than that produced by the analog modeling. This discrepancy may indicate that the zone mobilized in 1991 is only part of a larger deep-seated landslide and/or a wider deep-seated gravitational slope deformation.

Relevance:

100.00%

Publisher:

Abstract:

A generic LC-MS approach for the absolute quantification of undigested peptides in plasma at mid-picomolar levels is described. Nine human peptides were targeted, namely brain natriuretic peptide (BNP), substance P (SubP), parathyroid hormone 1-34 (PTH), C-peptide, orexins A and B (Orex-A and -B), oxytocin (Oxy), gonadoliberin-1 (gonadotropin-releasing hormone or luteinizing hormone-releasing hormone, LHRH) and α-melanotropin (α-MSH). Plasma samples were extracted via a two-step procedure: protein precipitation using 1 vol of acetonitrile, followed by ultrafiltration of the supernatants on membranes with a MW cut-off of 30 kDa. Using a specific LC-MS setup, large volumes of filtrate (e.g., 2 × 750 μL) were injected and the peptides were trapped on a 1 mm i.d. × 10 mm C8 column using a 10× on-line dilution. The peptides were then back-flushed, and a second on-line dilution (2×) was applied during the transfer step. The refocused peptides were resolved on a 0.3 mm i.d. C18 analytical column. Extraction recovery, matrix effects and limits of detection were evaluated. Our comprehensive protocol demonstrates a simple and efficient sample preparation procedure followed by the analysis of peptides with limits of detection in the mid-picomolar range. This generic approach can be applied to the determination of most therapeutic peptides and possibly of endogenous peptides with the latest state-of-the-art instruments.

Relevance:

100.00%

Publisher:

Abstract:

INTRODUCTION: Infertility treatments are a major source of the increase in multiple pregnancies (MPs). AIMS: The aims of the present study were (1) to investigate the origin and maternal/neonatal outcomes of MPs and (2) to review the different measures that can be adopted to reduce these serious complications. METHODS: The study included all women with multiple births between 1 January 1995 and 31 December 2006 at the University Hospital of Bern, Switzerland. The outcomes associated with the various origins of MP (natural conception, ovarian stimulation [OS], in-vitro fertilisation [IVF-ICSI]) were analysed using a multinomial logistic regression model. An analysis of the Swiss law on reproductive medicine and its currently proposed revision, together with a literature review using PubMed, was carried out. RESULTS: A total of 592 MPs were registered, of which 91% (n = 537) resulted in live births. There was significantly more neonatal/maternal morbidity in MPs after OS than after natural conception, and even more than in the IVF-ICSI group. With a policy of elective single embryo transfer (eSET), twin rates after IVF-ICSI can be reduced to <5% and triplet rates to <1%. CONCLUSIONS: After OS, more triplets occur and the outcome of MP is worse. MP is known to be associated with morbidity, mortality, and economic and social risks. To counteract these complications, (1) better training should be encouraged for physicians performing OS and (2) the Swiss law on reproductive medicine needs to be changed, with the introduction of eSET policies. This would lead to a dramatic decrease in neonatal and maternal morbidity/mortality as well as significant cost reductions for the Swiss healthcare system.
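
As an illustration of the modelling step only, the sketch below fits a multinomial logistic regression with statsmodels on synthetic stand-in data; the variable names and their roles are assumptions, and no real study data are used.

```python
import numpy as np
import statsmodels.api as sm

# Synthetic stand-in data: a three-category MP origin
# (0 = natural conception, 1 = OS, 2 = IVF-ICSI) modelled against
# hypothetical covariates; none of this reflects the study's data.
rng = np.random.default_rng(1)
n = 500
maternal_age = rng.normal(32.0, 5.0, n)
neonatal_morbidity = rng.integers(0, 2, n)
origin = rng.integers(0, 3, n)

X = sm.add_constant(np.column_stack([maternal_age, neonatal_morbidity]))
result = sm.MNLogit(origin, X).fit(disp=False)
print(result.summary())
```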

Relevance:

100.00%

Publisher:

Abstract:

Based on a plenary lecture presented at the Tenth ANZFSS meeting of forensic science in Sydney (September 2010), this article identifies some of the difficulties arising from the confrontation of forensic science with the law: a science defined by its specialities rather than by its object (the trace), and seen through the eyes of the law rather than those of science. This situation has historical roots, which are highlighted; potential solutions for the future lie in fundamental and cultural developments within forensic science itself.

Relevance:

100.00%

Publisher:

Abstract:

This paper applies random matrix theory to obtain analytical characterizations of the capacity of correlated multiantenna channels. The analysis is not restricted to the popular separable correlation model; rather, it embraces a more general representation that subsumes most of the channel models treated in the literature. For arbitrary signal-to-noise ratios (SNR), the characterization is conducted in the regime of large numbers of antennas. For the low- and high-SNR regions, in turn, we uncover compact capacity expansions that are valid for arbitrary numbers of antennas and that shed light on how antenna correlation impacts the tradeoffs between power, bandwidth and rate.
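
For intuition, the sketch below estimates ergodic capacity by Monte Carlo under the separable (Kronecker) correlation model, which is the special case the paper generalizes; the exponential correlation profile and all parameter values are our own assumptions, not figures from the paper.

```python
import numpy as np

def ergodic_capacity(nt, nr, snr_db, R_t, R_r, n_trials=2000, rng=None):
    """Monte Carlo ergodic capacity (bits/s/Hz) of a Kronecker-correlated
    MIMO channel H = R_r^{1/2} H_w R_t^{1/2} with equal power allocation."""
    rng = rng or np.random.default_rng(0)
    snr = 10 ** (snr_db / 10)

    def sqrtm(R):  # matrix square root via eigendecomposition (R Hermitian PSD)
        w, V = np.linalg.eigh(R)
        return V @ np.diag(np.sqrt(np.clip(w, 0, None))) @ V.conj().T

    Rt_h, Rr_h = sqrtm(R_t), sqrtm(R_r)
    cap = 0.0
    for _ in range(n_trials):
        # i.i.d. CN(0, 1) entries
        Hw = (rng.standard_normal((nr, nt))
              + 1j * rng.standard_normal((nr, nt))) / np.sqrt(2)
        H = Rr_h @ Hw @ Rt_h
        cap += np.log2(np.linalg.det(
            np.eye(nr) + (snr / nt) * H @ H.conj().T).real)
    return cap / n_trials

def exp_corr(n, rho):
    """Exponential correlation profile R[i, j] = rho^|i-j| (assumed model)."""
    idx = np.arange(n)
    return rho ** np.abs(idx[:, None] - idx[None, :])

print(ergodic_capacity(4, 4, snr_db=10,
                       R_t=exp_corr(4, 0.7), R_r=exp_corr(4, 0.3)))
```

Increasing `nt` and `nr` in this sketch shows the large-antenna behavior the analytical characterizations capture; setting both correlation coefficients to zero recovers the uncorrelated i.i.d. Rayleigh case.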

Relevance:

100.00%

Publisher:

Abstract:

Most economic interactions happen in a context of sequential exchange in which innocent third parties suffer information asymmetry with respect to previous "originative" contracts. The law reduces transaction costs by protecting these third parties, but it preserves some element of consent by property right holders to avoid damaging property enforcement: for example, it is they, as principals, who authorize agents in originative contracts. Judicial verifiability of these originative contracts is obtained either as an automatic byproduct of transactions or, when they would otherwise have remained private, by requiring that they be made public. Protecting third parties produces a sort of legal commodity that is easy to trade impersonally, improving the allocation and specialization of resources. The historical delay in generalizing this legal commoditization paradigm is attributed to path dependency (the law first developed for personal trade) and to an imbalance in vested interests, as Luddite legal professionals face weak public bureaucracies.