97 results for Combinatorial Designs


Relevance:

10.00%

Publisher:

Abstract:

This article designs what it calls a Credit-Risk Balance Sheet (the risk being that of default by customers), a tool which, in principle, can contribute to revealing, controlling and managing the bad debt risk arising from a company's commercial credit, whose amount can represent a significant proportion of both its current and total assets. To construct it, we start from the duality observed in any credit transaction of this nature, whose basic identity can be summed up as Credit = Risk. 'Credit' is granted by a company to its customer and can be ranked by quality (we suggest a credit scoring system), while 'risk' can either be assumed (interiorised) by the company itself or transferred to third parties (exteriorised). What gives the Credit-Risk Balance Sheet its methodological robustness, and allows us to speak with confidence of a real balance sheet, is that the dual vision of the credit transaction is not, as we demonstrate, merely a classificatory duality (a double risk-credit classification of reality) but a true causal relationship, that is, a risk-credit causal duality. Once this Credit-Risk Balance Sheet (which bears a certain structural similarity to the classic net asset balance sheet) has been built and its methodological coherence demonstrated, its static and dynamic properties are studied. Analysis of the temporal evolution of the Credit-Risk Balance Sheet and of its applications will be the object of subsequent works.

Relevance:

10.00%

Publisher:

Abstract:

We present our recent achievements in the growth and optical characterization of KYb(WO4)2 (hereafter KYbW) crystals and demonstrate laser operation in this stoichiometric material. Single crystals of KYbW with optimal crystalline quality have been grown by the top-seeded solution growth slow-cooling method. The optical anisotropy of this monoclinic crystal has been characterized by locating the optical indicatrix tensor and measuring the dispersion of the principal values of the refractive indices as well as the thermo-optic coefficients. Sellmeier equations valid in the visible and near-IR spectral range have been constructed. Raman scattering has been used to determine the phonon energies of KYbW, and a simple physical model is applied to classify the lattice vibration modes. Spectroscopic studies (absorption and emission measurements at room and low temperature) have been carried out in the spectral region near 1 µm, characteristic of the ytterbium transition. The energy positions of the Stark sublevels of the ground and excited state manifolds have been determined and the vibronic substructure identified. The intrinsic lifetime of the upper laser level has been measured, taking care to suppress the effect of reabsorption, and the intrinsic quantum efficiency has been estimated. Lasing has been demonstrated near 1074 nm with 41% slope efficiency at room temperature using a 0.5 mm thin plate of KYbW. This laser material holds great promise for diode-pumped high-power lasers, thin-disk and waveguide designs, as well as for ultrashort (ps/fs) pulse laser systems.
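
The abstract does not reproduce the fitted Sellmeier coefficients; as a rough illustration only, the sketch below evaluates the canonical two-term Sellmeier dispersion relation, n^2(lam) = 1 + sum_i B_i*lam^2/(lam^2 - C_i). The coefficients used are placeholders, not the published KYbW values.

import math

def sellmeier_index(lam_um, B=(1.5, 0.5), C=(0.02, 100.0)):
    """Canonical two-term Sellmeier relation, wavelength in micrometres.
    B and C are hypothetical placeholder coefficients, NOT the KYbW values."""
    lam2 = lam_um ** 2
    n2 = 1.0 + sum(b * lam2 / (lam2 - c) for b, c in zip(B, C))
    return math.sqrt(n2)

print(sellmeier_index(1.074))  # index near the 1074 nm laser line (illustrative)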

Relevance:

10.00%

Publisher:

Abstract:

In this article we address the use and importance of the statistical tools mainly employed in medical studies in oncology and haematology, although they are applicable to many other fields, whether medical, experimental or industrial. The aim of this work is to present, clearly and precisely, the statistical methodology needed to analyse the data obtained in such studies rigorously and concisely with respect to the working hypotheses posed by the researchers. The chosen measure of response to treatment and the type of study selected determine both the statistical methods to be used in analysing the study data and the sample size. Through correct application of statistical analysis and adequate planning, it can be determined whether the relationship found between exposure to a treatment and an outcome is due to chance or, on the contrary, reflects a non-random association that could establish causality. We have reviewed the main design types of the most commonly used medical studies, such as clinical trials and observational studies (cohort, case-control, prevalence and ecological studies). Sections are also included on how to calculate the sample size of a study, which statistical test should be used, measures of effect strength such as the odds ratio (OR) and relative risk (RR), and survival analysis. Examples are provided in most sections of the article, together with the most relevant references.
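
As a brief illustration of the effect-strength measures mentioned above (the 2x2 table counts are invented for this example, not taken from the article), the sketch below computes the odds ratio and the relative risk from counts of exposed and unexposed subjects with and without the outcome.

def odds_ratio_and_relative_risk(a, b, c, d):
    """2x2 table: a = exposed with outcome, b = exposed without,
    c = unexposed with outcome, d = unexposed without."""
    odds_ratio = (a * d) / (b * c)
    relative_risk = (a / (a + b)) / (c / (c + d))
    return odds_ratio, relative_risk

# Hypothetical counts for illustration only.
or_, rr = odds_ratio_and_relative_risk(a=30, b=70, c=15, d=85)
print(f"OR = {or_:.2f}, RR = {rr:.2f}")  # OR = 2.43, RR = 2.00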

Relevance:

10.00%

Publisher:

Abstract:

A common way to model multiclass classification problems is by means of Error-Correcting Output Codes (ECOC). Given a multiclass problem, the ECOC technique designs a code word for each class, where each position of the code identifies the membership of the class in a given binary problem. A classification decision is obtained by assigning the label of the class with the closest code. One of the main requirements of the ECOC design is that the base classifier be capable of splitting each subgroup of classes in each binary problem. However, we cannot guarantee that a linear classifier can model convex regions, and nonlinear classifiers also fail to handle some types of surfaces. In this paper, we present a novel strategy to model multiclass classification problems using subclass information in the ECOC framework. Complex problems are solved by splitting the original set of classes into subclasses and embedding the binary problems in a problem-dependent ECOC design. Experimental results show that the proposed splitting procedure yields better performance when the class overlap or the distribution of the training objects conceals the decision boundaries for the base classifier. The results are even more significant when the training set is sufficiently large.
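
As a minimal sketch of the standard ECOC decoding step described above (the coding matrix and outputs are made up; this is not the subclass-splitting procedure proposed in the paper), a test sample is assigned to the class whose code word lies closest, in Hamming distance, to the vector of binary-classifier outputs.

import numpy as np

# Hypothetical ECOC coding matrix: one row (code word) per class,
# one column per binary problem; entries in {-1, +1}.
coding_matrix = np.array([
    [+1, +1, -1],
    [+1, -1, +1],
    [-1, +1, +1],
])

def ecoc_decode(binary_outputs, coding_matrix):
    """Return the index of the class whose code word is closest
    (in Hamming distance) to the binary classifier outputs."""
    distances = np.sum(binary_outputs != coding_matrix, axis=1)
    return int(np.argmin(distances))

# Outputs of the three binary classifiers for one test sample (hypothetical).
outputs = np.array([+1, -1, +1])
print(ecoc_decode(outputs, coding_matrix))  # -> 1 (second class)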

Relevance:

10.00%

Publisher:

Abstract:

Background: Hospitals in countries with public health systems have recently adopted organizational changes to improve efficiency and resource allocation, and reducing inappropriate hospitalizations has been established as an important goal. Aims: Our goal was to describe the functioning of a Quick Diagnosis Unit in a Spanish public university hospital after evaluating 1,000 consecutive patients. We also aimed to ascertain the degree of satisfaction among Quick Diagnosis Unit patients and the costs of the model compared to conventional hospitalization practices. Design: Observational, descriptive study. Methods: Our sample comprised 1,000 patients evaluated between November 2008 and January 2010 in the Quick Diagnosis Unit of a tertiary public university hospital in Barcelona. Included patients were those who had potentially severe diseases and would normally require hospital admission for diagnosis but whose general condition allowed outpatient treatment. We analyzed several variables, including time to diagnosis, final diagnoses and hospitalizations avoided, and we also investigated the mean cost (as compared to conventional hospitalization) and the patients' satisfaction. Results: In 88% of cases, the reasons for consultation were anemia, anorexia-cachexia syndrome, febrile syndrome, adenopathies, abdominal pain, chronic diarrhea and lung abnormalities. The most frequent diagnoses were cancer (18.8%; mainly colon cancer and lymphoma) and iron-deficiency anemia (18%). The mean time to diagnosis was 9.2 days (range 1 to 19 days). An estimated 12.5 admissions/day in a one-year period (in the internal medicine department) were avoided. In a subgroup analysis, the mean cost per process (admission-discharge) for a conventional hospitalization was 3,416.13 Euros, while it was 735.65 Euros in the Quick Diagnosis Unit. Patients expressed a high degree of satisfaction with Quick Diagnosis Unit care. Conclusions: Quick Diagnosis Units represent a useful and cost-saving model for the diagnostic study of patients with potentially severe diseases. Future randomized study designs involving comparisons between controls and intervention groups would help elucidate the usefulness of Quick Diagnosis Units as an alternative to conventional hospitalization.

Relevance:

10.00%

Publisher:

Abstract:

The main goal of this observational and descriptive study is to evaluate whether the diagnosis axis of a nursing interface terminology meets the content validity criterion of being nursing-phenomena oriented. Nursing diagnosis concepts were analyzed in terms of presence in the nursing literature, type of articles published and areas of disciplinary interest. The search strategy was conducted in three databases with limits on period and languages. The final analysis included 287 nursing diagnosis concepts. The results showed that most of the concepts were identified in the scientific literature, with a homogeneous distribution of types of designs. Most of these concepts (87.7%) were studied from two or more areas of disciplinary interest. Validity studies on disciplinary controlled vocabularies may contribute to demonstrating the nursing influence on patients' outcomes.

Relevance:

10.00%

Publisher:

Abstract:

If single-case experimental designs are to be used to establish guidelines for evidence-based interventions in clinical and educational settings, numerical values that reflect treatment effect sizes are required. The present study compares four recently developed procedures for quantifying the magnitude of intervention effect using data with known characteristics. Monte Carlo methods were used to generate AB design data with potential confounding variables (serial dependence, linear and curvilinear trend, and heteroscedasticity between phases) and two types of treatment effect (level and slope change). The results suggest that data features are important for choosing the appropriate procedure and, thus, inspecting the graphed data visually is a necessary initial stage. In the presence of serial dependence or a change in data variability, the Nonoverlap of All Pairs (NAP) and the Slope and Level Change (SLC) were the only techniques of the four examined that performed adequately. Introducing a data correction step in NAP renders it unaffected by linear trend, as is also the case for the Percentage of Nonoverlapping Corrected Data and SLC. The performance of these techniques indicates that professionals' judgments concerning treatment effectiveness can be readily complemented by both visual and statistical analyses. A flowchart to guide selection of techniques according to the data characteristics identified by visual inspection is provided.
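
For readers unfamiliar with one of the indices compared above, a minimal sketch of the Nonoverlap of All Pairs (NAP) computation for an AB design follows (the data are invented; the corrected-data and SLC variants studied in the paper are not reproduced here). NAP is the proportion of all baseline-intervention pairs in which the intervention observation exceeds the baseline one, with ties counted as one half.

def nap(phase_a, phase_b):
    """Nonoverlap of All Pairs for an AB single-case design:
    proportion of (a, b) pairs with b > a, counting ties as 0.5."""
    pairs = [(a, b) for a in phase_a for b in phase_b]
    score = sum(1.0 if b > a else 0.5 if b == a else 0.0 for a, b in pairs)
    return score / len(pairs)

# Hypothetical baseline (A) and intervention (B) measurements.
print(nap([3, 4, 4, 5], [5, 6, 7, 7]))  # -> 0.96875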

Relevance:

10.00%

Publisher:

Abstract:

We describe the effect of guanidinylation of the aminoglycoside moiety on acridine-neamine-containing ligands for the stem-loop structure located at the exon 10-5′-intron junction of Tau pre-mRNA, an important regulatory element of tau gene alternative splicing. On the basis of dynamic combinatorial chemistry experiments, ligands that combine guanidinoneamine and two different acridines were synthesized and their RNA-binding properties were compared with those of their amino precursors. Fluorescence titration experiments and UV-monitored melting curves revealed that guanidinylation has a positive effect on both the binding affinity and the specificity of the ligands for the stem-loop RNA, as well as on the stabilization of all RNA sequences evaluated, particularly some mutated sequences associated with the development of FTDP-17 tauopathy. However, this correlation between binding affinity and stabilization due to guanidinylation was only found in ligands containing a longer spacer between the acridine and guanidinoneamine moieties, since a shorter spacer produced the opposite effect (i.e. lower binding affinity and lower stabilization). Furthermore, spectroscopic studies suggest that ligand binding does not significantly change the overall RNA structure (circular dichroism) and that the acridine moiety might intercalate near the bulged region of the stem-loop structure (UV-Vis and NMR spectroscopy).

Relevance:

10.00%

Publisher:

Abstract:

Introduction: Teodorico (2004) tells us that through cooperative play pupils can assess, share and reflect on the relationships they establish with their classmates. The aim of my research is therefore to evaluate the effect that applying a teaching unit centred on cooperative games can have on the relationships established among 5th-year primary pupils, focusing attention on the leader and on the least accepted member of the class group. Methods: The intervention is evaluated with a quasi-experimental pre-post design with a control group. From the point of view of observational designs, it is a follow-up, idiographic and multidimensional design. The total sample comprised 48 pupils, divided into an experimental group (25) and a control group (23). The intervention lasted 8 weeks, and an ad hoc questionnaire was used to construct the sociograms, together with an ad hoc observational instrument for class-group relationships. Results: The experimental group was initially made up of 4 groups of pupils and by the end of the unit only 2 remained, whereas the control group was made up of the same groups of pupils both before and after. One of the least accepted pupils in the experimental group improved his relationship with some classmates, whereas the least accepted pupil in the control group did not improve any relationship. Conclusions: I believe cooperative games are a good educational tool for trying to improve relationships, and with more time I believe the results of this research would have been more positive.

Relevance:

10.00%

Publisher:

Abstract:

Our work focuses on alleviating the workload of designers of adaptive courses in the complex task of authoring adaptive learning designs adjusted to specific user characteristics and the user context. We propose an adaptation platform consisting of a set of intelligent agents, where each agent carries out an independent adaptation task. The agents apply machine learning techniques to support user modelling for the adaptation process.

Relevance:

10.00%

Publisher:

Abstract:

The restricted maximum likelihood is preferred by many to the full maximum likelihood for estimation with variance component and other random-coefficient models, because the variance estimator is unbiased. It is shown that this unbiasedness is accompanied in some balanced designs by an inflation of the mean squared error. An estimator of the cluster-level variance that is uniformly more efficient than the full maximum likelihood is derived. Estimators of the variance ratio are also studied.
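
The bias/MSE trade-off the abstract refers to can already be seen in the simplest special case, an i.i.d. normal sample, where the unbiased variance estimator (divisor n-1) has a larger mean squared error than the maximum-likelihood one (divisor n). The Monte Carlo sketch below, with an invented sample size and variance, illustrates only this elementary case, not the cluster-level estimator derived in the paper.

import numpy as np

rng = np.random.default_rng(0)
n, sigma2, n_rep = 10, 4.0, 200_000  # hypothetical settings

samples = rng.normal(0.0, np.sqrt(sigma2), size=(n_rep, n))
ss = np.sum((samples - samples.mean(axis=1, keepdims=True)) ** 2, axis=1)

mse_unbiased = np.mean((ss / (n - 1) - sigma2) ** 2)  # unbiased (REML-like) estimator
mse_ml = np.mean((ss / n - sigma2) ** 2)              # maximum-likelihood estimator

# Theory for this case: 2*sigma^4/(n-1) = 3.56 versus (2n-1)*sigma^4/n^2 = 3.04
print(mse_unbiased, mse_ml)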

Relevance:

10.00%

Publisher:

Abstract:

Process variations are a major bottleneck for the manufacturability and yield of digital CMOS integrated circuits. That is why regular layout techniques with different degrees of regularity are emerging as possible solutions. Our proposal is a new regular layout design technique called Via-Configurable Transistors Array (VCTA) that pushes circuit layout regularity for devices and interconnects to the limit in order to maximize the benefits of regularity. VCTA is predicted to perform worse than Standard Cell designs for a given technology node, but it will allow a future technology to be used at an earlier time. Our objective is to optimize VCTA so that it is comparable to the Standard Cell design in an older technology. Simulations of delay and energy consumption for the first unoptimized version of our VCTA Full Adder circuit in the 90 nm technology node are presented, together with the extrapolation to Carry-Ripple Adders from 4 bits to 64 bits.

Relevance:

10.00%

Publisher:

Abstract:

In this paper we study the reconstruction of a network topology from the values of its betweenness centrality, a measure of the influence of each of its nodes on the dissemination of information over the network. We consider a simple metaheuristic, simulated annealing, as the combinatorial optimization method to generate the network from the values of the betweenness centrality. We compare the performance of this technique when reconstructing different categories of networks (random, regular, small-world, scale-free and clustered). We show that the method allows an exact reconstruction of small networks and leads to good topological approximations in the case of networks of larger order. The method can be used to generate a quasi-optimal topology for a communication network from a list of the values of the maximum allowable traffic for each node.
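
A minimal sketch of the kind of simulated annealing loop described above follows; the move set (random edge rewiring), cooling schedule and cost function (sum of squared betweenness differences) are illustrative assumptions of mine, not necessarily those used in the paper, and networkx is assumed to be available.

import math
import random
import networkx as nx

def reconstruct_from_betweenness(target, n_edges, steps=20_000, t0=1.0, alpha=0.9995):
    """Search for a graph whose betweenness centrality matches `target`
    (a dict node -> value) using a simple simulated annealing loop."""
    nodes = list(target)
    graph = nx.gnm_random_graph(len(nodes), n_edges)        # random initial topology
    graph = nx.relabel_nodes(graph, dict(enumerate(nodes)))

    def cost(g):
        bc = nx.betweenness_centrality(g)
        return sum((bc[v] - target[v]) ** 2 for v in nodes)

    current, temperature = cost(graph), t0
    for _ in range(steps):
        candidate = graph.copy()
        # Move: remove a random existing edge and add a random missing one.
        candidate.remove_edge(*random.choice(list(candidate.edges)))
        u, v = random.sample(nodes, 2)
        while candidate.has_edge(u, v):
            u, v = random.sample(nodes, 2)
        candidate.add_edge(u, v)
        new = cost(candidate)
        # Metropolis acceptance: always accept improvements, sometimes accept worse moves.
        if new < current or random.random() < math.exp((current - new) / temperature):
            graph, current = candidate, new
        temperature *= alpha
    return graph, current

For a small self-test one could, for example, take target = nx.betweenness_centrality(nx.karate_club_graph()) with n_edges set to that graph's edge count and check how close the returned cost is to zero.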

Relevance:

10.00%

Publisher:

Abstract:

In this paper, we investigate the average and outage performance of spatial multiplexing multiple-input multiple-output (MIMO) systems with channel state information at both sides of the link. Such systems result, for example, from exploiting the channel eigenmodes in multiantenna systems. Due to the complexity of obtaining the exact expressions for the average bit error rate (BER) and the outage probability, we derive approximations in the high signal-to-noise ratio (SNR) regime assuming an uncorrelated Rayleigh flat-fading channel. More exactly, capitalizing on previous work by Wang and Giannakis, the average BER and outage probability versus SNR curves of spatial multiplexing MIMO systems are characterized in terms of two key parameters: the array gain and the diversity gain. Finally, these results are applied to analyze the performance of a variety of linear MIMO transceiver designs available in the literature.
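
In the high-SNR characterization referred to above, the error probability behaves asymptotically as P_e ~ (Ga * SNR)^(-Gd), where Ga is the array (coding) gain and Gd the diversity gain. The short sketch below, with invented gain values, simply evaluates this asymptote to show how the two parameters shape the BER-versus-SNR curve; it is not the paper's derivation.

import numpy as np

def high_snr_ber(snr_db, array_gain, diversity_gain):
    """Asymptotic high-SNR error probability P_e ~ (Ga * SNR)**(-Gd)."""
    snr = 10.0 ** (np.asarray(snr_db) / 10.0)
    return (array_gain * snr) ** (-diversity_gain)

# Hypothetical gains for illustration: the diversity gain sets the slope of the
# log-log BER curve, while the array gain shifts the curve horizontally.
snr_db = np.arange(10, 41, 5)
print(high_snr_ber(snr_db, array_gain=2.0, diversity_gain=4.0))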