49 results for Parallel version
Abstract:
Describes and analyzes the process of European integration from its beginnings in the 1950s to the present day. The aim is to offer an overview of a process that began modestly with the integration of the coal and steel markets of the six founding members and has resulted in a European Union of 27 member states and more than 500 million inhabitants, with common policies, close cooperation in the most sensitive areas of state sovereignty, and a single currency in sixteen states.
Abstract:
This project proposes an application for the automatic calibration of P-system models. To this end, we first study P-system models and the procedure researchers follow to develop this type of model. A first serial solution to the problem will be developed and its weak points analyzed. A parallel version will then be proposed that significantly improves execution time while maintaining high efficiency and scalability.
Abstract:
This paper derives a model of markets with system goods and two technological standards. An established standard incurs lower unit production costs but causes a negative externality. The paper derives the conditions for policy intervention and compares the effect of direct and indirect cost-reducing subsidies in two markets with system goods in the presence of externalities. If consumers are committed to the technology by purchasing one of the components, direct subsidies are preferable. For a medium-low cost difference between technological standards and a low externality cost, it is optimal to provide a direct subsidy only to the first technology adopter. The higher the externality cost, the more technology adopters should be provided with direct subsidies. This effect is robust across all extensions. In the absence of consumers' commitment to a technological standard, indirect and direct subsidies are both desirable. In this case, the subsidy to the first adopter is lower than the subsidy to the second adopter. Moreover, for a low cost difference between technological standards and a low externality cost, the first firm chooses a superior standard without policy intervention. Finally, perfect compatibility between components based on different technological standards enhances the advantage of indirect subsidies for a medium-high externality cost and cost difference between technological standards. Journal of Economic Literature Classification Numbers: C72, D21, D40, H23, L13, L22, L51, O25, O33, O38. Keywords: Technological standards; complementary products; externalities; cost-reducing subsidies; compatibility.
Abstract:
The mutual information of independent parallel Gaussian-noise channels is maximized, under an average power constraint, by independent Gaussian inputs whose power is allocated according to the waterfilling policy. In practice, discrete signalling constellations with limited peak-to-average ratios (m-PSK, m-QAM, etc.) are used in lieu of the ideal Gaussian signals. This paper gives the power allocation policy that maximizes the mutual information over parallel channels with arbitrary input distributions. Such a policy admits a graphical interpretation, referred to as mercury/waterfilling, which generalizes the waterfilling solution and allows retaining some of its intuition. The relationship between the mutual information of Gaussian channels and the nonlinear minimum mean-square error proves key to solving the power allocation problem.
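The classical waterfilling policy this abstract generalizes can be sketched numerically: given per-channel noise levels and a total power budget, bisect on the water level until the allocated power matches the budget. This is a minimal sketch of plain Gaussian-input waterfilling, not the paper's mercury/waterfilling extension; the function name and bisection depth are illustrative.

```python
def waterfilling(noise, total_power, iters=100):
    """Allocate p_i = max(0, mu - n_i) so that sum(p_i) == total_power,
    finding the water level mu by bisection."""
    lo, hi = min(noise), min(noise) + total_power
    for _ in range(iters):
        mu = (lo + hi) / 2
        used = sum(max(0.0, mu - n) for n in noise)
        if used > total_power:
            hi = mu  # water level too high: pouring too much power
        else:
            lo = mu  # too low: budget not exhausted
    return [max(0.0, mu - n) for n in noise]
```

Channels with noise above the final water level receive zero power, which matches the intuition of pouring a fixed amount of water over an uneven bottom.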
Abstract:
We address the problem of scheduling a multiclass $M/M/m$ queue with Bernoulli feedback on $m$ parallel servers to minimize time-average linear holding costs. We analyze the performance of a heuristic priority-index rule, which extends Klimov's optimal solution to the single-server case: servers select preemptively customers with larger Klimov indices. We present closed-form suboptimality bounds (approximate optimality) for Klimov's rule, which imply that its suboptimality gap is uniformly bounded above with respect to (i) external arrival rates, as long as they stay within system capacity; and (ii) the number of servers. It follows that its relative suboptimality gap vanishes in a heavy-traffic limit, as external arrival rates approach system capacity (heavy-traffic optimality). We obtain simpler expressions for the special no-feedback case, where the heuristic reduces to the classical $c \mu$ rule. Our analysis is based on comparing the expected cost of Klimov's rule to the value of a strong linear programming (LP) relaxation of the system's region of achievable performance of mean queue lengths. In order to obtain this relaxation, we derive and exploit a new set of work decomposition laws for the parallel-server system. We further report on the results of a computational study on the quality of the $c \mu$ rule for parallel scheduling.
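In the no-feedback case mentioned above, the $c \mu$ rule amounts to serving waiting customers in decreasing order of the product of holding cost rate and service rate. A minimal sketch, with illustrative class names and rates:

```python
def c_mu_priority(customers):
    """c-mu rule: serve classes with the largest c_i * mu_i first.
    `customers` is a list of (name, c, mu) tuples, where c is the
    holding cost rate and mu the service rate of the class."""
    return sorted(customers, key=lambda t: t[1] * t[2], reverse=True)

order = c_mu_priority([("A", 1.0, 2.0), ("B", 3.0, 1.0), ("C", 2.0, 2.0)])
# indices: A = 2.0, B = 3.0, C = 4.0 -> serve C, then B, then A
```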
Abstract:
This paper analyzes the nature of health care provider choice in the case of patient-initiated contacts, with special reference to a National Health Service setting, where monetary prices are zero and general practitioners act as gatekeepers to publicly financed specialized care. We focus our attention on the factors that may explain the continuously increasing use of hospital emergency visits as opposed to other provider alternatives. An extended version of a discrete choice model of demand for patient-initiated contacts is presented, allowing for individual and town residence size differences in perceived quality (preferences) between alternative providers and including travel and waiting time as non-monetary costs. Results of a nested multinomial logit model of provider choice are presented. Individual choice between alternatives considers, in a repeated nested structure, self-care, primary care, hospital and clinic emergency services. Welfare implications and income effects are analyzed by computing compensating variations, and by simulating the effects of user fees by levels of income. Results indicate that compensating variation per visit is higher than the direct marginal cost of emergency visits, and consequently, emergency visits do not appear as an inefficient alternative even for non-urgent conditions.
Abstract:
Background: The NDI, COM and NPQ are evaluation instruments for disability due to NP. There was no Spanish version of the NDI or COM for which psychometric characteristics were known. The objectives of this study were to translate and culturally adapt the Spanish version of the Neck Disability Index Questionnaire (NDI) and the Core Outcome Measure (COM), to validate their use in Spanish-speaking patients with non-specific neck pain (NP), and to compare their psychometric characteristics with those of the Spanish version of the Northwick Pain Questionnaire (NPQ). Methods: Translation/re-translation of the English versions of the NDI and the COM was done blindly and independently by a multidisciplinary team. The study was done in 9 primary care centers and 12 specialty services from 9 regions in Spain, with 221 acute, subacute and chronic patients who visited their physician for NP: 54 in the pilot phase and 167 in the validation phase. Neck pain (VAS), referred pain (VAS), disability (NDI, COM and NPQ), catastrophizing (CSQ) and quality of life (SF-12) were measured on the first visit and 14 days later. Patients' self-assessment was used as the external criterion for pain and disability. In the pilot phase, patients' understanding of each item in the NDI and COM was assessed, and on day 1 test-retest reliability was estimated by giving a second NDI and COM in which the name of the questionnaires and the order of the items had been changed. Results: Comprehensibility of the NDI and COM was good. Minutes needed to fill out the questionnaires [median (P25, P75)]: NDI: 4 (2.2, 10.0); COM: 2.1 (1.0, 4.9). Reliability [ICC (95% CI)]: NDI: 0.88 (0.80, 0.93); COM: 0.85 (0.75, 0.91). Sensitivity to change: effect sizes for patients having worsened, not changed and improved between days 1 and 15, according to the external criterion for disability: NDI: -0.24, 0.15, 0.66; NPQ: -0.14, 0.06, 0.67; COM: 0.05, 0.19, 0.92.
Validity: Results of the NDI, NPQ and COM were consistent with the external criterion for disability, whereas only those from the NDI were consistent with the one for pain. Correlations with VAS, CSQ and SF-12 were similar for the NDI and NPQ (absolute values between 0.36 and 0.50 on day 1, between 0.38 and 0.70 on day 15), and slightly lower for the COM (between 0.36 and 0.48 on day 1, and between 0.33 and 0.61 on day 15). Correlation between NDI and NPQ: r = 0.84 on day 1, r = 0.91 on day 15. Correlation between COM and NPQ: r = 0.63 on day 1, r = 0.71 on day 15. Conclusion: Although most psychometric characteristics of the NDI, NPQ and COM are similar, those of the COM are worse, and its use may make patients' evolution seem more positive than it actually is. The NDI seems to be the best instrument for measuring NP-related disability, since its results are the most consistent with patients' assessment of their own clinical status and evolution. It takes two more minutes to answer the NDI than the COM, but it can be reliably filled out by the patient without assistance.
Abstract:
Background: Obesity may have an impact on key aspects of health-related quality of life (HRQOL). In this context, the Impact of Weight on Quality of Life (IWQOL) questionnaire was the first scale designed to assess HRQOL. The aim of the present study was twofold: to assess HRQOL in a sample of Spanish patients awaiting bariatric surgery and to determine the psychometric properties of the IWQOL-Lite and its sensitivity to detect differences in HRQOL across groups. Methods: Participants were 109 obese adult patients (BMI ≥ 35 kg/m²) from Barcelona, to whom the following measurement instruments were applied: IWQOL-Lite, Depression Anxiety Stress Scales, Brief Symptom Inventory, and self-perception items. Results: Descriptive data regarding the IWQOL-Lite scores obtained by these patients are reported. Principal components analysis revealed a five-factor model accounting for 72.05% of the total variance, with factor loadings being adequate for all items. Corrected item-total correlations were acceptable for all items. Cronbach's alpha coefficients were excellent both for the subscales (0.88–0.93) and the total scale (0.95). The relationship between the IWQOL-Lite and other variables supports the construct validity of the scale. Finally, sensitivity analysis revealed large effect sizes when comparing scores obtained by extreme BMI groups. Conclusions: This is the first study to report the application of the IWQOL-Lite to a sample of Spanish patients awaiting bariatric surgery and to confirm that the Spanish version of the instrument has adequate psychometric properties.
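Cronbach's alpha, reported above for the IWQOL-Lite subscales and total scale, is computed from item-score columns as alpha = k/(k-1) * (1 - sum(item variances) / variance(totals)). A minimal sketch from that standard formula; the input layout is illustrative, not the study's data:

```python
def cronbach_alpha(items):
    """Cronbach's alpha for a list of item-score columns of equal length.
    Each column holds one item's scores across all respondents."""
    k = len(items)            # number of items
    n = len(items[0])         # number of respondents

    def var(xs):
        # sample variance (n - 1 denominator)
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    totals = [sum(col[i] for col in items) for i in range(n)]
    return k / (k - 1) * (1 - sum(var(col) for col in items) / var(totals))
```

Perfectly correlated items push alpha toward 1, uncorrelated items toward 0, which is why it serves as an internal-consistency index.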
Abstract:
The aim of this study was to adapt and assess the psychometric properties of the Spanish version of the sMARS in terms of evidence of validity and reliability of scores. The sMARS was administered to 342 students and, in order to assess convergent and discriminant validity, several subsamples completed a series of related tests. The factorial structure of the sMARS was analyzed by means of a confirmatory factor analysis, and the results showed that the three-factor structure reported in the original test fits the data well. Thus, three dimensions were established in the test: math test, numerical task and math course anxiety. The results of this study provide sound evidence of the good psychometric properties of the scores of the Spanish version of the sMARS: strong internal consistency, high 7-week test-retest reliability and good convergent/discriminant validity were evident. Overall, this study provides an instrument that allows us to obtain valid and reliable math anxiety measurements. This instrument may be a useful tool for educators and psychologists interested in identifying individuals who may have a low level of math mastery because of their anxiety.
Abstract:
The coupling between topography, waves and currents in the surf zone may self-organize to produce the formation of shore-transverse or shore-oblique sand bars on an otherwise alongshore-uniform beach. In the absence of shore-parallel bars, this has been shown by previous studies of linear stability analysis, and it is now extended to the finite-amplitude regime. To this end, a nonlinear model coupling wave transformation and breaking, a shallow-water equations solver, sediment transport and bed updating is developed. The sediment flux consists of a stirring factor multiplied by the depth-averaged current plus a downslope correction. It is found that the cross-shore profile of the ratio of stirring factor to water depth, together with the wave incidence angle, primarily determines the shape and the type of bars, either transverse or oblique to the shore. In the latter case, they can open an acute angle against the current (up-current oriented) or with the current (down-current oriented). At the initial stages of development, both the intensity of the instability responsible for the formation of the bars and the damping due to downslope transport grow at a similar rate with bar amplitude, the former being somewhat stronger. As the bars keep growing, their finite-amplitude shape either enhances downslope transport or weakens the instability mechanism, so that an equilibrium between the two opposing tendencies occurs, leading to a final saturated amplitude. The overall shape of the saturated bars in plan view is similar to that of the small-amplitude ones. However, the final spacings may be up to a factor of 2 larger, and final celerities can also be about a factor of 2 smaller or larger. In the case of alongshore-migrating bars, the asymmetry of the longshore sections, the lee being steeper than the stoss, is well reproduced. Complex dynamics with merging and splitting of individual bars sometimes occur.
Finally, in the case of shore-normal incidence, the rip currents in the troughs between the bars are jet-like, while the onshore return flow is wider and weaker, as is observed in nature.
Abstract:
Background: Research in epistasis or gene-gene interaction detection for human complex traits has grown over the last few years. It has been marked by promising methodological developments, improved translation efforts of statistical epistasis to biological epistasis, and attempts to integrate different omics information sources into the epistasis screening to enhance power. The quest for gene-gene interactions poses severe multiple-testing problems. In this context, the maxT algorithm is one technique to control the false-positive rate. However, the memory needed by this algorithm rises linearly with the number of hypothesis tests. Gene-gene interaction studies require memory proportional to the squared number of SNPs, so a genome-wide epistasis search would require terabytes of memory. Hence, cache problems are likely to occur, increasing the computation time. In this work we present a new version of maxT, requiring an amount of memory independent of the number of genetic effects to be investigated. This algorithm was implemented in C++ in our epistasis screening software MBMDR-3.0.3. We evaluate the new implementation in terms of memory efficiency and speed using simulated data. The software is illustrated on real-life data for Crohn’s disease. Results: In the case of a binary (affected/unaffected) trait, the parallel workflow of MBMDR-3.0.3 analyzes all gene-gene interactions in a dataset of 100,000 SNPs typed on 1000 individuals within 4 days and 9 hours, using 999 permutations of the trait to assess statistical significance, on a cluster composed of 10 blades, each containing four Quad-Core AMD Opteron(tm) Processor 2352 2.1 GHz. In the case of a continuous trait, a similar run takes 9 days. Our program found 14 SNP-SNP interactions with a multiple-testing corrected p-value of less than 0.05 on real-life Crohn’s disease (CD) data.
Conclusions: Our software is the first implementation of the MB-MDR methodology able to solve large-scale SNP-SNP interaction problems within a few days, without using much memory, while adequately controlling the type I error rates. A new implementation to reach genome-wide epistasis screening is under construction. In the context of Crohn’s disease, MBMDR-3.0.3 could identify epistasis involving regions that are well known in the field and could be explained from a biological point of view. This demonstrates the power of our software to find relevant phenotype-genotype higher-order associations.
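The maxT adjustment discussed above can be illustrated in its basic form: for each permutation of the trait, record the maximum statistic over all tests, then adjust each observed statistic against that null distribution of maxima. This is a sketch of the standard single-step maxT idea, not the constant-memory variant the paper contributes; inputs are illustrative.

```python
def maxT_adjusted_pvalues(observed, perm_stats):
    """Single-step maxT multiple-testing adjustment.
    observed: list of test statistics, one per hypothesis.
    perm_stats: list of permutation results, each a list of
    statistics for all hypotheses under that permutation.
    Returns one adjusted p-value per hypothesis: the fraction of
    permutations whose maximum statistic reaches the observed one."""
    maxima = [max(p) for p in perm_stats]   # one max per permutation
    B = len(perm_stats)
    return [sum(m >= t for m in maxima) / B for t in observed]
```

Storing only the per-permutation maxima is what makes the memory footprint of this step small relative to the number of hypotheses; the paper's contribution is an implementation whose memory use is independent of the number of genetic effects tested.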
Abstract:
This paper studies non-autonomous Lyness-type recurrences of the form $x_{n+2} = (a_n + x_{n+1})/x_n$, where $\{a_n\}$ is a $k$-periodic sequence of positive numbers with primitive period $k$. We show that for the cases $k \in \{1, 2, 3, 6\}$ the behavior of the sequence $\{x_n\}$ is simple (integrable), while for the remaining cases the behavior can be much more complicated (chaotic). We also show that the cases where $k$ is a multiple of 5 present some different features.
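The integrable $k = 1$ case can be checked numerically: for the autonomous Lyness recurrence with $a = 1$, every orbit with positive initial data is 5-periodic (a classical fact). A minimal sketch; the initial values are illustrative.

```python
def lyness_orbit(a, x1, x2, steps):
    """Iterate the Lyness recurrence x_{n+2} = (a + x_{n+1}) / x_n,
    the autonomous k = 1 case of the recurrences studied above."""
    orbit = [x1, x2]
    for _ in range(steps):
        orbit.append((a + orbit[-1]) / orbit[-2])
    return orbit

# With a = 1 the orbit repeats every 5 steps: 2, 3, 2, 1, 1, 2, 3, ...
orb = lyness_orbit(1.0, 2.0, 3.0, 10)
```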
Abstract:
The factor structure of a back-translated Spanish version (Lega, Caballo and Ellis, 2002) of the Attitudes and Beliefs Inventory (ABI) (Burgess, 1990) is analyzed in a sample of 250 university students. The Spanish version of the ABI is a 48-item self-report inventory using a 5-point Likert scale that assesses rational and irrational attitudes and beliefs. 24 items cover two dimensions of irrationality: a) areas of content (3 subscales), and b) styles of thinking (4 subscales). An Exploratory Factor Analysis (Parallel Analysis with the Unweighted Least Squares method and Promin rotation) was performed with the FACTOR 9.20 software (Lorenzo-Seva and Ferrando, 2013). The results reproduced the main four styles of irrational thinking in relation to the three specific contents of irrational beliefs. However, two factors showed a complex configuration with important cross-loadings of different items in content and style. More analyses are needed to review the specific content and style of such items.
Abstract:
Background: Ethical conflicts are arising as a result of the growing complexity of clinical care, coupled with technological advances. Most studies that have developed instruments for measuring ethical conflict base their measures on the variables "frequency" and "degree of conflict". In our view, however, these variables are insufficient for explaining the root of ethical conflicts. Consequently, the present study formulates a conceptual model that also includes the variable "exposure to conflict", as well as considering six "types of ethical conflict". An instrument was then designed to measure the ethical conflicts experienced by nurses who work with critical care patients. The paper describes the development process and validation of this instrument, the Ethical Conflict in Nursing Questionnaire Critical Care Version (ECNQ-CCV). Methods: The sample comprised 205 nursing professionals from the critical care units of two hospitals in Barcelona (Spain). The ECNQ-CCV presents 19 nursing scenarios with the potential to produce ethical conflict in the critical care setting. Exposure to ethical conflict was assessed by means of the Index of Exposure to Ethical Conflict (IEEC), a specific index developed to provide a reference value for each respondent by combining the intensity and frequency of occurrence of each scenario featured in the ECNQ-CCV. Following content validity, construct validity was assessed by means of Exploratory Factor Analysis (EFA), while Cronbach's alpha was used to evaluate the instrument's reliability. All analyses were performed using the statistical software PASW v19. Results: Cronbach's alpha for the ECNQ-CCV as a whole was 0.882, which is higher than the values reported for certain other related instruments. The EFA suggested a unidimensional structure, with one component accounting for 33.41% of the explained variance. Conclusions: The ECNQ-CCV is shown to be a valid and reliable instrument for use in critical care units.
Its structure is such that the four variables on which our model of ethical conflict is based may be studied separately or in combination. The critical care nurses in this sample present moderate levels of exposure to ethical conflict. This study represents the first evaluation of the ECNQ-CCV.