982 results for Standard Work
Abstract:
The execution of a project requires resources that are generally scarce. Classical approaches to resource allocation assume that the usage of these resources by an individual project activity is constant during the execution of that activity; in practice, however, the project manager may vary resource usage over time within prescribed bounds. This variation gives rise to the project scheduling problem which consists in allocating the scarce resources to the project activities over time such that the project duration is minimized, the total number of resource units allocated equals the prescribed work content of each activity, and various work-content-related constraints are met. We formulate this problem for the first time as a mixed-integer linear program. Our computational results for a standard test set from the literature indicate that this model outperforms the state-of-the-art solution methods for this problem.
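As a rough illustration of the kind of formulation the abstract describes, the sketch below encodes a toy flexible-resource-profile scheduling problem as a mixed-integer linear program in Python with PuLP. The activity data, usage bounds, and horizon are invented for illustration, and work-content-related constraints such as non-preemption and minimum block lengths are omitted; this is not the paper's actual model.

```python
# A toy flexible-resource-profile scheduling MILP (a sketch, not the paper's model).
# Requires: pip install pulp
import pulp

T = 8                       # planning horizon in periods -- illustrative
R_CAP = 10                  # resource units available per period -- illustrative
acts = {                    # activity: (work content, min usage, max usage)
    "A": (12, 2, 6),
    "B": (8, 1, 4),
}

m = pulp.LpProblem("flexible_profile", pulp.LpMinimize)

# y[a][t] = 1 if activity a is in execution in period t
y = pulp.LpVariable.dicts("y", (acts, range(T)), cat="Binary")
# x[a][t] = resource units allocated to activity a in period t
x = pulp.LpVariable.dicts("x", (acts, range(T)), lowBound=0)
# makespan: index of the last period in which any activity is active
Cmax = pulp.LpVariable("Cmax", lowBound=0)

m += Cmax  # minimize project duration

for a, (w, lo, hi) in acts.items():
    # total allocation must equal the prescribed work content
    m += pulp.lpSum(x[a][t] for t in range(T)) == w
    for t in range(T):
        # usage stays within prescribed bounds while the activity runs
        m += x[a][t] >= lo * y[a][t]
        m += x[a][t] <= hi * y[a][t]
        # any active period pushes the makespan out
        m += Cmax >= (t + 1) * y[a][t]

for t in range(T):
    # scarce renewable resource: per-period capacity
    m += pulp.lpSum(x[a][t] for a in acts) <= R_CAP

m.solve(pulp.PULP_CBC_CMD(msg=False))
print("makespan:", pulp.value(Cmax))
```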
Abstract:
Calf losses (CL, mortality and unwanted early slaughter) in veal production are of great economic importance and an indicator of welfare. The objective of the present study was to evaluate CL and the causes of death on farms with a specific animal welfare standard (SAW) that exceeds the Swiss statutory regulations. Risk factors for CL were identified based on information about management, housing, feeding, and medication. In total, 74 production cohorts (2783 calves) from 15 farms were investigated. CL was 3.6%; the main causes of death were digestive disorders (52%), followed by respiratory diseases (28%). Factors significantly associated with an increased risk of CL were a higher number of individual daily doses of antibiotics (DDA), insufficient wind deflection in winter, and male sex. For administration of antibiotics to all calves of a cohort, a DDA of 14-21 days was associated with a decreased risk of CL compared to a DDA of 7-13 days.
Abstract:
Molybdenum isotopes are increasingly applied in the Earth sciences. They are primarily used to investigate the oxygenation of Earth's ocean and atmosphere, but more and more fields of application are being developed, such as magmatic and hydrothermal processes, planetary sciences, and the tracking of environmental pollution. Here, we present a proposal for a unifying presentation of Mo isotope ratios in studies of mass-dependent isotope fractionation. We suggest that the δ98/95Mo of NIST SRM 3134 be defined as +0.25‰. The rationale is that the vast majority of published data are presented relative to reference materials that are similar, but not identical, and that are all slightly lighter than NIST SRM 3134. Our proposed data presentation allows a direct first-order comparison of almost all old data with future work while referring to an international measurement standard. In particular, canonical δ98/95Mo values such as +2.3‰ for seawater and −0.7‰ for marine Fe–Mn precipitates can be kept for discussion. As recent publications show that the ocean molybdenum isotope signature is homogeneous, the IAPSO ocean water standard or any other open ocean water sample is suggested as a secondary measurement standard, with a defined δ98/95Mo value of +2.34 ± 0.10‰ (2s).
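Re-expressing published δ98/95Mo values on the proposed NIST SRM 3134 scale follows the standard exact conversion rule for delta notation. A minimal sketch; the example offset of +0.25‰ assumes a lab reference material sitting exactly at the old scale zero, which is an illustration rather than a value from the paper:

```python
# Re-expressing a delta value against a new reference material (a sketch).
# Exact chain rule for delta notation (values in per mil):
#   d(sample/B) = d(sample/A) + d(A/B) + d(sample/A) * d(A/B) / 1000
def renormalize(delta_sample_vs_old, delta_old_vs_new):
    """Convert d(sample/old standard) to d(sample/new standard), in per mil."""
    return (delta_sample_vs_old + delta_old_vs_new
            + delta_sample_vs_old * delta_old_vs_new / 1000.0)

# Illustrative: if an old reference material were 0.25 per mil lighter than
# NIST SRM 3134 (i.e. the SRM is defined as +0.25), old values shift by ~+0.25:
print(renormalize(2.09, 0.25))  # ~2.34, the proposed seawater value
```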
Abstract:
Facebook is a medium of social interaction producing its own style. I study how users from Malaga create this style through phonic features of the local variety and how they reflect on the use of these features. I then analyse the use of non-standard features by users from Malaga and compare them to an oral corpus. Results demonstrate that social factors work differently in real and virtual speech. Facebook communication is seen as a style serving to create social meaning and to express linguistic identity.
Abstract:
Environmental data sets of pollutant concentrations in air, water, and soil frequently include unquantified sample values reported only as being below the analytical method detection limit. These values, referred to as censored values, should be considered in the estimation of distribution parameters, as each represents some value of pollutant concentration between zero and the detection limit. Most of the currently accepted methods for estimating the population parameters of environmental data sets containing censored values rely upon the assumption of an underlying normal (or transformed normal) distribution. This assumption can result in unacceptable levels of error in parameter estimation due to the unbounded left tail of the normal distribution. With the beta distribution, which is bounded on the same range as a distribution of scaled concentrations, 0 ≤ x ≤ 1, parameter estimation errors resulting from improper distribution bounds are avoided. This work developed a method that uses the beta distribution to estimate population parameters from censored environmental data sets and evaluated its performance in comparison to currently accepted methods that rely upon an underlying normal (or transformed normal) distribution. Data sets were generated assuming typical values encountered in environmental pollutant evaluation for the mean, standard deviation, and number of variates. For each set of model values, data sets were generated assuming that the data were distributed normally, lognormally, or according to a beta distribution. For varying levels of censoring, two established methods of parameter estimation, regression on normal ordered statistics and regression on lognormal ordered statistics, were used to estimate the known mean and standard deviation of each data set. The method developed for this study, employing a beta distribution assumption, was also used to estimate parameters, and the relative accuracy of all three methods was compared. For data sets of all three distribution types, and for censoring levels up to 50%, the performance of the new method equaled, if not exceeded, that of the two established methods. Because of its robustness in parameter estimation regardless of distribution type or censoring level, the method employing the beta distribution should be considered for full development in estimating parameters for censored environmental data sets.
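A minimal sketch of the core idea: maximum-likelihood fitting of a beta distribution when values below the detection limit contribute only their probability mass P(X < DL) to the likelihood. The synthetic data, detection limit, and starting values are illustrative, and this is not the study's actual estimator.

```python
# Sketch: ML fit of a beta distribution to left-censored data (values below a
# detection limit reported only as "< DL"). Data and names are illustrative.
import numpy as np
from scipy import stats, optimize

rng = np.random.default_rng(0)
x = rng.beta(2.0, 8.0, size=200)       # "true" concentrations scaled to [0, 1]
DL = 0.08                               # detection limit on the same scale
observed = x[x >= DL]
n_censored = np.sum(x < DL)             # these values are known only as < DL

def neg_loglik(params):
    a, b = params
    if a <= 0 or b <= 0:
        return np.inf
    # each censored point contributes P(X < DL) = CDF(DL) to the likelihood
    return -(stats.beta.logpdf(observed, a, b).sum()
             + n_censored * stats.beta.logcdf(DL, a, b))

res = optimize.minimize(neg_loglik, x0=[1.0, 1.0], method="Nelder-Mead")
a_hat, b_hat = res.x
mean_hat = a_hat / (a_hat + b_hat)
sd_hat = np.sqrt(a_hat * b_hat / ((a_hat + b_hat) ** 2 * (a_hat + b_hat + 1)))
print(f"estimated mean={mean_hat:.3f}, sd={sd_hat:.3f}")
```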
Abstract:
Over the last few years Facebook has become a widespread and continuously expanding medium of communication. Being a new medium of social interaction, Facebook produces its own communication style. My focus of analysis is how Facebook users from the city of Malaga create this style by means of phonic features typical of the Andalusian variety and how the users reflect on the use of these phonic features. This project is based on a theoretical framework that combines variationist sociolinguistics with computer-mediated communication (CMC) research to study the emergence of a style peculiar to online social networks. In a corpus of Facebook users from three zones of Malaga, I have analysed the use of non-standard phonic features and then compared them with the same features in a reference corpus collected on three beaches of Malaga. From this comparison it can be deduced that the analysed social and linguistic factors work differently in real and virtual speech. Given these different uses, we can consider the peculiar electronic communication of Facebook a style constrained by the electronic medium: a style that serves the users to create social meaning and to express their linguistic identities.
Abstract:
BACKGROUND Infiltration procedures are a common treatment of lumbar radiculopathy. There is a wide variety of infiltration techniques without an established gold standard. Therefore, we compared the effectiveness of CT-guided transforaminal infiltrations versus anatomical landmark-guided transforaminal infiltrations at the lower lumbar spine in cases of acute sciatica at L3-L5. METHODS A retrospective chart review was conducted of 107 outpatients treated between 2009 and 2011. All patients were diagnosed with lumbar radiculopathic pain secondary to disc herniation at L3-L5. A total of 52 patients received CT-guided transforaminal infiltrations; 55 patients received non-imaging-guided nerve root infiltrations. Therapeutic success was evaluated in terms of the number of physician contacts, duration of treatment, type of analgesics used, and work days lost. The defined endpoint was surgery at the lower lumbar spine. RESULTS In the CT group, patients needed significantly fewer oral analgesics (p < 0.001). Overall treatment duration and physician contacts were also significantly lower in the CT group (p < 0.001 and p = 0.002, respectively). In the CT group, patients lost significantly fewer work days due to incapacity (p < 0.001). Surgery had to be performed in 18.2% of the non-imaging group patients (CT group: 1.9%; p = 0.008). CONCLUSION This study shows that CT-guided periradicular infiltration for lumbosciatica caused by intervertebral disc herniation is significantly superior to non-imaging, anatomical landmark-guided infiltration with regard to the parameters investigated. The high number of treatment failures in the non-imaging group underlines the inferiority of this treatment concept.
Abstract:
We propose a way to incorporate non-tariff barriers (NTBs) for the four workhorse models of the modern trade literature into computable general equilibrium (CGE) models. CGE models feature intermediate linkages and thus allow us to study global value chains (GVCs). We show that the Ethier-Krugman monopolistic competition model, the Melitz firm heterogeneity model, and the Eaton and Kortum model can be defined as an Armington model with generalized marginal costs, generalized trade costs, and a demand externality. As is already known in the literature, in both the Ethier-Krugman model and the Melitz model generalized marginal costs are a function of the amount of factor input bundles. In the Melitz model, generalized marginal costs are also a function of the price of the factor input bundles: lower factor prices raise the number of firms that can enter the market profitably (the extensive margin), reducing the generalized marginal costs of a representative firm. For the same reason, the Melitz model features a demand externality: in a larger market more firms can enter. We implement the different models in a CGE setting with multiple sectors, intermediate linkages, non-homothetic preferences, and detailed data on trade costs. We find the largest welfare effects from trade cost reductions in the Melitz model. We also employ the Melitz model to mimic changes in NTBs with a fixed-cost character by analysing the effect of changes in fixed trade costs. While we work here with a model calibrated to the GTAP database, the methods developed can also be applied to CGE models based on the WIOD database.
Abstract:
India's Muslim community, which accounts for 14.4 percent of India's vast population and is thus the largest of all religious minorities, has been the subject of considerable development discourse, as Muslims have the lowest level of educational attainment and standard of living among socio-religious groups in the country. This study addresses the meaning of education and career opportunities for Muslim youths in relation to their educational credentials and social position in the hierarchy of Muslim class and caste groups, with particular reference to a community in Uttar Pradesh. The author contends that the career opportunities, possibilities, and strategies of Muslim youths in Indian society depend on multiple factors: social hierarchy, opportunities to utilize economic resources, social networks, cultural capital, and the wider structural disparities within which Muslims are situated and which lead them to question the value of higher education in gaining admission to socially recognized and established employment sectors.
Abstract:
The high integration density of current nanometer technologies allows the implementation of complex floating-point applications in a single FPGA. In this work, the intrinsic complexity of floating-point operators is addressed, targeting configurable devices and making design decisions that provide the most suitable trade-offs between performance and standard compliance. A set of floating-point libraries composed of adder/subtracter, multiplier, divider, square root, exponential, logarithm, and power function operators is presented. Each library has been designed taking into account special characteristics of current FPGAs; for this purpose, we have adapted the IEEE floating-point standard (software-oriented) to a custom FPGA-oriented format. Extended experimental results validate the design decisions made and prove the usefulness of reducing the format complexity.
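As a rough sketch of what an FPGA-oriented simplification of the IEEE format can look like, the following encodes and decodes a parameterizable (sign, exponent, mantissa) word. The widths, the flush-to-zero handling of denormals, and saturation in place of infinities are assumptions for illustration, not the paper's actual format.

```python
# Sketch of a parameterizable custom floating-point format (illustrative only).
import math

def encode(value, exp_bits=6, man_bits=10):
    """Pack a Python float into a (sign | exponent | mantissa) word,
    round-to-nearest, with assumed FPGA-oriented simplifications."""
    if value == 0.0:
        return 0
    sign = 1 if value < 0.0 else 0
    m, e = math.frexp(abs(value))             # abs(value) = m * 2**e, 0.5 <= m < 1
    frac = round((2.0 * m - 1.0) * (1 << man_bits))  # drop the implicit leading 1
    if frac == (1 << man_bits):               # rounding overflowed the mantissa
        frac, e = 0, e + 1
    bias = (1 << (exp_bits - 1)) - 1
    exp = e - 1 + bias
    if exp <= 0:
        return 0                               # flush-to-zero instead of denormals
    exp = min(exp, (1 << exp_bits) - 1)        # saturate instead of IEEE infinities
    return (sign << (exp_bits + man_bits)) | (exp << man_bits) | frac

def decode(word, exp_bits=6, man_bits=10):
    """Unpack a custom-format word back to a Python float."""
    frac = word & ((1 << man_bits) - 1)
    exp = (word >> man_bits) & ((1 << exp_bits) - 1)
    sign = -1.0 if word >> (exp_bits + man_bits) else 1.0
    if exp == 0 and frac == 0:
        return 0.0
    bias = (1 << (exp_bits - 1)) - 1
    return sign * (1.0 + frac / (1 << man_bits)) * 2.0 ** (exp - bias)

print(decode(encode(3.14159)))  # ~3.1406, within half a mantissa ULP
```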
Abstract:
Knowledge of the uncertainty of measurement of testing results is important when results have to be compared with limits and specifications. In the measurement of sound insulation following standard UNE-EN ISO 140-4, the uncertainty of the final magnitude is mainly associated with the measured average sound pressure levels L1 and L2. A parameter that allows us to quantify the spatial variation of the sound pressure level is the standard deviation of the pressure levels measured at different points of the room. In this work, for a large number of measurements following UNE-EN ISO 140-4, we qualitatively analysed the behaviour of the standard deviations of L1 and L2. The study of sound fields in enclosed spaces is very difficult: there is a wide variety of rooms with different sound fields depending on factors such as volume, geometry, and materials. In general, we observe that the L1 and L2 standard deviations contain peaks and dips at single frequencies, independent of the characteristics of the rooms, that could correspond to critical frequencies of walls, floors, and windows, or even to temporal alterations of the sound field. Also, in most measurements according to UNE-EN ISO 140-4, a strong similarity between the L1 and L2 standard deviations is found. We believe that this result points to a coupled system between the source and receiving rooms; at low frequencies in particular, the shape of the L1 and L2 standard deviations is comparable to the standard deviation of the velocity level on a wall.
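A minimal sketch of the two quantities discussed above: the energetic (power) average used for L1 and L2, and the spatial standard deviation across microphone positions. The level readings are invented for illustration.

```python
# Spatial average and spatial standard deviation of sound pressure levels
# measured at several microphone positions in one room (illustrative data).
import numpy as np

levels_dB = np.array([78.2, 77.5, 79.1, 76.8, 78.9])  # level at each position

# Energetic average of the kind used for L1/L2 in ISO 140-4:
L_avg = 10 * np.log10(np.mean(10 ** (levels_dB / 10)))

# Spatial standard deviation of the level readings, the indicator discussed above:
s_spatial = np.std(levels_dB, ddof=1)

print(f"L_avg = {L_avg:.1f} dB, spatial std dev = {s_spatial:.2f} dB")
```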
Abstract:
The development of a global instability analysis code coupling a time-stepping approach, as applied to the solution of BiGlobal and TriGlobal instability analyses [1, 2], with finite-volume-based spatial discretization, as used in standard aerodynamics codes, is presented. The key advantage of the time-stepping method over matrix-formulation approaches is that the former provides a solution to the computer-storage issues associated with the latter methodology. To date, both approaches have been used successfully to analyze instability in complex geometries, although their relative advantages have never been quantified. The ultimate goal of the present work is to address this issue in the context of spatial discretization schemes typically used in industry. The time-stepping approach of Chiba [3] has been implemented in conjunction with two direct numerical simulation algorithms, one based on the high-order methods typically used in this context and another based on low-order methods representative of those in common industrial use. The two codes have been validated against solutions of the BiGlobal eigenvalue problem (EVP), and it has been shown that small errors in the base flow do not significantly affect the results. As a result, a three-dimensional compressible unsteady second-order code for global linear stability analysis has been successfully developed, based on finite-volume spatial discretization and the time-stepping method, with the ability to study complex geometries by means of unstructured and hybrid meshes.
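A minimal sketch of the time-stepping idea, assuming the usual matrix-free recipe: one linearized simulation over a sampling interval plays the role of the propagator exp(A·dt), and Arnoldi iteration on that black box recovers the leading eigenvalues without ever storing the Jacobian A. A small toy matrix stands in for the flow solver here.

```python
# Matrix-free (time-stepping) eigenvalue analysis: wrap a time integration as a
# black-box operator and hand it to Arnoldi. `dns_advance` is a hypothetical
# stand-in for a linearized DNS; a toy stable matrix plays the Jacobian's role.
import numpy as np
from scipy.linalg import expm
from scipy.sparse.linalg import LinearOperator, eigs

n, dt = 400, 0.1
rng = np.random.default_rng(1)
A = rng.standard_normal((n, n)) / np.sqrt(n) - 1.5 * np.eye(n)  # toy Jacobian
Phi = expm(A * dt)  # toy propagator; a real code never forms this matrix

def dns_advance(q):
    """Advance a perturbation q over one sampling interval dt.
    In practice this is one linearized DNS run, not a matrix product."""
    return Phi @ q

op = LinearOperator((n, n), matvec=dns_advance)
mu, V = eigs(op, k=5, which="LM")  # Ritz values of the propagator exp(A*dt)
sigma = np.log(mu) / dt            # leading eigenvalues of A (imag part modulo 2*pi/dt)
print(np.sort_complex(sigma)[::-1][:3])
```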
Abstract:
Workflows are increasingly used to manage and share scientific computations and methods. Workflow tools can be used to design, validate, execute, and visualize scientific workflows and their execution results; other tools manage workflow libraries or mine their contents. There has been much recent work on workflow system integration as well as on common workflow interlinguas, but interoperability among workflow systems remains a challenge. Ideally, these tools would form a workflow ecosystem in which it is possible to create a workflow with one tool, execute it with another, visualize it with a third, and use yet another to mine a repository of such workflows or their executions. In this paper, we describe our approach to creating a workflow ecosystem through the use of standard models for provenance (OPM and W3C PROV) and extensions (P-PLAN and OPMW) to represent workflows. The ecosystem integrates different workflow tools with diverse functions (workflow generation, execution, browsing, mining, and visualization) created by a variety of research groups. This is, to our knowledge, the first time that such a variety of workflow systems and functions has been integrated.
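A minimal sketch of how a workflow step and its execution trace can be linked using W3C PROV with the P-PLAN extension, here with rdflib. The example workflow URIs are made up for illustration; only the PROV and P-PLAN vocabularies named in the abstract are assumed.

```python
# Describing one workflow step (plan level) and its execution (trace level)
# with W3C PROV plus P-PLAN, using rdflib. Example URIs are hypothetical.
from rdflib import Graph, Namespace, RDF

PROV = Namespace("http://www.w3.org/ns/prov#")
PPLAN = Namespace("http://purl.org/net/p-plan#")
EX = Namespace("http://example.org/wf/")

g = Graph()
g.bind("prov", PROV)
g.bind("p-plan", PPLAN)

# The abstract plan (template) level:
g.add((EX.SortStep, RDF.type, PPLAN.Step))
g.add((EX.SortStep, PPLAN.isStepOfPlan, EX.SortWorkflow))

# The execution (trace) level, linked back to the plan:
g.add((EX.run1, RDF.type, PROV.Activity))
g.add((EX.run1, PPLAN.correspondsToStep, EX.SortStep))
g.add((EX.output1, RDF.type, PROV.Entity))
g.add((EX.output1, PROV.wasGeneratedBy, EX.run1))

print(g.serialize(format="turtle"))
```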
Abstract:
Introduction: Standard precautions (SP) are a set of measures intended to minimize the risk of occupational transmission of pathogens, and their use by health professionals, above all nurses, is indispensable. Non-adherence to SP, however, is a problem widely discussed throughout the world. Although several Brazilian studies aim to assess adherence to SP, great weakness has still been observed in the construction and validation of the instruments used to assess this construct. Objective: To carry out the cultural adaptation and validation of the Compliance with Standard Precautions Scale (CSPS) for Brazilian nurses. Methods: This is a methodological study for the adaptation and validation of the CSPS. The scale consists of 20 items with four response options and is intended to assess adherence to SP. The adaptation process consisted of translation, consensus among judges, back-translation, and semantic validation. The first stage was the translation from the original language into Brazilian Portuguese. A committee of seven judges was then convened, and the consensus version obtained in the previous stage was translated back into the source language. The psychometric properties of the instrument were evaluated, considering face and content validity, construct validity, and reliability. The Brazilian Portuguese version of the CSPS (CSPS-PB) was administered to a sample of 300 nurses providing direct patient care in a hospital in the city of São Paulo/SP. Reliability was assessed through internal consistency (Cronbach's alpha) and test-retest (intraclass correlation coefficient, ICC). For construct validity, comparison between different groups, exploratory factor analysis, and confirmatory factor analysis according to structural equation modeling (SEM) were used. The IBM SPSS 19.0 software was used; for the confirmatory factor analysis, the Analysis of Moment Structures module (IBM SPSS AMOS) was employed, and for the parallel analysis, the RanEigen Syntax program. The significance level adopted was α = 0.05. All ethical requirements were met. Results: Translation by sworn translators ensured the quality of this process. Face and content validation allowed pertinent and necessary modifications to meet the criteria of conceptual, idiomatic, cultural, and semantic equivalence. Internal consistency was α = 0.61, indicating satisfactory reliability. The ICC indicated an almost perfect correlation of 0.87 for the test-retest two weeks after the first administration, showing satisfactory stability. Construct validity showed that the CSPS-PB was able to discriminate mean adherence to SP between distinct groups with respect to age (F = 5.15, p ≤ 0.01), length of clinical experience (F = 8.9, p ≤ 0.000), and having received training (t = 2.48, p ≤ 0.01). In the confirmatory factor analysis, the model was under-identified. The exploratory factor analysis indicated that all items had adequate factor loadings (≥ 0.30), and four factors were identified by parallel analysis. The total explained variance was 35.48%. Conclusion: The CSPS-PB is an adequate, reliable, and valid instrument to measure adherence to SP among Brazilian nurses.
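A minimal sketch of the internal-consistency statistic reported above (Cronbach's alpha), computed from an items-by-respondents matrix. The random responses merely stand in for the CSPS-PB data; the dimensions match the study (300 nurses, 20 items, four response options).

```python
# Cronbach's alpha from a respondents-by-items matrix (synthetic stand-in data).
import numpy as np

rng = np.random.default_rng(42)
X = rng.integers(1, 5, size=(300, 20)).astype(float)  # 300 nurses x 20 items, coded 1-4

def cronbach_alpha(items):
    """alpha = k/(k-1) * (1 - sum of item variances / variance of total score)."""
    k = items.shape[1]
    item_var = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_var / total_var)

print(f"alpha = {cronbach_alpha(X):.2f}")
```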
Abstract:
The objective of this work is to reuse the EPECDEP dependency treebank (BDT) to build the gold standard for the surface syntax of Basque. The basic step is a comparative study of the two formalisms applied to the same corpus: the Constraint Grammar (CG) formalism and Dependency Grammar (DG). As a result of this study, we have established the linguistic criteria needed to derive syntactic functions in CG style. These criteria have been implemented and evaluated; in 75% of the cases, the syntactic functions are derived automatically to build the gold standard.
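A minimal sketch of the kind of rule that derives CG-style function tags from dependency relations; the relation names and tags are illustrative, not the project's actual BDT or CG inventories.

```python
# Deriving Constraint Grammar function tags from dependency relations (a sketch;
# relation and tag names are illustrative, not the BDT/CG inventories).
DEP_TO_CG = {
    "ncsubj": "@SUBJ",   # non-clausal subject -> CG subject tag
    "ncobj":  "@OBJ",    # non-clausal object
    "ncmod":  "@ADVL",   # non-clausal modifier -> adverbial
}

def cg_function(dep_relation):
    """Return the CG tag for a dependency relation, or None when the case
    (like the remaining ~25% in the study) needs a manually applied criterion."""
    return DEP_TO_CG.get(dep_relation)

print(cg_function("ncsubj"))  # @SUBJ
```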