Abstract:
The 15-minute family interview is a condensed form of the Calgary Family Assessment and Intervention Models (CFAM and CFIM) that aims to help establish a therapeutic relationship between nurses and families and to implement interventions that promote health and relieve suffering, even during brief interactions. This study investigated the experience of nurses from the Family Health Strategy (FHS) who used the 15-minute interview during postpartum home visits. The qualitative research was conducted in three stages: a training program for participants, use of the 15-minute family interview by the participants, and interviews with the nurses. Data were collected through semi-structured interviews with eight nurses. Thematic analysis revealed two main themes: dealing with the challenge of a new practice and evaluating the assignment. This work shows that the tool can be used to deepen relationships between nurses and families in the Family Health Strategy.
Abstract:
[Table of contents] 1. Introduction to the control banding method: Nanomaterials and occupational risk assessment; Alternative method known as control banding; Scope and limits of control banding. - 2. Control banding process applied to manufactured nanomaterials: General points; Operating principle. - 3. Implementation of control banding: Gathering of information; Hazard bands; Exposure bands; Allocation of risk control bands. - 4. Bibliography: Publications; Books, reports, opinions, bulletins; Standards and references; Legislation and regulations; Websites. - Annexes
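To make the operating principle concrete, here is a minimal Python sketch of a control-banding lookup: a hazard band and an exposure band are combined through a matrix into a risk control level. The band names (HB1-HB5, EB1-EB4, CL1-CL5) and the matrix values are illustrative assumptions, not the guide's actual tables.

```python
# Illustrative control-banding lookup: hazard band x exposure band -> control
# level. Band names and matrix values are placeholders, not the guide's tables.
HAZARD_BANDS = ["HB1", "HB2", "HB3", "HB4", "HB5"]
EXPOSURE_BANDS = ["EB1", "EB2", "EB3", "EB4"]

CONTROL_MATRIX = [  # rows: hazard band, columns: exposure band
    ["CL1", "CL1", "CL2", "CL3"],
    ["CL1", "CL2", "CL3", "CL4"],
    ["CL2", "CL3", "CL4", "CL5"],
    ["CL3", "CL4", "CL5", "CL5"],
    ["CL4", "CL5", "CL5", "CL5"],
]

def control_band(hazard: str, exposure: str) -> str:
    """Return the risk control level for a hazard/exposure pair."""
    return CONTROL_MATRIX[HAZARD_BANDS.index(hazard)][EXPOSURE_BANDS.index(exposure)]

print(control_band("HB3", "EB2"))  # -> CL3
```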
Abstract:
This article presents the results and conclusions of research carried out on software tools for drawing graphs of finite-state automata. The main result of this research is the development of a new tool that draws the graph fully automatically, starting from a transition table describing the automaton in question.
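As an illustration of the approach the article describes, the following Python sketch derives a drawable graph description automatically from a transition table by emitting Graphviz DOT. It is a toy stand-in, not the tool developed in the article.

```python
# Generate a drawable graph description (Graphviz DOT) from a transition table.
def automaton_to_dot(transitions, start, accepting):
    """transitions: dict mapping (state, symbol) -> next_state."""
    lines = ["digraph automaton {", "  rankdir=LR;",
             f'  "" [shape=none]; "" -> "{start}";']
    states = {s for s, _ in transitions} | set(transitions.values())
    for state in sorted(states):
        shape = "doublecircle" if state in accepting else "circle"
        lines.append(f'  "{state}" [shape={shape}];')
    for (src, sym), dst in sorted(transitions.items()):
        lines.append(f'  "{src}" -> "{dst}" [label="{sym}"];')
    lines.append("}")
    return "\n".join(lines)

# Example: automaton accepting strings with an even number of 1s over {0, 1}.
table = {("q0", "0"): "q0", ("q0", "1"): "q1",
         ("q1", "0"): "q1", ("q1", "1"): "q0"}
print(automaton_to_dot(table, start="q0", accepting={"q0"}))
```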
Abstract:
OBJECTIVE: A new tool to quantify visceral adipose tissue (VAT) over the android region of a total body dual-energy x-ray absorptiometry (DXA) scan has recently been reported. The measurement, CoreScan, is currently available on Lunar iDXA densitometers. The purpose of the study was to determine the precision of the CoreScan VAT measurement, which is critical for understanding the utility of this measure in longitudinal trials. DESIGN AND METHODS: VAT precision was characterized in both an anthropomorphic imaging phantom (measured on 10 Lunar iDXA systems) and a clinical population of obese women (n = 32). RESULTS: The intrascanner precision for the VAT phantom across 9 quantities of VAT mass (0-1,800 g) ranged from 28.4 to 38.0 g. The interscanner precision ranged from 24.7 to 38.4 g. There was no statistical dependence on the quantity of VAT for either the inter- or intrascanner precision result (p = 0.670). Combining inter- and intrascanner precision yielded a total phantom precision estimate of 47.6 g for VAT mass, which corresponds to a 4.8% coefficient of variation (CV) for a 1 kg VAT mass. Our clinical population, who completed replicate total body scans with repositioning between scans, showed a precision of 56.8 g on an average VAT mass of 1110.4 g, corresponding to a 5.1% CV. Hence, the in vivo precision result was similar to the phantom precision result. CONCLUSIONS: The study suggests that CoreScan has a relatively low precision error in both phantoms and obese women and may therefore be a useful addition to clinical trials where interventions target changes in visceral adiposity.
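For reference, precision expressed as a percent CV is simply the precision error divided by the mean mass. A minimal Python sketch reproducing the figures quoted above; the rms_sd helper assumes duplicate scans per subject, which matches the replicate-scan design but is otherwise our assumption:

```python
import math

def rms_sd(pairs):
    """Root-mean-square SD across subjects scanned twice:
    for a duplicate pair, SD = |x1 - x2| / sqrt(2)."""
    return math.sqrt(sum((a - b) ** 2 / 2.0 for a, b in pairs) / len(pairs))
    # e.g. rms_sd([(1000.0, 1080.0), (1200.0, 1150.0)]) ~= 47.2 g

def percent_cv(precision_g, mean_mass_g):
    """Precision error as a percent coefficient of variation."""
    return 100.0 * precision_g / mean_mass_g

print(round(percent_cv(56.8, 1110.4), 1))  # 5.1 -> in vivo CV
print(round(percent_cv(47.6, 1000.0), 1))  # 4.8 -> phantom CV at 1 kg VAT
```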
Abstract:
The planning tools you need to improve both your farming operation and Iowa’s streams and lakes are right at your fingertips. With the DNR’s interactive online mapping, you can access a wealth of information for free and without special software.
Abstract:
This work focuses on profitability analysis through the contribution margin and aims to show how managers can use the contribution margin to assess the profitability of customers, products, distribution channels, and the market segments in which companies operate. The study rests on a theoretical framework covering the main concepts related to the topic, namely cost-volume-profit analysis, income determination under the absorption and contribution approaches, and the ABC curve. In the practical application, we used several management tools, namely the breakeven point, income determination by customer segment, business line, and sales territory, and the application of the ABC curve. The case study focuses on the national milling company Moave S.A. and is restricted to the production and packaging plan for small-bag packing. The data were obtained directly from information provided by the company's management, in particular the products to be packed, the cost structure, the production plan, the maximum capacity of the machine, and the daily and monthly work schedules. The results showed that the contribution margin, used as a management tool, is an important instrument for analyzing and assessing the profitability of products, customers, distribution channels, and sales territories.
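As a minimal illustration of the breakeven analysis mentioned above, the following Python sketch computes the unit contribution margin and the breakeven volume. The figures are hypothetical, for illustration only, not Moave's data.

```python
def contribution_margin(price, variable_cost):
    """Per-unit contribution margin: what each unit contributes
    toward covering fixed costs and, beyond that, profit."""
    return price - variable_cost

def breakeven_units(fixed_costs, price, variable_cost):
    """Volume at which total contribution exactly covers fixed costs."""
    return fixed_costs / contribution_margin(price, variable_cost)

# Hypothetical figures (not Moave's data): fixed costs 120,000, unit
# price 25.0, unit variable cost 17.5 -> contribution margin 7.5/unit.
print(breakeven_units(fixed_costs=120_000, price=25.0, variable_cost=17.5))
# -> 16000.0 units: below this volume the segment operates at a loss
```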
Abstract:
The molecular diagnosis of retinal dystrophies (RD) is difficult because of genetic and clinical heterogeneity. Previously, genes were screened one by one, sometimes in a scheme based on the frequency of sequence variants and the number of exons/length of the candidate genes. Payment for these procedures was complicated, and the sequential billing of several genes created endless paperwork. We therefore evaluated the costs of generating and sequencing a hybridization-based DNA library enriched for the 64 most frequently mutated genes in RD, called IROme, and compared them to the costs of amplifying and sequencing these genes by the Sanger method. The production cost of high-throughput (HT) sequencing with IROme was established at CHF 2,875.75 per case; Sanger sequencing of the same exons cost CHF 69,399.02. Turnaround time of the analysis was 3 days for IROme. For Sanger sequencing, it could only be estimated, as we never sequenced all 64 genes in a single patient. The sale price for IROme, calculated from the per-exon sale price of Sanger sequencing, is CHF 8,445.88, which corresponds to the sale price of 40 exons. In conclusion, IROme is cheaper and faster than Sanger sequencing and therefore represents a sound approach to the diagnosis of RD, both scientifically and economically. As a further drop in the cost of HT sequencing is anticipated, targeted resequencing might become the new gold standard in the molecular diagnosis of RD.
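A quick back-of-the-envelope check of the quoted figures in Python; note that the per-exon Sanger price is implied by the 40-exon equivalence, not quoted directly in the abstract:

```python
# Cost figures as reported in the abstract (CHF).
sanger_total = 69_399.02      # all 64 genes by Sanger, estimated
irome_production = 2_875.75   # per case by targeted HT sequencing
irome_sale = 8_445.88         # priced as the equivalent of 40 Sanger exons

per_exon_sanger = irome_sale / 40   # implied, not quoted: ~CHF 211.15
print(f"implied Sanger sale price per exon: CHF {per_exon_sanger:.2f}")
print(f"production cost ratio (Sanger/IROme): {sanger_total / irome_production:.1f}x")
```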
Abstract:
The morphological and functional diversity of astrocytes, and their essential contributions in physiological and pathological conditions, are starting to emerge. However, experimental systems to investigate neuron-glia interactions and to develop innovative approaches for the treatment of central nervous system (CNS) disorders are still very limited. Fluorescent reporter genes have been used to visualize populations of astrocytes and to produce an atlas of gene expression in the brain. Knock-down or knock-out of astrocytic proteins by transgenesis has also been developed, but these techniques remain complex and time-consuming. Viral vectors have been developed to overexpress or silence genes of interest, as they can be used for both in vitro and in vivo studies in adult mammalian species. In most cases, high transduction efficiency and long-term transgene expression are observed in neurons, but expression in astrocytes is limited. Several strategies have been developed to shift the tropism of lentiviral vectors (LV) and allow local and controlled gene expression in glial cells. In this review, we describe how modifications of the interaction between the LV envelope glycoprotein and surface receptor molecules on target cells, or the integration of cell-specific promoters and miRNA post-transcriptional regulatory elements, have been used to selectively express transgenes in astrocytes.
Abstract:
The standard one-machine scheduling problem consists in scheduling a set of jobs on one machine that can handle only one job at a time, minimizing the maximum lateness. Each job becomes available for processing at its release date, requires a known processing time, and, after processing finishes, is delivered after a certain time. There can also be precedence constraints between pairs of jobs, requiring that the first job be completed before the second can start. An extension of this problem assigns a time interval between the processing of the jobs involved in a precedence constraint, known as finish-start time-lags. In the presence of these constraints, the problem is NP-hard even if preemption is allowed. In this work, we consider a special case of the one-machine preemptive scheduling problem with time-lags, where the time-lags have a chain form, and propose a polynomial algorithm to solve it. The algorithm consists of a polynomial number of calls to the preemptive version of the Longest Tail Heuristic. One application of the method is to obtain lower bounds for NP-hard one-machine and job-shop scheduling problems. We present some computational results of this application, followed by some conclusions.
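For concreteness, here is a minimal Python sketch of the preemptive longest-tail rule on which the proposed algorithm builds: at every instant, run an available job with the largest tail (delivery time), preempting whenever a newly released job has a longer tail. This is only the base rule; the chain time-lag extension studied in the paper is not implemented here.

```python
import heapq

def preemptive_longest_tail(jobs):
    """Preemptive longest-tail rule for 1 | r_j, pmtn | Lmax (no time-lags).
    jobs: list of (release r, processing p, tail q).
    Returns the maximum over jobs of completion time + tail."""
    jobs = sorted(jobs)                     # by release date
    ready = []                              # max-heap on q: entries (-q, remaining p)
    t, i, obj = 0, 0, 0
    while i < len(jobs) or ready:
        if not ready:
            t = max(t, jobs[i][0])          # idle until the next release
        while i < len(jobs) and jobs[i][0] <= t:
            heapq.heappush(ready, (-jobs[i][2], jobs[i][1]))
            i += 1
        neg_q, p = heapq.heappop(ready)     # available job with the longest tail
        run = p if i == len(jobs) else min(p, jobs[i][0] - t)
        t += run
        if run < p:                         # interrupted by the next release
            heapq.heappush(ready, (neg_q, p - run))
        else:
            obj = max(obj, t - neg_q)       # completion + tail (q stored negated)
    return obj

# Three jobs (release, processing, tail); job (1, 2, 9) forces the bound 12.
print(preemptive_longest_tail([(0, 4, 5), (1, 2, 9), (3, 3, 2)]))  # -> 12
```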
Abstract:
BACKGROUND AND PURPOSE: Mild cognitive impairment (MCI) was recently subdivided into sd-aMCI, sd-fMCI, and md-aMCI. The current investigation aimed to discriminate between MCI subtypes by using diffusion tensor imaging (DTI). MATERIALS AND METHODS: Sixty-six prospective participants were included: 18 with sd-aMCI, 13 with sd-fMCI, and 35 with md-aMCI. Statistics included group comparisons using tract-based spatial statistics (TBSS) and individual classification using support vector machines (SVMs). RESULTS: The group-level analysis revealed a decrease in fractional anisotropy (FA) in md-aMCI versus sd-aMCI in an extensive bilateral, right-dominant network, and a more pronounced reduction of FA in md-aMCI compared with sd-fMCI in the right inferior fronto-occipital fasciculus and inferior longitudinal fasciculus. The comparison between sd-fMCI and sd-aMCI, as well as the analysis of the other diffusion parameters, yielded no significant group differences. The individual-level SVM analysis discriminated between the MCI subtypes with accuracies around 97%. The major limitation is the relatively small number of cases of MCI. CONCLUSIONS: Our data show that, at the group level, the md-aMCI subgroup has the most pronounced damage in white matter integrity. Individually, SVM analysis of white matter FA provided highly accurate classification of MCI subtypes.
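A minimal sketch of the individual-level classification step, assuming a per-subject matrix of white-matter FA features; the synthetic data, linear kernel, and leave-one-out scheme are placeholder assumptions, not the study's actual pipeline:

```python
import numpy as np
from sklearn.model_selection import LeaveOneOut, cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Placeholder data: 66 subjects x 200 FA features, labeled by MCI subtype
# with the group sizes reported in the abstract (18 / 13 / 35).
rng = np.random.default_rng(0)
X = rng.normal(size=(66, 200))
y = np.repeat(["sd-aMCI", "sd-fMCI", "md-aMCI"], [18, 13, 35])

# Standardize features, then classify with a linear-kernel SVM,
# estimating accuracy by leave-one-out cross-validation.
clf = make_pipeline(StandardScaler(), SVC(kernel="linear"))
scores = cross_val_score(clf, X, y, cv=LeaveOneOut())
print(f"leave-one-out accuracy: {scores.mean():.2f}")  # ~chance on random data
```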
Abstract:
Most research on single machine scheduling has assumed the linearity of job holding costs, which is arguably not appropriate in some applications. This motivates our study of a model for scheduling $n$ classes of stochastic jobs on a single machine, with the objective of minimizing the total expected holding cost (discounted or undiscounted). We allow general holding cost rates that are separable, nondecreasing and convex in the number of jobs in each class. We formulate the problem as a linear program over a certain greedoid polytope, and establish that it is solved optimally by a dynamic (priority) index rule, which extends the classical Smith's rule (1956) for the linear case. Unlike Smith's indices, defined for each class, our new indices are defined for each extended class, consisting of a class and a number of jobs in that class, and yield an optimal dynamic index rule: work at each time on a job whose current extended class has larger index. We further show that the indices possess a decomposition property, as they are computed separately for each class, and interpret them in economic terms as marginal expected cost rate reductions per unit of expected processing time. We establish the results by deploying a methodology recently introduced by us [J. Niño-Mora (1999), "Restless bandits, partial conservation laws, and indexability," forthcoming in Advances in Applied Probability, Vol. 33, No. 1, 2001], based on the satisfaction by performance measures of partial conservation laws (PCL) (which extend the generalized conservation laws of Bertsimas and Niño-Mora (1996)): PCL provide a polyhedral framework for establishing the optimality of index policies with special structure in scheduling problems under admissible objectives, which we apply to the model of concern.
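As a concrete point of reference, the classical Smith rule for the linear case sequences jobs in nonincreasing order of the index c_j / p_j (holding-cost rate over processing time). A minimal Python sketch with hypothetical data; the paper's dynamic indices for convex costs are not implemented here:

```python
def smith_sequence(jobs):
    """Classical Smith (WSPT) rule for linear holding costs.
    jobs: list of (name, processing_time p, holding_cost_rate c).
    Returns job names in nonincreasing order of the index c / p."""
    return [name for name, p, c in
            sorted(jobs, key=lambda j: j[2] / j[1], reverse=True)]

# Hypothetical jobs: indices are 6/3 = 2.0, 5/5 = 1.0, 8/2 = 4.0.
jobs = [("A", 3, 6), ("B", 5, 5), ("C", 2, 8)]
print(smith_sequence(jobs))  # -> ['C', 'A', 'B']
```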
Abstract:
The Soil and Water Assessment Tool (SWAT) model is a continuation of nearly 30 years of modeling efforts conducted by the U.S. Department of Agriculture (USDA), Agricultural Research Service. SWAT has gained international acceptance as a robust interdisciplinary watershed modeling tool, as evidenced by international SWAT conferences, hundreds of SWAT-related papers presented at numerous scientific meetings, and dozens of articles published in peer-reviewed journals. The model has also been adopted as part of the U.S. Environmental Protection Agency’s BASINS (Better Assessment Science Integrating Point & Nonpoint Sources) software package and is being used by many U.S. federal and state agencies, including the USDA within the Conservation Effects Assessment Project. At present, over 250 peer-reviewed, published articles have been identified that report SWAT applications, reviews of SWAT components, or other research that includes SWAT. Many of these peer-reviewed articles are summarized here according to relevant application categories such as streamflow calibration and related hydrologic analyses, climate change impacts on hydrology, pollutant load assessments, comparisons with other models, and sensitivity analyses and calibration techniques. Strengths and weaknesses of the model are presented, and recommended research needs for SWAT are provided.