991 results for Tool path computing
Abstract:
The complex chemical and physical nature of combustion and secondary organic aerosols (SOAs) in general precludes the complete characterization of both bulk and interfacial components. The bulk composition reveals the history of the growth process and therefore the source region, whereas the interface controls, to a large extent, the interaction with gases, biological membranes, and solid supports. We summarize the development of a soft interrogation technique for interfacial functional groups that uses heterogeneous chemistry with selected probe gases [N(CH₃)₃, NH₂OH, CF₃COOH, HCl, O₃, NO₂] of different reactivity. The technique reveals the identity and density of surface functional groups. Examples include acidic and basic sites, olefinic and polycyclic aromatic hydrocarbon (PAH) sites, and partially and completely oxidized surface sites. We report on the surface composition and oxidation states of laboratory-generated aerosols and of aerosols sampled in several bus depots. In the latter case, the biomarker 8-hydroxy-2'-deoxyguanosine, signaling oxidative stress caused by aerosol exposure, was isolated. The increase in biomarker levels over a working day is correlated with the surface density Nᵢ(O₃) of olefinic and/or PAH sites obtained from O₃ uptakes, as well as with the initial uptake coefficient, γ₀, of five probe gases used in the field. This correlation with γ₀ suggests competing pathways at the interface of the aerosol particles between the generation of reactive oxygen species (ROS), which are responsible for oxidative stress, and cellular antioxidants.
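For context on the quantity γ₀ used above: in heterogeneous kinetics the initial uptake coefficient is conventionally defined as the fraction of gas-surface collisions that remove the probe gas, with the collision flux taken from kinetic gas theory (a standard textbook definition, not a result specific to this study):

  \gamma_0 = \frac{\text{rate of probe-gas loss to the surface per unit area}}{\text{gas--surface collision flux}}, \qquad J_{\mathrm{coll}} = \frac{\bar{c}\, n}{4}, \qquad \bar{c} = \sqrt{\frac{8RT}{\pi M}}

where n is the gas-phase number density, M the molar mass of the probe gas, and T the temperature.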
Abstract:
The aim of this Master's thesis was to study how a procurement strategy could be better implemented in everyday work. A strategy map and balanced scorecard measures were used to implement the strategy. The study was carried out in the purchasing department of Vaasan & Vaasan Oy, which operates in the bakery industry in five countries in the Baltic Sea region. The procurement strategy map and balanced scorecard measures were drawn up on the basis of the existing procurement strategy. The traditional balanced scorecard perspectives were modified to better suit the procurement function. A supplier perspective was added to the model, and procurement's role as a support function for the main processes was taken into account by changing the customer perspective of the balanced scorecard into an internal customer perspective. The strategy map was also drawn up so that the expectations of procurement's internal customers stand on an equal footing with the financial objectives. The number of balanced scorecard measures was kept small so that their steering effect on day-to-day work would be as strong as possible. The measures can be changed as the strategy changes. The study found that the strategy map and balanced scorecard are good tools for turning strategy into concrete actions in a support function as well, provided that, for example, the perspectives of the model are adapted case by case. Strategy maps and the balanced scorecard should be linked to the strategic planning of the whole company. Using a strategy map and balanced scorecard offers procurement many benefits. Employees gain a clearer picture of their own contribution to achieving the company's objectives. The tools used can also support the trend towards centralising purchasing across the group's units.
Abstract:
Background: Analysing the observed differences in incidence or mortality of a particular disease between two different situations (such as time points, geographical areas, gender or other social characteristics) can be useful both for scientific and administrative purposes. From an epidemiological and public health point of view, it is of great interest to assess the effect of demographic factors on these observed differences in order to elucidate the effect of the risk of developing a disease or dying from it. The method proposed by Bashir and Estève, which splits the observed variation into three components (risk, population structure and population size), is a common choice in practice.
Results: A web-based application, called RiskDiff, has been implemented (available at http://rht.iconcologia.net/riskdiff.htm) to perform this kind of statistical analysis, providing text and graphical summaries. Code for the implemented functions in R is also provided. An application to cancer mortality data from Catalonia is used for illustration.
Conclusions: Combining epidemiological and demographic factors is crucial for analysing incidence or mortality from a disease, especially if the population pyramids show substantial differences. The tool implemented may serve to promote and disseminate the use of this method, supporting epidemiologic interpretation and decision making in public health.
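To make the decomposition concrete, the sketch below splits the difference in total cases between two populations into size, structure and risk components by sequential substitution. It is a minimal illustration of a generic three-way decomposition, not the RiskDiff code, and the exact formulas used by Bashir and Estève may differ (the split is order-dependent); all names and data shapes are illustrative assumptions.

# Illustrative three-way decomposition of a difference in case counts.
import numpy as np

def decompose_difference(pop1, cases1, pop2, cases2):
    pop1, cases1 = np.asarray(pop1, float), np.asarray(cases1, float)
    pop2, cases2 = np.asarray(pop2, float), np.asarray(cases2, float)

    P1, P2 = pop1.sum(), pop2.sum()          # total population sizes
    w1, w2 = pop1 / P1, pop2 / P2            # age structures (proportions)
    r1, r2 = cases1 / pop1, cases2 / pop2    # age-specific rates

    size      = (P2 - P1) * (w1 * r1).sum()  # effect of population size
    structure = P2 * ((w2 - w1) * r1).sum()  # effect of age structure
    risk      = P2 * (w2 * (r2 - r1)).sum()  # effect of underlying risk

    total = cases2.sum() - cases1.sum()      # equals size + structure + risk
    return {"size": size, "structure": structure, "risk": risk, "total": total}

The three components sum exactly to the observed difference, which is the property that makes this kind of split useful for attributing a change in counts to demography versus risk.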
Abstract:
This paper aims to present an ePortfolio project conducted over two years in a multilingual and interdisciplinary Master's program in public discourse and communication analysis offered by the Faculty of Arts of the University of Lausanne (Switzerland). Overall, the project, named Learn to communicate skills, offers a reflection on academic skills and their transferability to the professional world. More precisely, the aim of the project is to make students aware of the importance of reflexive learning in making their skills transferable to other contexts.
Abstract:
Although severe patient-ventilator asynchrony is frequent during invasive and non-invasive mechanical ventilation, diagnosing such asynchronies usually requires the presence at the bedside of an experienced clinician to assess the tracings displayed on the ventilator screen, thus explaining why evaluating patient-ventilator interaction remains a challenge in daily clinical practice. In the previous issue of Critical Care, Sinderby and colleagues present a new automated method to detect, quantify, and display patient-ventilator interaction. In this validation study, the automatic method is as efficient as experts in mechanical ventilation. This promising system could help clinicians extend their knowledge about patient-ventilator interaction and further improve assisted mechanical ventilation.
Abstract:
Dissolved organic matter (DOM) is a complex mixture of organic compounds, ubiquitous in marine and freshwater systems. Fluorescence spectroscopy, by means of Excitation-Emission Matrices (EEM), has become an indispensable tool to study DOM sources, transport and fate in aquatic ecosystems. However, the statistical treatment of large and heterogeneous EEM data sets still represents an important challenge for biogeochemists. Recently, the Self-Organising Map (SOM) has been proposed as a tool to explore patterns in large EEM data sets. SOM is a pattern recognition method which clusters and reduces the dimensionality of input EEMs without relying on any assumption about the data structure. In this paper, we show how SOM, coupled with a correlation analysis of the component planes, can be used both to explore patterns among samples and to identify individual fluorescence components. We analysed a large and heterogeneous EEM data set, including samples from a river catchment collected under a range of hydrological conditions, along a 60-km downstream gradient, and under the influence of different degrees of anthropogenic impact. According to our results, chemical industry effluents appeared to have unique and distinctive spectral characteristics. On the other hand, river samples collected under flash-flood conditions showed homogeneous EEM shapes. The correlation analysis of the component planes suggested the presence of four fluorescence components, consistent with DOM components previously described in the literature. A remarkable strength of this methodology was that outlier samples appeared naturally integrated in the analysis. We conclude that SOM coupled with a correlation analysis procedure is a promising tool for studying large and heterogeneous EEM data sets.
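The sketch below illustrates the general idea, not the authors' code: train a small self-organising map on flattened EEM spectra, then correlate the resulting component planes to find wavelength pairs that co-vary and therefore behave as a single fluorescence component. The map size, iteration schedule, data shapes and variable names are illustrative assumptions.

# Minimal numpy SOM plus component-plane correlation (illustrative only).
import numpy as np

rng = np.random.default_rng(0)

def train_som(X, rows=6, cols=6, n_iter=2000, lr0=0.5, sigma0=3.0):
    """Return a (rows, cols, n_features) weight grid trained on X."""
    n, d = X.shape
    W = rng.random((rows, cols, d))
    grid = np.stack(np.meshgrid(np.arange(rows), np.arange(cols), indexing="ij"), axis=-1)
    for t in range(n_iter):
        x = X[rng.integers(n)]
        dist = np.linalg.norm(W - x, axis=-1)              # distance to every map node
        bmu = np.unravel_index(dist.argmin(), dist.shape)  # best-matching unit
        lr = lr0 * np.exp(-t / n_iter)                     # decaying learning rate
        sigma = sigma0 * np.exp(-t / n_iter)               # shrinking neighbourhood
        h = np.exp(-((grid - np.array(bmu)) ** 2).sum(-1) / (2 * sigma ** 2))
        W += lr * h[..., None] * (x - W)                   # pull neighbourhood towards x
    return W

# X: one row per sample, one column per excitation/emission wavelength pair.
X = rng.random((200, 50))                  # placeholder for real EEM data
W = train_som(X)

# Component planes: the weights of one input variable across the whole map.
planes = W.reshape(-1, X.shape[1])         # (nodes, variables)
corr = np.corrcoef(planes.T)               # variable-by-variable correlation matrix
# Groups of highly correlated wavelength pairs point to a common fluorescence component.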
Abstract:
BACKGROUND: The Spiritual Distress Assessment Tool (SDAT) is a 5-item instrument developed to assess unmet spiritual needs in hospitalized elderly patients and to determine the presence of spiritual distress. The objective of this study was to investigate the psychometric properties of the SDAT.
METHODS: This cross-sectional study was performed in a Geriatric Rehabilitation Unit. Patients (N = 203), aged 65 years and over with a Mini Mental State Exam score ≥ 20, were consecutively enrolled over a 6-month period. Data on health, functional, cognitive, affective and spiritual status were collected upon admission. Interviews using the SDAT (score from 0 to 15, higher scores indicating higher distress) were conducted by a trained chaplain. Factor analysis, measures of internal consistency (inter-item and item-to-total correlations, Cronbach α), and reliability (intra-rater and inter-rater) were performed. Criterion-related validity was assessed using the Functional Assessment of Chronic Illness Therapy-Spiritual well-being (FACIT-Sp) and the question "Are you at peace?" as criterion standards. Concurrent and predictive validity were assessed using the Geriatric Depression Scale (GDS), occurrence of a family meeting, hospital length of stay (LOS) and destination at discharge.
RESULTS: SDAT scores ranged from 1 to 11 (mean 5.6 ± 2.4). Overall, 65.0% (132/203) of the patients reported some spiritual distress on the SDAT total score and 22.2% (45/203) reported at least one severe unmet spiritual need. A two-factor solution explained 60% of the variance. Inter-item correlations ranged from 0.11 to 0.41 (eight out of ten with P < 0.05). Item-to-total correlations ranged from 0.57 to 0.66 (all P < 0.001). Cronbach α was acceptable (0.60). Intra-rater and inter-rater reliabilities were high (intraclass correlation coefficients ranging from 0.87 to 0.96). SDAT correlated significantly with the FACIT-Sp, "Are you at peace?", and GDS (Rho -0.45, -0.33, and 0.43, respectively, all P < .001), and with LOS (Rho 0.15, P = .03). Compared with patients showing no severely unmet spiritual need, patients with at least one severe unmet spiritual need had higher odds of occurrence of a family meeting (adjOR 4.7, 95% CI 1.4-16.3, P = .02) and were more often discharged to a nursing home (13.3% vs 3.8%; P = .027).
CONCLUSIONS: The SDAT has acceptable psychometric properties and appears to be a valid and reliable instrument to assess spiritual distress in elderly hospitalized patients.
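For readers who want to reproduce the internal-consistency statistics reported above (Cronbach's α and item-to-total correlations) on their own data, the following is a minimal sketch for a 5-item instrument such as the SDAT; `scores` is a hypothetical (patients × items) array, and some packages report the corrected item-to-total correlation (excluding the item from the total) instead.

# Internal-consistency statistics for a multi-item scale (illustrative).
import numpy as np

def cronbach_alpha(scores):
    scores = np.asarray(scores, float)
    k = scores.shape[1]                          # number of items
    item_var = scores.var(axis=0, ddof=1).sum()  # sum of item variances
    total_var = scores.sum(axis=1).var(ddof=1)   # variance of the total score
    return k / (k - 1) * (1 - item_var / total_var)

def item_total_correlations(scores):
    scores = np.asarray(scores, float)
    total = scores.sum(axis=1)
    return [np.corrcoef(scores[:, j], total)[0, 1] for j in range(scores.shape[1])]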
Abstract:
The aim of this research paper is to present a macroscopic study of the feasibility and efficiency of mobile devices in computing the Least-Cost Path (LCP). This kind of artifact must work in off-line mode and must allow loading data for a mountain zone, such as digital terrain models and meteorological data. The research strategy has two steps:
- First, we identify the set of software components to implement inside the IT artifact. This set of components has to be able to perform LCP calculations, visualize results and present a well-adapted human interface. The main goal of this first step is to demonstrate the feasibility of a mobile geographic information system by following the "Design & Creation" research strategy.
- Second, the goal is to evaluate the reliability and usability of this IT artifact through an "Experiments" research approach. In this step we want to characterize the behavior of the artifact in terms of fidelity and LCP processing speed. This evaluation will be carried out by external users.
Throughout this paper, we will see that this kind of geographic information system (the IT artifact) has the minimal requirements needed to carry out LCP calculations on mobile devices, although it has several limitations and constraints in terms of usability and reliability. We point out qualitative and quantitative elements related to the performance of the IT artifact when carrying out this kind of computation.
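As a point of reference for what an LCP computation over a digital terrain model involves, the sketch below runs Dijkstra's algorithm on an elevation grid with a simple slope-dependent cost. It is not the artifact's implementation; the grid, cell size and cost function are illustrative assumptions.

# Least-cost path over a DTM grid via Dijkstra's algorithm (illustrative).
import heapq
import math

def least_cost_path(dtm, start, goal, cell_size=25.0):
    rows, cols = len(dtm), len(dtm[0])
    dist, prev = {start: 0.0}, {}
    heap = [(0.0, start)]
    while heap:
        d, node = heapq.heappop(heap)
        if node == goal:
            break
        if d > dist.get(node, math.inf):
            continue                                       # stale queue entry
        r, c = node
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols:
                slope = abs(dtm[nr][nc] - dtm[r][c]) / cell_size
                step = cell_size * (1.0 + 10.0 * slope)    # flat terrain is cheapest
                nd = d + step
                if nd < dist.get((nr, nc), math.inf):
                    dist[(nr, nc)] = nd
                    prev[(nr, nc)] = node
                    heapq.heappush(heap, (nd, (nr, nc)))
    path, node = [], goal
    while node in prev or node == start:
        path.append(node)
        if node == start:
            break
        node = prev[node]
    return list(reversed(path)), dist.get(goal)

On a mobile device the main practical constraints are exactly the ones the paper evaluates: memory for the grid and the time the priority-queue search takes on large terrain models.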
Abstract:
The building industry has a particular interest in using clinching as a joining method for frame constructions of light-frame housing. Normally many clinch joints are required in joining of frames. In order to maximise the strength of the complete assembly, each clinch joint must be as sound as possible. Experimental testing is the main means of optimising a particular clinch joint. This includes shear strength testing and visual observation of joint cross-sections. The manufacturers of clinching equipment normally perform such experimental trials. Finite element analysis can also be used to optimise the tool geometry and the process parameter, X, which represents the thickness of the base of the joint. However, such procedures require dedicated software, a skilled operator, and test specimens in order to verify the finite element model. In addition, when using current technology several hours' computing time may be necessary. The objective of the study was to develop a simple calculation procedure for rapidly establishing an optimum value for the parameter X for a given tool combination. It should be possible to use the procedure on a daily basis, without stringent demands on the skill of the operator or the equipment. It is also desirable that the procedure would significantly decrease the number of shear strength tests required for verification. The experimental work involved tests in order to obtain an understanding of the behaviour of the sheets during clinching. The most notable observation concerned the stage of the process in which the upper sheet was initially bent, after which the deformation mechanism changed to shearing and elongation. The amount of deformation was measured relative to the original location of the upper sheet, and characterised as the C-measure. By understanding in detail the behaviour of the upper sheet, it was possible to estimate a bending line function for the surface of the upper sheet. A procedure was developed, which makes it possible to estimate the process parameter X for each tool combination with a fixed die. The procedure is based on equating the volume of material on the punch side with the volume of the die. Detailed information concerning the behaviour of material on the punch side is required, assuming that the volume of the die does not change during the process. The procedure was applied to shear strength testing of a sample material. The sample material was continuously hot-dip zinc-coated high-strength constructional steel, with a nominal thickness of 1.0 mm. The minimum Rp0.2 proof stress was 637 N/mm². Such material has not yet been used extensively in light-frame housing, and little has been published on clinching of the material. The performance of the material is therefore of particular interest. Companies that use clinching on a daily basis stand to gain the greatest benefit from the procedure. By understanding the behaviour of sheets in different cases, it is possible to use data at an early stage for adjusting and optimising the process. In particular, the functionality of common tools can be increased since it is possible to characterise the complete range of existing tools. The study increases and broadens the amount of basic information concerning the clinching process. New approaches and points of view are presented and used for generating new knowledge.
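To give a feel for the volume-balance idea described above, the following is a purely illustrative sketch under a deliberately simplified cylindrical geometry; it does not reproduce the thesis procedure or the bending-line function, and all dimensions are hypothetical.

# Hypothetical volume-balance estimate of the joint base thickness X.
import math

def estimate_x(punch_diameter, sheet_thickness_upper, sheet_thickness_lower, die_volume):
    """Equate the material volume pushed in by the punch with the die volume
    (simplified cylindrical geometry) and solve for X."""
    punch_area = math.pi * (punch_diameter / 2.0) ** 2
    total_thickness = sheet_thickness_upper + sheet_thickness_lower
    # punch_area * (total_thickness - X) = die_volume  =>  solve for X
    return total_thickness - die_volume / punch_area

# Example with hypothetical values (mm and mm^3):
X = estimate_x(punch_diameter=5.0, sheet_thickness_upper=1.0,
               sheet_thickness_lower=1.0, die_volume=30.0)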
Abstract:
Technological development brings more and more complex systems to the consumer markets. The time required to bring a new product to market is crucial for the competitive edge of a company. Simulation is used as a tool to model these products and their operation before actual live systems are built. The complexity of these systems can easily require large amounts of memory and computing power. Distributed simulation can be used to meet these demands. Distributed simulation has its problems. Diworse, a distributed simulation environment, was used in this study to analyze the different factors that affect the time required for the simulation of a system. Examples of these factors are the simulation algorithm, communication protocols, partitioning of the problem, distribution of the problem, capabilities of the computing and communications equipment, and the external load. Offices offer vast amounts of unused capacity in the form of idle workstations. The use of this computing power for distributed simulation requires the simulation to adapt to a changing load situation. This requires all or part of the simulation work to be removed from a workstation when the owner wishes to use the workstation again. If load balancing is not performed, the simulation suffers from the workstation's reduced performance, which also hampers the owner's work. The operation of load balancing in Diworse is studied and shown to perform better than no load balancing, and different approaches to load balancing are discussed.
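The kind of rebalancing described above can be pictured with the following sketch, which is not Diworse code: when a workstation's external load (for example, the owner returning to work) exceeds a threshold, its simulation partitions are migrated to the least-loaded idle workstations. The data structures and threshold are hypothetical.

# Illustrative greedy rebalancing of simulation partitions.
def rebalance(partitions, loads, threshold=0.5):
    """partitions: {host: [partition ids]}; loads: {host: external load in 0..1}."""
    busy = [h for h, l in loads.items() if l > threshold]
    idle = sorted((h for h in loads if h not in busy), key=lambda h: loads[h])
    if not idle:
        return partitions                      # nowhere to migrate to
    for host in busy:
        while partitions.get(host):            # move every partition off the busy host
            target = min(idle, key=lambda h: len(partitions.setdefault(h, [])))
            partitions[target].append(partitions[host].pop())
    return partitions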
Abstract:
The complexity of the connections within an economic system can only be reliably reflected in academic research if powerful methods are used. Researchers have used Structural Path Analysis (SPA) to capture not only the linkages within the production system but also the propagation of their effects into different channels of impacts. However, the SPA literature has restricted itself to showing the relations among sectors of production, while the connections between these sectors and final consumption have attracted little attention. In order to consider the complete set of channels involved, in this paper we propose a structural path method that endogenously incorporates not only the sectors of production but also the final consumption of the economy. The empirical application deals with water use and analyses the dissemination of exogenous impacts into various channels of water consumption. The results show that the responsibility for water stress is imputed to different sectors depending on the hypothesis adopted for the role played by final consumption in the model. This highlights the importance of consumers' decisions in the determination of ecological impacts. Keywords: Input-Output Analysis, Structural Path Analysis, Final Consumption, Water Uses.
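As a hedged sketch of the general idea rather than the authors' model, the code below enumerates structural paths on a technical-coefficient matrix that has been augmented with a final-consumption row and column (a simple model closure), pruning paths whose weight falls below a threshold. The toy coefficients and the particular closure are illustrative assumptions; the paper's endogenisation of final consumption may differ.

# Structural path enumeration on an augmented coefficient matrix (illustrative).
import numpy as np

def structural_paths(A_aug, origin, max_order=3, tol=1e-4):
    """Yield (path, weight) pairs, weight being the product of coefficients
    along the path starting from `origin`, pruned when |weight| <= tol."""
    n = A_aug.shape[0]
    stack = [((origin,), 1.0)]
    while stack:
        path, weight = stack.pop()
        if len(path) > 1:
            yield path, weight
        if len(path) <= max_order:
            i = path[-1]
            for j in range(n):
                w = weight * A_aug[i, j]
                if abs(w) > tol:
                    stack.append((path + (j,), w))

# Toy 2-sector economy plus an endogenised final-consumption "sector".
A = np.array([[0.10, 0.30],        # inter-industry coefficients
              [0.20, 0.05]])
c = np.array([0.40, 0.25])         # consumption coefficients (hypothetical closure)
v = np.array([0.30, 0.35])         # income coefficients feeding consumption
A_aug = np.block([[A, c[:, None]],
                  [v[None, :], np.zeros((1, 1))]])

for path, w in structural_paths(A_aug, origin=2, max_order=3, tol=1e-3):
    print(path, round(w, 4))

Attaching water-use intensities to each node of a path would then distribute an exogenous impact over the channels that pass through production sectors and through final consumption.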
Abstract:
A Fundamentals of Computing Theory course involves different topics that are core to the Computer Science curricula and whose level of abstraction makes them difficult both to teach and to learn. Such difficulty stems from the complexity of the abstract notions involved and the required mathematical background. Surveys conducted among our students showed that many of them were applying some theoretical concepts mechanically rather than developing significant learning. This paper shows a number of didactic strategies that we introduced in the Fundamentals of Computing Theory curricula to cope with the above problem. The proposed strategies were based on a stronger use of technology and a constructivist approach. The final goal was to promote more significant learning of the course topics.