909 results for Lead-time and set-up optimization


Relevance: 100.00%

Publisher:

Abstract:

At present, little fundamental guidance is available to assist contractors in deciding when to schedule saw cuts on joints. To conduct pavement finishing and sawing activities effectively, however, contractors need to know when a concrete mixture is going to reach initial set, that is, when the sawing window will open. Previous research investigated the use of the ultrasonic pulse velocity (UPV) method to predict the saw-cutting window for early entry sawing. The results indicated that the method has the potential to provide effective guidance to contractors as to when to conduct early entry sawing. The aim of this project was to conduct similar work to observe the correlation between initial setting and conventional sawing time. Sixteen construction sites were visited in Minnesota and Missouri over a two-year period. At each site, initial set was determined using a p-wave propagation technique with a commercial device. Calorimetric data were collected using a commercial semi-adiabatic device at a majority of the sites. Concrete samples were collected in front of the paver and tested using both methods with equipment set up next to the pavement during paving. The data collected indicate that the UPV method is promising for both early entry and conventional sawing in the field: both early entry and conventional sawing times could be predicted for the range of mixtures tested.
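
As a rough illustration of how a UPV-based criterion could flag the opening of the sawing window, the sketch below interpolates the time at which a rising p-wave velocity first crosses a calibrated threshold. The threshold value and the readings are invented placeholders, not data from the study.

```python
# Illustrative only: find when UPV first reaches an assumed threshold.
def time_threshold_reached(times_h, velocities_mps, threshold_mps):
    """Linearly interpolate the elapsed time at which the pulse velocity
    first reaches the threshold; return None if it never does."""
    for (t0, v0), (t1, v1) in zip(zip(times_h, velocities_mps),
                                  zip(times_h[1:], velocities_mps[1:])):
        if v0 < threshold_mps <= v1:
            return t0 + (threshold_mps - v0) * (t1 - t0) / (v1 - v0)
    return None

times = [0.5, 1.0, 1.5, 2.0, 2.5, 3.0]              # hours after placement
upv = [300.0, 420.0, 650.0, 980.0, 1400.0, 1800.0]  # m/s, rises as concrete sets
print(time_threshold_reached(times, upv, 1000.0))   # ~2.02 h (illustrative)
```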

Relevance: 100.00%

Publisher:

Abstract:

BACKGROUND: Previous cross-sectional studies report that cognitive impairment is associated with poor psychosocial functioning in euthymic bipolar patients. There is a lack of long-term studies to determine the course of cognitive impairment and its impact on functional outcome. METHOD: A total of 54 subjects were assessed at baseline and 6 years later; 28 had DSM-IV-TR bipolar I or II disorder (recruited, at baseline, from a Lithium Clinic Program) and 26 were healthy matched controls. All were assessed twice over the 6-year follow-up period with a cognitive battery tapping the main cognitive domains (executive function, attention, processing speed, verbal memory and visual memory). All patients were euthymic (Hamilton Rating Scale for Depression score below 8 and Young Mania Rating Scale score below 6) for at least 3 months before both evaluations. At the end of follow-up, psychosocial functioning was also evaluated by means of the Functioning Assessment Short Test. RESULTS: Repeated-measures multivariate analysis of covariance showed main effects of group in the executive, inhibition, processing speed and verbal memory domains (p<0.04). Among the clinical factors, only longer illness duration was significantly related to slow processing (p=0.01), whereas strong relationships were observed between impoverished cognition over time and poorer psychosocial functioning (p<0.05). CONCLUSIONS: Executive functioning, inhibition, processing speed and verbal memory were impaired in euthymic bipolar out-patients. Although cognitive deficits remained stable on average throughout the follow-up, they had enduring negative effects on the psychosocial adaptation of patients.

Relevance: 100.00%

Publisher:

Abstract:

The aim of this Master's thesis is to make the operations of a sheet metal products manufacturer more efficient and to improve the controllability of its production. The efficiency work focuses on minimizing set-up times, reducing batch sizes and investing in machines or equipment that facilitate manufacturing. Production controllability is developed by making visual monitoring easier, through layout changes and by introducing FIFO control. All development actions aim to shorten production lead time and reduce the amount of work in progress; the goal is to create a foundation for Lean manufacturing. Development actions are concentrated on the departments that most often become bottlenecks, and investments made as part of the CIM project are utilized in opening up these bottlenecks. A new robotic press brake unit enables flattening and bending with a single set-up. The assembly department was relocated and its layout was converted into a flow line. To develop production control, a simple three-step development model was created.

Relevance: 100.00%

Publisher:

Abstract:

BACKGROUND: Developing and updating high-quality guidelines requires substantial time and resources. To reduce duplication of effort and enhance efficiency, we developed a process for guideline adaptation and assessed initial perceptions of its feasibility and usefulness. METHODS: Based on preliminary developments and empirical studies, a series of meetings with guideline experts was organised to define a process for guideline adaptation (ADAPTE) and to develop a manual and a toolkit made available on a website (http://www.adapte.org). Potential users, guideline developers and implementers, were invited to register and to complete a questionnaire evaluating their perception of the proposed process. RESULTS: The ADAPTE process consists of three phases (set-up, adaptation, finalisation), nine modules and 24 steps. The adaptation phase involves identifying specific clinical questions; searching for, retrieving and assessing available guidelines; and preparing the draft adapted guideline. Among 330 registered individuals (from 46 countries), 144 completed the questionnaire. A majority found the ADAPTE process clear (78%), comprehensive (69%) and feasible (60%), and the manual useful (79%). However, 21% found the ADAPTE process complex, and 44% feared that they would not find appropriate, high-quality source guidelines. DISCUSSION: A comprehensive framework for guideline adaptation has been developed to meet the challenges of timely guideline development and implementation. The ADAPTE process generated considerable interest among guideline developers and implementers. The majority perceived the ADAPTE process to be feasible, useful and likely to improve methodological rigour and guideline quality. However, some de novo development may still be needed if no high-quality guideline exists for a given topic.

Relevance: 100.00%

Publisher:

Abstract:

The solvability of the problem of fair exchange in a synchronous system subject to Byzantine failures is investigated in this work. The fair exchange problem arises when a group of processes are required to exchange digital items in a fair manner, which means that either each process obtains the item it was expecting or no process obtains any information on the inputs of others. After introducing a novel specification of fair exchange that clearly separates safety and liveness, we give an overview of the difficulty of solving such a problem in the context of a fully-connected topology. On the one hand, we show that no solution to fair exchange exists in the absence of an identified process that every process can trust a priori; on the other, a well-known solution to fair exchange relying on a trusted third party is recalled. These two results lead us to complete our system model with a flexible representation of the notion of trust. We then show that fair exchange is solvable if and only if a connectivity condition, named the reachable majority condition, is satisfied. The necessity of the condition is proven by an impossibility result and its sufficiency by presenting a general solution to fair exchange relying on a set of trusted processes. The focus is then turned towards a specific network topology in order to provide a fully decentralized, yet realistic, solution to fair exchange. The general solution mentioned above is optimized by reducing the computational load assumed by trusted processes as far as possible. Accordingly, our fair exchange protocol relies on trusted tamper-proof modules that have limited communication abilities and are only required in key steps of the algorithm. This modular solution is then implemented in the context of a pedagogical application developed for illustrating and apprehending the complexity of fair exchange. This application, which also includes the implementation of a wide range of Byzantine behaviors, allows executions of the algorithm to be set up and monitored through a graphical display. Surprisingly, some of our results on fair exchange seem contradictory with those found in the literature on secure multiparty computation, a problem from the field of modern cryptography, although the two problems have much in common. Both problems are closely related to the notion of trusted third party, but their approaches and descriptions differ greatly. By introducing a common specification framework, a comparison is proposed in order to clarify their differences and the possible origins of the confusion between them. This leads us to introduce the problem of generalized fair computation, a generalization of fair exchange. Finally, a solution to this new problem is given by generalizing our modular solution to fair exchange.
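
To make the recalled trusted-third-party idea concrete, here is a minimal sketch: each party deposits its item with the trusted process, which releases both items only once both deposits are present, so no honest party ends up empty-handed while the other profits. This illustrates the classical idea only; it is not the thesis's protocol, and all names are invented.

```python
# Sketch of the classical trusted-third-party (TTP) fair exchange:
# release both items atomically, or release nothing at all.
class TrustedThirdParty:
    def __init__(self):
        self.deposits: dict[str, object] = {}

    def deposit(self, party: str, item: object) -> None:
        self.deposits[party] = item

    def exchange(self, party_a: str, party_b: str):
        """Swap the deposited items only if both are present (fairness)."""
        if party_a in self.deposits and party_b in self.deposits:
            return self.deposits[party_b], self.deposits[party_a]
        return None  # abort: neither party learns anything about the other's input

ttp = TrustedThirdParty()
ttp.deposit("alice", "signed contract")
ttp.deposit("bob", "payment voucher")
print(ttp.exchange("alice", "bob"))  # ('payment voucher', 'signed contract')
```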

Relevance: 100.00%

Publisher:

Abstract:

Palinspastic reconstructions offer an ideal framework for geological, geographical, oceanographic and climatological studies. As historians of the Earth, "reconstructers" try to decipher its past. Since they have known that continents move, geologists have been trying to retrieve the distribution of the continents through the ages. If Wegener's view of continental motion was revolutionary at the beginning of the 20th century, we have known since the early 1960s that continents do not drift aimlessly across the oceanic realm but belong to a larger ensemble combining continental and oceanic crust: the tectonic plates. Unfortunately, for historical as well as technical reasons, this idea has still not received sufficient echo within the reconstruction community. Nevertheless, we are firmly convinced that, by applying specific methods and principles, one can escape the traditional "Wegenerian" approach and, at last, reach true plate tectonics. The main aim of this study is to defend this point of view by presenting, with all necessary details, our methods and tools.

Starting from the paleomagnetic and paleogeographic data classically used in reconstruction studies, we developed a new methodology placing the tectonic plates and their kinematics at the centre of the problem. Using continental assemblies (referred to as "key assemblies") as anchor points distributed over the whole scope of our study (ranging from the Eocene back to the Cambrian), we develop geodynamic scenarios leading from one assembly to the next, from the past towards the present. Between two stages, the lithospheric plates are progressively reconstructed by adding or removing oceanic material (symbolized by synthetic isochrons) to the continents. Except during collisions, plates are moved as single rigid entities. Through the ages, the only evolving elements are the plate boundaries: they are preserved over time, follow a consistent geodynamic evolution, and always form an interconnected network through space. This "dynamic plate boundaries" approach integrates multiple factors, among them plate buoyancy, spreading rates at ridges, subsidence curves, stratigraphic and paleobiogeographic data, as well as major tectonic and magmatic events. The method thus offers good control on plate kinematics and provides severe constraints on the model.

This multi-source approach requires efficient data organization and management. Prior to this study, the critical mass of necessary data had become an almost insurmountable obstacle. GIS (Geographic Information Systems) and geodatabases are software tools specifically devoted to the storage, management and analysis of spatially referenced data and their attributes. By developing the PaleoDyn database in ArcGIS, we converted this mass of scattered data into valuable geodynamic information easily accessible for the creation of reconstructions. At the same time, with specially developed tools, we both facilitated the reconstruction work (task automation) and enhanced the model by greatly increasing the kinematic control of plate motions through plate velocity models. Based on the 340 newly defined terranes, we developed a set of 35 reconstructions, each associated with its own velocity model.

Using this unique dataset, we can now tackle major issues of modern geology, such as global sea-level variations and climate change. We began by addressing another major, and not definitively resolved, issue of modern tectonics: the mechanisms driving plate motions. We observed that, throughout the Earth's history, plate rotation poles (which describe plate motions across the Earth's surface) tend to be distributed along a band running from the northern Pacific through northern South America, the central Atlantic, northern Africa and central Asia up to Japan. Essentially, this distribution means that plates tend to escape this median plane. In the absence of a methodological bias that we may have failed to identify, we interpreted this phenomenon as reflecting a secular influence of the Moon on plate motions.

The oceanic realm is the cornerstone of our model, and we took particular care to reconstruct it in great detail. In this model, the oceanic crust is preserved from one reconstruction to the next. Crustal material is symbolized by synthetic isochrons of known age. We also reconstructed the margins (active or passive), the mid-ocean ridges and the intra-oceanic subduction zones. Using this very detailed dataset, we developed unique 3-D bathymetric models offering far better precision than previously available ones.
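
The elementary kinematic operation behind such reconstructions is a finite rotation of plate material about an Euler pole. The sketch below applies Rodrigues' rotation formula to a point given in latitude/longitude; the pole, angle and point are arbitrary examples, not values from the model described above.

```python
# Rotate a point on the unit sphere about an Euler pole (Rodrigues' formula).
import math

def to_xyz(lat_deg, lon_deg):
    lat, lon = math.radians(lat_deg), math.radians(lon_deg)
    return (math.cos(lat) * math.cos(lon),
            math.cos(lat) * math.sin(lon),
            math.sin(lat))

def rotate(point_lat, point_lon, pole_lat, pole_lon, angle_deg):
    """Rotate a (lat, lon) point by angle_deg about the given Euler pole."""
    p = to_xyz(point_lat, point_lon)
    k = to_xyz(pole_lat, pole_lon)
    a = math.radians(angle_deg)
    dot = sum(ki * pi for ki, pi in zip(k, p))
    cross = (k[1]*p[2] - k[2]*p[1], k[2]*p[0] - k[0]*p[2], k[0]*p[1] - k[1]*p[0])
    # v' = v cos(a) + (k x v) sin(a) + k (k.v)(1 - cos(a))
    r = [p[i]*math.cos(a) + cross[i]*math.sin(a) + k[i]*dot*(1 - math.cos(a))
         for i in range(3)]
    return math.degrees(math.asin(r[2])), math.degrees(math.atan2(r[1], r[0]))

# Example: rotate a point at (10 N, 30 W) by 30 degrees about a pole at (60 N, 40 W).
print(rotate(10.0, -30.0, 60.0, -40.0, 30.0))
```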

Relevance: 100.00%

Publisher:

Abstract:

The aim of this Master's thesis is to study and develop a method for predicting the schedule of a product development project as the product moves from development to mass production. The importance of schedule prediction grows as the start of mass production (ramp-up) of a new product approaches, because strategic decisions concerning, among other things, new production lines, ordering of materials and components, and confirming the start of customer deliveries must be made much earlier. The work begins by examining concurrent engineering and performance measurement; from their thinking models, tools and techniques, the prerequisites for successful schedule prediction were identified. These include the quality of the designed product and of the product development process, as well as the competencies of resources and teams. Schedule predictability is also affected by the projects' dependencies on external suppliers and their schedules. The theoretical framework draws on Bradford L. Goldense's model for the proactive measurement of product development and applies W. Edwards Deming's continuous improvement loop. The thesis develops a Ramp-up Predictability concept consisting of medium-term and long-term prediction; deployment and follow-up of the model were outside the scope of the work. As a recommendation, further research is proposed on the mutual correlations and reliability of the metrics and on the opportunities the models offer to other business units.

Relevance: 100.00%

Publisher:

Abstract:

We have investigated the phenomenon of deprivation in contemporary Switzerland through a multidimensional, dynamic approach. By applying Self-Organizing Maps (SOM) to a set of 33 non-monetary indicators from the 2009 wave of the Swiss Household Panel (SHP), we identified 13 prototypical forms (or clusters) of well-being, financial vulnerability, psycho-physiological fragility and deprivation within a topological dimensional space. New data from the previous waves (2003 to 2008) were then classified by the SOM model, making it possible to estimate the weight of the different clusters over time and to reconstruct the dynamics of stability and mobility of individuals within the map. Looking at the transition probabilities between year t and year t+1, we observed that the paths of mobility that catalyze the largest number of observations are those connecting clusters that are adjacent in the topological space.
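
A minimal sketch of the mobility analysis described above: given each individual's cluster label in year t and year t+1, the row-normalized transition probability matrix can be estimated by counting transitions. The labels below are toy data with 3 clusters rather than the study's 13.

```python
# Estimate P(cluster j in year t+1 | cluster i in year t) from paired labels.
from collections import Counter

def transition_matrix(labels_t, labels_t1, n_clusters):
    counts = Counter(zip(labels_t, labels_t1))
    row_totals = Counter(labels_t)
    matrix = [[0.0] * n_clusters for _ in range(n_clusters)]
    for (i, j), c in counts.items():
        matrix[i][j] = c / row_totals[i]
    return matrix

# Toy example: 8 individuals, 3 clusters.
year_t  = [0, 0, 1, 1, 1, 2, 2, 0]
year_t1 = [0, 1, 1, 1, 2, 2, 2, 0]
for row in transition_matrix(year_t, year_t1, 3):
    print([f"{p:.2f}" for p in row])
```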

Relevance: 100.00%

Publisher:

Abstract:

Time's importance as a competitive factor grew significantly from the 1970s to 2000. Product life cycles have become significantly shorter, customers are no longer willing to hold their own inventories, and fast-changing markets make it risky to rely on forecasts. All these factors are forcing companies to shorten their lead times and respond to customer needs faster, and good delivery reliability is increasingly what differentiates competitors. This thesis examines the delivery reliability problems of KWH Pipe Ltd. Finland, and especially of its Vaasa factory. The goal is to find the causes of the delivery reliability problems. In 2007, delivery reliability was 87%, whereas management had set the target at 95%. Delivery reliability has remained at roughly this level for several years, while inventory values have been rising. The causes of poor delivery reliability were sought by analyzing the company's order-delivery process using, among other tools, root cause analysis. Furthermore, the metrics currently used in the company are analyzed and some new metrics are proposed for adoption. The outcome of the process analysis is a prioritized list of confirmed problems, which can be used to decide on future actions to bring delivery reliability to a better level.
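
Delivery reliability itself is a simple ratio, typically the share of order lines delivered on or before the confirmed date. The sketch below computes it from invented order data; only the 87% actual and 95% target figures come from the text.

```python
# Back-of-the-envelope delivery reliability: on-time lines / all lines.
orders = [  # (confirmed_day, actual_day) for a sample of order lines (invented)
    (10, 10), (12, 14), (15, 15), (20, 19), (21, 25),
    (30, 30), (31, 31), (33, 36),
]

on_time = sum(1 for due, actual in orders if actual <= due)
reliability = on_time / len(orders)
target = 0.95
print(f"delivery reliability {reliability:.0%} vs target {target:.0%}")
# With 2007 performance at 87%, the gap to the 95% target was 8 points.
```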

Relevance: 100.00%

Publisher:

Abstract:

Many companies today struggle with problems around sales lead management. They suffer from inconsistent lead quality, miss clear sales opportunities and cannot even manage their internal marketing lists well. Meanwhile, customers are ever better equipped to initiate contact via the internet, call centers and other channels. Investing in lead generation activities built on a bad process is not a good idea: rather than asking how to get more leads, companies should ask how to get better-quality leads and invest in improving lead management. This study views sales lead management as a multi-step process in which a company generates leads in a controlled environment, qualifies them and hands them over to the sales cycle. As a final step, the organization analyzes the revenue and success of the different lead sources. Improving a sales lead management process most often requires setting up additional controls to enable proper tracking of all leads. Based on the findings, a sales lead management process model is built for the case company. Implementing the new model involves changes and improvements in some key areas of the current process. Starting from the very beginning, these include slightly redefining the lead definition and revising the criteria for a qualified lead. Some improvements are also needed on the system side to enable the proposed model. Lastly, an assignment of responsible roles is presented.
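
As a toy illustration of the multi-step view (generate, qualify, hand over, analyze by source), the sketch below pushes invented leads through an assumed scoring gate and aggregates outcomes per lead source. Stage names, the qualification criterion and the data are all hypothetical, not the case company's model.

```python
# Toy lead pipeline: generation -> qualification gate -> sales outcome,
# with per-source analysis at the end.
from collections import defaultdict

QUALIFIED_MIN_SCORE = 60  # assumed qualification criterion

leads = [  # (source, score, won_deal) -- invented data
    ("web_form", 80, True), ("web_form", 40, False),
    ("call_center", 70, False), ("trade_fair", 90, True),
    ("trade_fair", 55, False), ("web_form", 65, False),
]

stats = defaultdict(lambda: {"generated": 0, "qualified": 0, "won": 0})
for source, score, won in leads:
    s = stats[source]
    s["generated"] += 1
    if score >= QUALIFIED_MIN_SCORE:  # qualification gate
        s["qualified"] += 1
        s["won"] += int(won)          # outcome after the sales cycle

for source, s in stats.items():       # per-source success analysis
    print(source, s)
```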

Relevance: 100.00%

Publisher:

Abstract:

To date, the scientific community has reached a consensus on the function of most biological and physiological phenomena, with the exception of sleep, whose function remains undetermined and mysterious. To further our understanding of sleep function(s), we first focused on the level of complexity at which a sleep-like phenomenon can be observed; this led to the development of an in vitro model. The second approach was to understand the molecular and cellular pathways regulating sleep and wakefulness, using both our in vitro and in vivo models. The third approach (ongoing) is to look across evolution for when sleep or wakefulness appears. (1) To address the question of whether sleep is a cellular property, and how this is linked to whole-brain functioning, we developed a model of sleep in vitro using dissociated primary cortical cultures, aiming to simulate the major characteristics of sleep and wakefulness in vitro. We have shown that mature cortical cultures display a spontaneous electrical activity similar to sleep. When these cultures are stimulated with waking neurotransmitters, they show a tonic firing activity similar to wakefulness, but return spontaneously to the "sleep-like" state 24 h after stimulation. We have also shown that transcriptional, electrophysiological and metabolic correlates of sleep and wakefulness can be reliably detected in dissociated cortical cultures. (2) To further understand at which molecular and cellular levels the changes between sleep and wakefulness occur, we used a pharmacological and systematic gene transcription approach in vitro and discovered a major role played by the Erk pathway. Indeed, pharmacological inhibition of this pathway in living animals decreased sleep by 2 hours per day and consolidated both sleep and wakefulness by reducing their fragmentation. (3) Finally, we evaluated the presence of sleep in one of the most primitive species with a nervous system, setting up Hydra as a model organism. We hypothesized that sleep as a cellular (neuronal) property may appear with the most primitive nervous systems, and we were able to show that Hydra have periodic rest phases amounting to up to 5 hours per day. In conclusion, our work established an in vitro model to study sleep, uncovered one of the major signaling pathways regulating vigilance states, and strongly suggests that sleep is a cellular property highly conserved at the molecular level through evolution.

Relevance: 100.00%

Publisher:

Abstract:

Illicit drug cutting represents a complex problem that requires the sharing of knowledge from addiction studies, toxicology, criminology and criminalistics; as a result, cutting is not well known by the forensic community. This review therefore aims at deciphering the different aspects of cutting by gathering information mainly from criminology and criminalistics, focusing essentially on the specificities of cocaine and heroin cutting. The article presents the detected cutting agents (adulterants and diluents), their evolution in time and space, and the analytical methodology implemented by forensic laboratories. Furthermore, it discusses at which point in the history of an illicit drug cutting may take place. Research studying how much cutting occurs in the country of destination is also analysed. Lastly, the reasons for cutting are addressed. According to the literature, adulterants are added during production of the illicit drug or at a relatively high level of its distribution chain (e.g. before the product arrives in the country of destination or just after its importation). Their addition seems hardly justified by a desire merely to increase profits or to harm consumers' health; instead, adulteration appears to be performed to enhance or mimic the illicit drug's effects or to facilitate its administration. Nowadays, caffeine, diltiazem, hydroxyzine, levamisole, lidocaine and phenacetin are frequently detected in cocaine specimens, while paracetamol and caffeine are almost exclusively identified in heroin specimens. This may reveal differences in the respective structures of production and/or distribution of cocaine and heroin. As the relevant information about cutting is spread across different scientific fields, close collaboration should be set up to collect essential, unified data for monitoring, control and harm reduction purposes. More research, in several areas of investigation, should be carried out to gather relevant information.

Relevance: 100.00%

Publisher:

Abstract:

We examine the scale invariants in the preparation of highly concentrated w/o emulsions at different scales and under varying conditions. The emulsions are characterized using rheological parameters, owing to their highly elastic behavior. We first construct and validate empirical models to describe the rheological properties; these models yield a reasonable prediction of the experimental data. We then build an empirical scale-up model to predict the preparation and composition conditions that have to be kept constant at each scale to prepare the same emulsion. For this purpose, three preparation scales with geometric similarity are used. The parameter N·D^α, as a function of the stirring rate N, the scale (D, impeller diameter) and the exponent α (calculated empirically from the regression of all the experiments at the three scales), is defined as the scale invariant that needs to be optimized once the dispersed phase of the emulsion, the surfactant concentration and the dispersed-phase addition time are set. As far as we know, no other study has obtained a scale-invariant factor N·D^α for the preparation of highly concentrated emulsions prepared at three different scales, covering all three scales, different addition times and surfactant concentrations. The power-law exponent obtained seems to indicate that the scale-up criterion for this system is the power input per unit volume (P/V).
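
In practice, the invariant gives a direct scale-up rule: keeping N·D^α constant between two geometrically similar scales fixes the stirring rate at the new scale. A minimal numerical sketch follows, with illustrative values for the impeller diameters, rate and exponent (none taken from the study):

```python
# Scale-up rule implied by keeping N * D**alpha constant across
# geometrically similar scales. All numbers below are illustrative.
def stirring_rate_at_scale(n1_rpm: float, d1_m: float, d2_m: float, alpha: float) -> float:
    """N1 * D1**alpha = N2 * D2**alpha  =>  N2 = N1 * (D1 / D2)**alpha."""
    return n1_rpm * (d1_m / d2_m) ** alpha

# Example: lab scale (5 cm impeller, 700 rpm) scaled to a pilot vessel
# (15 cm impeller), with an assumed empirical exponent alpha = 0.8.
n2 = stirring_rate_at_scale(700.0, 0.05, 0.15, 0.8)
print(f"Pilot-scale stirring rate: {n2:.0f} rpm")
```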

Relevance: 100.00%

Publisher:

Abstract:

This study presents mathematical methods for evaluating retail performance with special regard to product sourcing strategies. Forecast accuracy, process lead time, the offshore/local sourcing mix and the up-front/replenishment buying mix are defined as critical success factors in sourcing seasonal products with a fashion content. As success measures, this research focuses on service level, lost sales, product substitute percentage, gross margin, gross margin return on inventory and markdown rate. The accuracy of the demand forecast is found to be a fundamental success factor. Forecast accuracy depends on lead time. Lead times are traditionally long, and buying decisions are made seven to eight months prior to the start of the selling season. Forecast errors cause stockouts and lost sales. Some of the products bought for the selling season will not be sold and have to be marked down and sold at clearance, causing loss of gross margin. The gross margin percentage is not the best tool for evaluating sourcing decisions, and in the context of this study gross margin return on inventory, which combines profitability and asset management, is used. The findings of this research suggest that there are more profitable ways of sourcing products than buying them from low-cost offshore sources. Mixing up-front and in-season replenishment deliveries, especially when point-of-sale information is used to improve forecast accuracy, results in better retail performance. Quick Response and Vendor Managed Inventory strategies yield better results than traditional up-front buying from offshore, even if local purchase prices are higher. Increasing the number of selling seasons, slight over-buying for the season in order to
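
For reference, gross margin return on inventory relates the margin earned to the money tied up in stock, which is why the study prefers it to the plain gross margin percentage. A small worked sketch with invented figures:

```python
# Gross margin return on inventory (GMROI): margin earned per unit of
# money tied up in inventory. All figures below are invented.
def gmroi(gross_margin: float, average_inventory_cost: float) -> float:
    return gross_margin / average_inventory_cost

season_sales = 500_000.0        # revenue for the selling season
cost_of_goods_sold = 320_000.0
average_inventory = 120_000.0   # average inventory valued at cost

margin = season_sales - cost_of_goods_sold
print(f"Gross margin %: {margin / season_sales:.1%}")   # profitability alone
print(f"GMROI: {gmroi(margin, average_inventory):.2f}") # profitability + assets
```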

Relevance: 100.00%

Publisher:

Abstract:

Joints intended for welding frequently show variations in geometry and position, for which it is unfortunately not possible to apply a single set of operating parameters that ensures constant quality. The cause of this difficulty lies in a number of factors, including inaccurate joint preparation and joint fit-up, tack welds, and thermal distortion of the workpiece. In plasma arc keyhole welding of butt joints, deviations in the gap width may cause weld defects such as an incomplete weld bead, excessive penetration and burn-through. Manual adjustment of welding parameters to compensate for variations in the gap width is very difficult, and unsatisfactory weld quality is often obtained. In this study, a control system for plasma arc keyhole welding was developed and used to study the effects of real-time control of welding parameters on gap tolerance during welding of austenitic stainless steel AISI 304L. The welding tests demonstrated the beneficial effect of real-time control on weld quality. Compared with welding using constant parameters, the maximum tolerable gap width with acceptable weld quality was 47% greater when using real-time controlled parameters, for a plate thickness of 5 mm. In addition, burn-through occurred at significantly larger gap widths when parameters were controlled in real time. Increased gap tolerance enables joints to be prepared and fitted up less accurately, saving time and preparation costs. In addition to the control system, a novel technique for back-face monitoring is described in this study. The test results showed that the technique can be successfully applied to penetration monitoring when welding non-magnetic materials. The results also imply that it is possible to measure the dimensions of the plasma efflux or weld root and to use this information in a feedback control system, thus maintaining the required weld quality.
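
The kind of real-time compensation the study motivates can be pictured as a simple feedback rule: as the measured gap widens past the nominal value, heat input is reduced to avoid excessive penetration and burn-through. The sketch below is schematic only; the parameter mapping and gains are invented and are not the controller developed in the study.

```python
# Schematic gap-width compensation: lower current and raise travel speed
# as the measured gap exceeds the nominal joint geometry (assumed gains).
NOMINAL_GAP_MM = 0.5
BASE_CURRENT_A = 180.0
BASE_SPEED_MM_S = 4.0

def adapt_parameters(gap_mm: float) -> tuple[float, float]:
    """Proportionally reduce heat input as the gap widens."""
    error = gap_mm - NOMINAL_GAP_MM
    current = BASE_CURRENT_A - 25.0 * error  # amps shed per mm of extra gap
    speed = BASE_SPEED_MM_S + 0.6 * error    # travel faster on wide gaps
    return current, speed

for gap in (0.3, 0.5, 1.0, 1.5):             # simulated gap measurements, mm
    i, v = adapt_parameters(gap)
    print(f"gap {gap:.1f} mm -> current {i:.0f} A, speed {v:.1f} mm/s")
```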