869 results for Just-in-time


Relevance: 90.00%

Abstract:

Laser ablation of graphite has been carried out using 1.06 μm radiation from a Q-switched Nd:YAG laser, and the time-of-flight distribution of molecular C2 present in the resultant plasma is investigated as a function of distance from the target and of laser fluence, employing a time-resolved spectroscopic technique. At low laser fluences the intensities of the emission lines from C2 exhibit only a single-peak structure, while beyond a threshold laser fluence the emission from C2 shows a twin-peak distribution in time. The occurrence of the faster velocity component at higher laser fluences is attributed to species generated by recombination processes, while the delayed peak is attributed to the dissociation of higher carbon clusters, resulting in the generation of the C2 molecule. Analysis of the measured data provides a fairly complete picture of the evolution and dynamics of C2 species in the laser-induced plasma from graphite.
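As a rough illustration of how such a twin-peak time-of-flight profile can arise and be located, the sketch below builds a synthetic emission-intensity signal from two shifted Maxwell-Boltzmann-type components (a fast "recombination" peak and a slow "cluster dissociation" peak) and finds the two maxima numerically. All parameter values (target distance d, stream velocities, widths, amplitudes) are invented for illustration, not values from the study.

```python
import math

def tof_component(t, d, v, sigma, amp):
    """Shifted Maxwell-Boltzmann-type time-of-flight profile:
    intensity ~ amp * t**-4 * exp(-((d/t - v)**2) / (2*sigma**2))."""
    return amp * t ** -4 * math.exp(-((d / t - v) ** 2) / (2.0 * sigma ** 2))

def twin_peak_signal(t, d=0.01):
    # Hypothetical fast (recombination) and slow (cluster-dissociation)
    # components; d in metres, velocities in m/s, t in seconds.
    fast = tof_component(t, d, v=8000.0, sigma=1500.0, amp=1.0)
    slow = tof_component(t, d, v=2000.0, sigma=500.0, amp=0.2)
    return fast + slow

def local_maxima(ys):
    """Indices of strict local maxima of a sampled curve."""
    return [i for i in range(1, len(ys) - 1)
            if ys[i - 1] < ys[i] > ys[i + 1]]

ts = [1e-8 * i for i in range(10, 2001)]   # 0.1 to 20 microseconds
ys = [twin_peak_signal(t) for t in ts]
peaks = local_maxima(ys)                   # expect one fast and one slow peak
```

Below the fluence threshold, dropping the fast component reduces the profile to a single peak, mirroring the single-peak structure reported at low fluences.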

Relevance: 90.00%

Abstract:

Sonar signal processing comprises a large number of signal processing algorithms implementing functions such as Target Detection, Localisation, Classification, Tracking, and Parameter Estimation. Current implementations of these functions rely on conventional techniques largely based on Fourier methods, which are primarily meant for stationary signals. Interestingly, the signals received by sonar sensors are often non-stationary, and hence processing methods capable of handling this non-stationarity will fare better than Fourier-transform-based methods. Time-frequency methods (TFMs) are among the best DSP tools for non-stationary signal processing, allowing signals to be analysed in the time and frequency domains simultaneously. However, apart from the STFT, TFMs have been largely limited to academic research because of the complexity of the algorithms and the limitations of computing power. With the availability of fast processors, many applications of TFMs have been reported in speech processing, image processing, and biomedical applications, but not many in sonar processing. A structured effort to fill this lacuna by exploring the potential of TFMs in sonar applications is the net outcome of this thesis. To this end, four TFMs have been explored in detail, viz. the Wavelet Transform, the Fractional Fourier Transform, the Wigner-Ville Distribution, and the Ambiguity Function, and their potential in implementing five major sonar functions has been demonstrated with very promising results. What has been conclusively brought out in this thesis is that there is no "one best TFM" for all applications, but there is "one best TFM" for each application. Accordingly, the TFM has to be adapted and tailored in many ways in order to develop specific algorithms for each application.
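The STFT mentioned above is the simplest time-frequency method: window the signal, transform each frame, and watch the spectrum evolve over time. The minimal sketch below (a direct DFT on non-overlapping Hann-windowed frames, for clarity rather than speed) shows how a non-stationary two-tone signal, which a single Fourier transform would blur together, is resolved frame by frame. The sample rate and tone frequencies are made-up demo values chosen to fall on exact DFT bins.

```python
import cmath
import math

def stft(signal, frame_len):
    """Short-time Fourier transform: non-overlapping Hann-windowed frames,
    each transformed with a direct one-sided DFT (O(N^2) per frame)."""
    window = [0.5 - 0.5 * math.cos(2 * math.pi * n / frame_len)
              for n in range(frame_len)]
    frames = []
    for start in range(0, len(signal) - frame_len + 1, frame_len):
        frame = [signal[start + n] * window[n] for n in range(frame_len)]
        spectrum = [abs(sum(frame[n] * cmath.exp(-2j * math.pi * k * n / frame_len)
                            for n in range(frame_len)))
                    for k in range(frame_len // 2 + 1)]
        frames.append(spectrum)
    return frames

# Non-stationary test signal: 8 Hz for the first second, 32 Hz for the
# second second, sampled at 128 Hz.
fs, frame_len = 128, 32
sig = [math.sin(2 * math.pi * (8 if n < fs else 32) * n / fs)
       for n in range(2 * fs)]
spec = stft(sig, frame_len)
# Dominant frequency bin per frame: jumps from bin 2 (8 Hz) to bin 8 (32 Hz).
dominant = [max(range(len(s)), key=s.__getitem__) for s in spec]
```

A single DFT of the whole signal would show both tones at once, with no indication that one follows the other; the per-frame dominant bins recover that temporal structure.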

Relevance: 90.00%

Abstract:

The automotive industry is responding with modularisation strategies to growing product complexity, driven by increasing customer demands for individualisation and by model policies with frequent new vehicle launches. Manufacturers shift the complexity of materials provisioning through outsourcing to the next supplier tier, the first-tier suppliers, who since the early 1990s have increasingly been integrated into supplier parks in the immediate vicinity of the plant. Typical features of a classical supplier park are: provision of a hall infrastructure with infrastructure services; delivery of parts in the JIS mode (Just-in-Sequence, i.e. synchronisation in exact assembly order); local value creation (pre-assembly, sequencing) by the supplier; contractual commitment of the first-tier suppliers for the duration of a product life cycle; and involvement of a logistics service provider. In some cases, public-sector funding projects are initiated for financing. A scholarly treatment of the topic of supplier parks has so far been lacking. This thesis examines the supplier parks that have emerged in Europe in order to document the advantages and disadvantages of this logistics concept and to identify development trends. From these findings, optimisation approaches are derived and concrete development paths are described for improving the opportunity-risk position of the main actors: vehicle manufacturers, suppliers, and logistics service providers. The thesis is structured in four main parts: a differentiated description of the initial situation and of development trends in the automotive industry, the methodological model, the documentation of the analysis results, and the evaluation of supplier park models. As part of the documentation of the analysis results, four supplier park models are presented in detailed case studies.
To derive the analysis results, a survey of the main actors was conducted using structured questionnaires. Experts were additionally interviewed to identify industry trends and to evaluate the park models relative to one another. Network analysis was used to segment the supplier park landscape, and the relative evaluation of the benefit positions is based on utility value analysis (Nutzwertanalyse). The results of the thesis are: a comprehensive analysis of the supplier park landscape in Europe; a segmentation of the parks into supplier park models; optimisation approaches for improving the win-win situation of the main actors involved; a relative benefit evaluation of the supplier park models; and development paths for classical supplier parks.
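The Nutzwertanalyse (utility value analysis) used for the relative evaluation is, at its core, a weighted scoring of alternatives against criteria. The sketch below is a generic illustration of that scheme; the criteria, weights, scores, and model names are invented, not those of the thesis.

```python
def utility_value(scores, weights):
    """Weighted sum of criterion scores; weights are assumed to sum to 1."""
    return sum(weights[c] * s for c, s in scores.items())

# Hypothetical criterion weights and 1-10 scores for three supplier park models.
weights = {"logistics_cost": 0.4, "flexibility": 0.35, "investment": 0.25}
models = {
    "classical_park":  {"logistics_cost": 8, "flexibility": 5, "investment": 4},
    "integrated_park": {"logistics_cost": 7, "flexibility": 8, "investment": 5},
    "virtual_park":    {"logistics_cost": 5, "flexibility": 7, "investment": 9},
}
# Rank the park models by total utility value, best first.
ranking = sorted(models, key=lambda m: utility_value(models[m], weights),
                 reverse=True)
```

The method's value lies less in the arithmetic than in forcing the actors to make their criteria and weightings explicit before comparing park models.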

Relevance: 90.00%

Abstract:

The interaction of short intense laser pulses with atoms and molecules produces a multitude of highly nonlinear processes requiring a non-perturbative treatment. A detailed study of these processes by numerically solving the time-dependent Schrödinger equation becomes a daunting task when the number of degrees of freedom is large, and the coupling between the electronic and nuclear degrees of freedom further aggravates the computational problems. In the present work we show that the time-dependent Hartree (TDH) approximation, which neglects correlation effects, gives an unreliable description of the system dynamics both in the absence and in the presence of an external field. A theoretical framework is required that treats the electrons and nuclei on an equal footing and fully quantum mechanically. To address this issue we discuss two approaches, namely multicomponent density functional theory (MCDFT) and the multiconfiguration time-dependent Hartree (MCTDH) method, that go beyond the TDH approximation and describe the correlated electron-nuclear dynamics accurately. In the MCDFT framework, where the time-dependent electronic and nuclear densities are the basic variables, we discuss an algorithm to calculate the exact Kohn-Sham (KS) potentials for small model systems. By simulating the photodissociation process in a model hydrogen molecular ion, we show that the exact KS potentials contain all the many-body effects and give insight into the system dynamics. In the MCTDH approach, the wave function is expanded as a sum of products of single-particle functions (SPFs). The MCTDH method is able to describe the electron-nuclear correlation effects, as the SPFs and the expansion coefficients evolve in time and give an accurate description of the system dynamics.
We show that the MCTDH method is suitable to study a variety of processes such as the fragmentation of molecules, high-order harmonic generation, the two-center interference effect, and the lochfrass effect. We discuss these phenomena in a model hydrogen molecular ion and a model hydrogen molecule. Inclusion of absorbing boundaries in the mean-field approximation and its consequences are discussed using the model hydrogen molecular ion. To this end, two types of calculations are considered: (i) a variational approach with a complex absorbing potential included in the full many-particle Hamiltonian and (ii) an approach in the spirit of time-dependent density functional theory (TDDFT), including complex absorbing potentials in the single-particle equations. It is elucidated that for small grids the TDDFT approach is superior to the variational approach.
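For reference, the MCTDH expansion summarized above is the standard multiconfigurational ansatz in time-dependent single-particle functions, of which the TDH approximation is the single-configuration limit:

```latex
\Psi(q_1,\dots,q_f,t)
  = \sum_{j_1=1}^{n_1}\cdots\sum_{j_f=1}^{n_f}
    A_{j_1 \dots j_f}(t)\,
    \prod_{\kappa=1}^{f} \varphi_{j_\kappa}^{(\kappa)}(q_\kappa,t),
\qquad
\Psi_{\mathrm{TDH}}(q_1,\dots,q_f,t)
  = a(t)\,\prod_{\kappa=1}^{f} \varphi^{(\kappa)}(q_\kappa,t)
```

Because both the coefficients $A_{j_1\dots j_f}(t)$ and the single-particle functions $\varphi_{j_\kappa}^{(\kappa)}$ are time-dependent, a compact basis suffices, and setting all $n_\kappa = 1$ recovers the uncorrelated TDH product discussed above.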

Relevance: 90.00%

Abstract:

Time series regression models are especially suitable in epidemiology for evaluating short-term effects of time-varying exposures on health. The problem is that the potential for confounding in time series regression is very high; thus, it is important that trend and seasonality are properly accounted for. Our paper reviews the statistical models commonly used in time-series regression methods, especially those allowing for serial correlation, which make them potentially useful for selected epidemiological purposes. In particular, we discuss the use of time-series regression for counts using a wide range of Generalised Linear Models (GLMs) as well as Generalised Additive Models (GAMs). In addition, recently raised critical points in the use of statistical software for GAMs are stressed, and reanalyses of time series data on air pollution and health were performed in order to update already published results. Applications are illustrated through an example on the relationship between asthma emergency admissions and photochemical air pollutants.
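As a minimal sketch of the count-regression idea (not the models of the paper), the code below fits a log-linear Poisson regression, log E[y] = b0 + b1*x, by damped Newton steps, using made-up data whose mean follows the model exactly. In practice one would use a statistics package and add trend and seasonal terms to the linear predictor.

```python
import math

def poisson_loglik(b0, b1, xs, ys):
    """Poisson log-likelihood (up to a constant) under a log link."""
    return sum(y * (b0 + b1 * x) - math.exp(b0 + b1 * x)
               for x, y in zip(xs, ys))

def poisson_fit(xs, ys, iters=100):
    """Poisson regression log E[y] = b0 + b1*x via damped Newton steps.
    The log link is canonical, so the log-likelihood is concave."""
    b0, b1 = math.log(sum(ys) / len(ys)), 0.0        # intercept-only start
    for _ in range(iters):
        mu = [math.exp(b0 + b1 * x) for x in xs]
        g0 = sum(y - m for y, m in zip(ys, mu))                 # score
        g1 = sum(x * (y - m) for x, y, m in zip(xs, ys, mu))
        h00 = sum(mu)                                           # information
        h01 = sum(x * m for x, m in zip(xs, mu))
        h11 = sum(x * x * m for x, m in zip(xs, mu))
        det = h00 * h11 - h01 * h01
        d0 = (h11 * g0 - h01 * g1) / det                        # Newton step
        d1 = (h00 * g1 - h01 * g0) / det
        step, base = 1.0, poisson_loglik(b0, b1, xs, ys)
        while poisson_loglik(b0 + step * d0, b1 + step * d1, xs, ys) < base:
            step /= 2                                           # backtrack
        b0, b1 = b0 + step * d0, b1 + step * d1
    return b0, b1

# Synthetic "daily counts" whose mean follows the model exactly, so the
# maximum-likelihood estimate is exactly (0.5, 0.1).
xs = list(range(50))
ys = [math.exp(0.5 + 0.1 * x) for x in xs]
b0, b1 = poisson_fit(xs, ys)
```

Serial correlation and smooth seasonal terms, which are central to the paper, are deliberately left out here; they are what distinguish a proper epidemiological time-series model from this bare GLM core.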

Relevance: 90.00%

Abstract:

A compositional time series is obtained when a compositional data vector is observed at different points in time. Inherently, then, a compositional time series is a multivariate time series with important constraints on the variables observed at any instance in time. Although this type of data frequently occurs in situations of real practical interest, a trawl through the statistical literature reveals that research in the field is very much in its infancy and that many theoretical and empirical issues remain to be addressed. Any appropriate statistical methodology for the analysis of compositional time series must take into account these constraints, which are not allowed for by the usual statistical techniques for analysing multivariate time series. One general approach to analysing compositional time series consists of applying an initial transform to break the positivity and unit-sum constraints, followed by analysis of the transformed time series using multivariate ARIMA models. In this paper we discuss the use of the additive log-ratio, centred log-ratio and isometric log-ratio transforms. We also present results from an empirical study designed to explore how the choice of initial transform affects subsequent multivariate ARIMA modelling, as well as the quality of the forecasts.
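The first of the transforms discussed, the additive log-ratio (alr), is easy to state concretely: divide the first D-1 components by the last and take logarithms, giving an unconstrained real vector that ARIMA machinery can handle; the inverse maps fitted values or forecasts back to the simplex. A minimal sketch, with an arbitrary example composition:

```python
import math

def alr(x):
    """Additive log-ratio transform of a composition (components > 0,
    summing to 1): maps the D-part simplex to R^(D-1)."""
    return [math.log(xi / x[-1]) for xi in x[:-1]]

def alr_inverse(z):
    """Inverse alr: from R^(D-1) back to the simplex."""
    expz = [math.exp(zi) for zi in z] + [1.0]
    total = sum(expz)
    return [e / total for e in expz]

composition = [0.2, 0.3, 0.5]   # e.g. one time point of a 3-part series
z = alr(composition)            # unconstrained 2-vector, ARIMA-friendly
back = alr_inverse(z)           # recovers the original composition
```

The clr and ilr transforms follow the same pattern with different log-ratio contrasts; the empirical question studied in the paper is how this choice affects the subsequent multivariate modelling.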

Relevance: 90.00%

Abstract:

Evolution of compositions in time, space, temperature or other covariates is frequent in practice. For instance, radioactive decay changes the composition of a sample with time: some of the isotopes involved decompose into other isotopes of the sample, thus producing a transfer of mass from some components to others while preserving the total mass present in the system. This evolution is traditionally modelled as a system of ordinary differential equations for the mass of each component. However, this kind of evolution can be decomposed into a compositional change, expressed in terms of simplicial derivatives, and a mass evolution (constant in this example). A first result is that the simplicial system of differential equations is non-linear, despite some subcompositions behaving linearly. The goal is to study the characteristics of such simplicial systems of differential equations, such as linearity and stability. This is performed by extracting the compositional differential equations from the mass equations. Then, simplicial derivatives are expressed in coordinates of the simplex, thus reducing the problem to the standard theory of systems of differential equations, including stability. The characterisation of stability of these non-linear systems relies on the linearisation of the system of differential equations at the stationary point, if any. The eigenvalues of the linearised matrix and the associated behaviour of the orbits are the main tools. For a three-component system, these orbits can be plotted either in coordinates of the simplex or in a ternary diagram. A characterisation of processes with transfer of mass in closed systems in terms of stability is thus concluded. Two examples are presented for illustration; one of them is a radioactive decay.
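As a toy version of the radioactive example, the sketch below follows a closed two-component system in which mass transfers from isotope A to isotope B at rate lambda: the total mass stays constant while the composition drifts towards the stationary point (0, 1). The rate constant and initial masses are invented illustration values.

```python
import math

def decay_state(a0, b0, lam, t):
    """Closed-system decay A -> B: masses at time t (total mass conserved,
    since everything lost by A is gained by B)."""
    a = a0 * math.exp(-lam * t)
    return a, (a0 + b0) - a

def composition(a, b):
    """Closure of the mass vector to the 2-part simplex."""
    total = a + b
    return a / total, b / total

a0, b0, lam = 8.0, 2.0, 0.3
states = [decay_state(a0, b0, lam, t) for t in range(0, 21, 5)]
comps = [composition(a, b) for a, b in states]   # orbit in the simplex
```

Note that although the mass equations are linear, the compositional trajectory comps is not: the share of A decays like e^(-lam*t) / (e^(-lam*t) + const), which is exactly the kind of non-linearity in the simplex that the paper analyses via simplicial derivatives.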

Relevance: 90.00%

Abstract:

The work presented in this paper belongs to the power quality knowledge area and deals with voltage sags in power transmission and distribution systems. Propagating throughout the power network, voltage sags can cause numerous problems for domestic and industrial loads, at considerable financial cost. To impose penalties on the responsible party and to improve monitoring and mitigation strategies, sags must be located in the power network. With this objective, this paper proposes a new method for associating a sag waveform with its origin in transmission and distribution networks. It solves this problem by developing hybrid methods that employ multiway principal component analysis (MPCA) as a dimension reduction tool. MPCA re-expresses sag waveforms in a new subspace using just a few scores. We train several well-known classifiers with these scores and exploit them for the classification of future sags. The capabilities of the proposed method for dimension reduction and classification are examined using real data gathered from three substations in Catalonia, Spain. The classification rates obtained confirm the effectiveness of the developed hybrid methods as new tools for sag classification.
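MPCA unfolds a multiway data array before applying ordinary PCA, so the core dimension-reduction step is the familiar one: project each observation onto the leading principal components and keep only those few scores. The sketch below illustrates just that step, extracting the first principal component of a small two-dimensional dataset by power iteration on the covariance matrix; the data values are invented for illustration.

```python
import math

def covariance(points):
    """2x2 covariance matrix of a list of (x, y) points."""
    n = len(points)
    mx = sum(p[0] for p in points) / n
    my = sum(p[1] for p in points) / n
    cxx = sum((p[0] - mx) ** 2 for p in points) / n
    cyy = sum((p[1] - my) ** 2 for p in points) / n
    cxy = sum((p[0] - mx) * (p[1] - my) for p in points) / n
    return [[cxx, cxy], [cxy, cyy]]

def first_pc(cov, iters=100):
    """Dominant eigenvector of a symmetric 2x2 matrix via power iteration."""
    v = (1.0, 0.0)
    for _ in range(iters):
        w = (cov[0][0] * v[0] + cov[0][1] * v[1],
             cov[1][0] * v[0] + cov[1][1] * v[1])
        norm = math.hypot(*w)
        v = (w[0] / norm, w[1] / norm)
    return v

# Points spread mainly along the direction (3, 1), with a small
# orthogonal perturbation.
d = (3 / math.sqrt(10), 1 / math.sqrt(10))
o = (-1 / math.sqrt(10), 3 / math.sqrt(10))
ts = [-2, -1, 0, 1, 2]
ss = [0.1, -0.1, 0.0, 0.1, -0.1]
pts = [(t * d[0] + s * o[0], t * d[1] + s * o[1]) for t, s in zip(ts, ss)]
pc = first_pc(covariance(pts))
scores = [p[0] * pc[0] + p[1] * pc[1] for p in pts]   # 1-D representation
```

In the paper's setting the "points" are unfolded sag waveforms with hundreds of samples each; the few retained scores per waveform are what the downstream classifiers are trained on.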

Relevance: 90.00%

Abstract:

Introduction: Autism is a developmental disorder characterised by impairment in social interaction and language skills, together with rituals and stereotypies. With no curative treatments, alternative therapies are currently being sought. A growing body of scientific literature on animal-assisted therapies has shown improvement in autistic patients with equine-assisted therapy. Objective: To conduct a systematic review of the literature to evaluate the effectiveness of equine-assisted therapy for social and language skills in autistic children. Methodology: Systematic review of the literature, covering articles obtained from databases and meta-search engines that provided evidence on equine-assisted therapy in autistic children. Types of articles consulted: systematic reviews, meta-analyses, and clinical trials, published up to 2013, in English and Spanish. MeSH and EMTREE terms were used. Results: Four articles met the inclusion and exclusion criteria. The articles were analysed individually; a meta-analysis could not be performed because of methodological differences between the studies. In total, 85 subjects were evaluated in these studies. Equine-assisted therapy in autistic children showed improvement in social skills and in pre-verbal language skills. Discussion: Equine-assisted therapy is promising in the management of autistic children; the articles consistently show improvements in social and language skills. The type of patient, the therapy regimen, and the sustainability of the improvements must be considered. Conclusions: New studies with greater methodological rigour are needed to strengthen the evidence on equine-assisted therapy in children with autism and thus allow recommendations with an adequate level of evidence.

Relevance: 90.00%

Abstract:

To understand the influence of different levels of leverage on the growth of Colombian firms, the following question must be answered: what determines a firm's choice of capital structure? Quantile regression makes it possible to examine the whole distribution of firms, rather than only a measure of the central tendency of the capital structure distribution. In this way, the relative importance of the different explanatory variables can be evaluated at different points of the distribution of firm leverage, which is why this approach is used here. A panel data (also called longitudinal data) regression with random effects is also used to compare results, taking into account that the data vary not only across observations but also over time. Applying quantile regression thus allows a deeper look at the choice of leverage level, since it discriminates the effect of the variables between highly leveraged and lowly leveraged firms.
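Quantile regression rests on the pinball (check) loss: for a chosen quantile level q, losses above and below the prediction are weighted q and 1-q, so the minimising constant is the empirical q-quantile rather than the mean. A minimal illustration with made-up "leverage" observations:

```python
def pinball_loss(c, ys, q):
    """Check loss of predicting the constant c at quantile level q:
    under-predictions cost q per unit, over-predictions cost 1-q."""
    return sum(q * (y - c) if y >= c else (1 - q) * (c - y) for y in ys)

ys = list(range(1, 11))   # toy observations 1..10
q = 0.25
# The constant minimising the check loss is the empirical 0.25-quantile.
best = min(ys, key=lambda c: pinball_loss(c, ys, q))
```

In full quantile regression the constant c is replaced by a linear predictor and the same loss is minimised over the coefficients, once per quantile level, which is what lets the method trace out different effects for highly and lowly leveraged firms.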

Relevance: 90.00%

Abstract:

ABSTRACT: Knowledge has always existed, even in a latent state, conditioned somewhere and merely awaiting a means (an opportunity) to manifest itself. Knowledge is doubly a phenomenon of consciousness: because it proceeds from consciousness at a given moment of its life and history, and because it ends only there, perfecting and enriching it. Knowledge is thus in constant change. Only relatively recently did people begin to speak of Knowledge Management, and at the time it was closely associated with Information Technologies as a means of collecting, processing, and storing ever larger quantities of information. Information Technologies have for some years now played an extremely important role in organisations: they were initially adopted to automate the operational processes that support organisations' daily activities, and in recent times they have evolved rapidly within organisations. All knowledge, even the least relevant to a particular business area, is fundamental to supporting decision making. To achieve better performance and surpass the goals they initially set themselves, organisations tend to equip themselves with more and better Information Systems, and to use the various methodologies and technologies available today. Consequently, in recent years many organisations have shown a crucial need to integrate all of their information, which is dispersed across their various departments. For top managers (but also other employees) to have pertinent, truthful, and reliable information about the business of the organisation they represent available in useful time, they need access to good Information Technology systems, so that they can act more effectively and efficiently in decision making, having extracted the maximum possible benefit from the information, and thus achieve better levels of organisational success. Business Intelligence systems, and the Information Technologies associated with them, likewise use the data existing in organisations to provide relevant information for decision making. But to reach such satisfactory levels, organisations need human resources, for how can they be competitive without qualified workers? Hence the need for organisations to recruit what are nowadays called "Knowledge Workers": individuals equipped to interpret information within a specific domain. They detect problems and identify alternatives; with their knowledge and discernment they work to solve those problems, helping considerably the organisations they represent. And, using Knowledge Engineering methodologies and technologies such as modelling, they create and manage a history of knowledge, including tacit knowledge, about the organisation's various business areas, which can be made explicit in abstract models that can easily be understood and interpreted by other workers with equivalent levels of competence. (Luís Miguel Borges, Gestão e Trabalhadores do Conhecimento em Tecnologias da Informação (UML), ULHT – ECATI)

Relevance: 90.00%

Abstract:

This paper discusses a study to investigate immediate recall of visual stimuli presented simultaneously or sequentially in time.

Relevance: 90.00%

Abstract:

Climate model simulations consistently show that, in response to greenhouse gas forcing, surface temperatures over land increase more rapidly than over sea. The enhanced warming over land is not simply a transient effect, since it is also present in equilibrium conditions. We examine 20 models from the IPCC AR4 database. The global land/sea warming ratio varies in the range 1.36–1.84, independent of the global mean temperature change. In the presence of increasing radiative forcing, the warming ratio for a single model is fairly constant in time, implying that the land/sea temperature difference increases with time. The warming ratio varies with latitude, with a minimum in equatorial latitudes and maxima in the subtropics. A simple explanation for these findings is provided, and comparisons are made with observations. For the low-latitude (40°S–40°N) mean, the models suggest a warming ratio of 1.51 ± 0.13, while recent observations suggest a ratio of 1.54 ± 0.09.
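The land/sea warming ratio quoted above is simply the ratio of the two warming trends. The sketch below computes it from synthetic annual-mean temperature-anomaly series via least-squares slopes; the trend values are invented, chosen to give a ratio of 1.5 (close to the low-latitude figure in the abstract), not taken from any model.

```python
def ls_slope(ys):
    """Least-squares slope of ys against time indices 0..n-1."""
    n = len(ys)
    xbar = (n - 1) / 2
    ybar = sum(ys) / n
    num = sum((i - xbar) * (y - ybar) for i, y in enumerate(ys))
    den = sum((i - xbar) ** 2 for i in range(n))
    return num / den

years = range(100)
land = [0.030 * t for t in years]   # hypothetical land warming, K per year
sea = [0.020 * t for t in years]    # hypothetical sea warming, K per year
ratio = ls_slope(land) / ls_slope(sea)
```

With noisy observational series, the uncertainty of this ratio (the ± figures in the abstract) would additionally require error propagation from the two slope estimates.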

Relevance: 90.00%

Abstract:

A high-resolution record of sea-level change spanning the past 1000 years is derived from foraminiferal and chronological analyses of a 2 m thick salt-marsh peat sequence at Chezzetcook, Nova Scotia, Canada. Former mean tide level positions are reconstructed with a precision of ±0.055 m using a transfer function derived from distributions of modern salt-marsh foraminifera. Our age model for the core section older than 300 years is based on 19 AMS C-14 ages and takes into account the individual probability distributions of calibrated radiocarbon ages. The past 300 years is dated by pollen and the isotopes Pb-206, Pb-207, Pb-210, Cs-137 and Am-241. Between AD 1000 and AD 1800, relative sea level rose at a mean rate of 17 cm per century. Apparent pre-industrial rises of sea level dated at AD 1500-1550 and AD 1700-1800 cannot be clearly distinguished when radiocarbon age errors are taken into account; furthermore, they may be an artefact of fluctuations in atmospheric C-14 production. In the 19th century sea level rose at a mean rate of 1.6 mm/yr. Between AD 1900 and AD 1920, sea-level rise accelerated to the modern mean rate of 3.2 mm/yr. This acceleration corresponds in time with global temperature rise and may therefore be associated with recent global warming. (c) 2005 Elsevier Ltd. All rights reserved.

Relevance: 90.00%

Abstract:

Phytoextraction, the use of plants to extract heavy metals from contaminated soils, could be an interesting alternative to conventional remediation technologies. However, calcareous soils with relatively high total metal contents are difficult to phytoremediate due to low soluble metal concentrations. Soil amendments such as ethylene diamine tetraacetate (EDTA) have been suggested to increase heavy metal bioavailability and uptake in aboveground plant parts. The strong persistence of EDTA and the risks of leaching of potentially toxic metals and essential nutrients have led to research on easily biodegradable soil amendments such as citric acid. In our research, EDTA is regarded as a scientific benchmark with which degradable alternatives are compared for enhanced phytoextraction purposes. The effects of increasing doses of EDTA (0.1, 1, 10 mmol kg(-1) dry soil) and citric acid (0.01, 0.05, 0.25, 0.442, 0.5 mol kg(-1) dry soil) on bioavailable fractions of Cu, Zn, Cd, and Pb were assessed in one part of our study, and the results are presented in this article. The evolution of labile soil fractions of heavy metals over time was evaluated using water-saturated paste extraction (similar to the soluble fraction), extraction with 1 M NH4OAc at pH 7 (similar to the exchangeable fraction), and extraction with 0.5 M NH4OAc + 0.5 M HOAc + 0.02 M EDTA at pH 4.65 (similar to the potentially bioavailable fraction). Both citric acid and EDTA produced a rapid initial increase in labile heavy metal fractions. Metal mobilization remained constant in time for soils treated with EDTA, but a decline in the labile metal fractions over time was noted for soils treated with citric acid. The half-life of heavy metal mobilization by citric acid varied between 1.5 and 5.7 d. In a following article, the effect of heavy metal mobilization on uptake by Helianthus annuus will be presented.
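A mobilization half-life like those quoted above (1.5 to 5.7 d) follows from fitting first-order (exponential) decay to the declining labile fraction. Assuming just two measurements of a labile concentration, the rate constant and half-life can be sketched as below; the concentrations and times are invented illustration values.

```python
import math

def half_life(c1, t1, c2, t2):
    """Half-life from two concentrations, assuming first-order decay
    c(t) = c0 * exp(-k * t)."""
    k = math.log(c1 / c2) / (t2 - t1)   # rate constant, per day if t in days
    return math.log(2) / k

# Hypothetical labile-metal measurements (mg/kg) on days 1 and 4:
# the concentration halves over 3 days, so the half-life is 3 d.
t_half = half_life(12.0, 1.0, 6.0, 4.0)
```

With a full time series one would instead regress log-concentration on time and take the half-life from the fitted slope, which averages out measurement noise.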