951 results for JIT (Just In Time)


Relevance:

100.00%

Publisher:

Abstract:

Complexity in time series is an intriguing feature of living dynamical systems, with potential use for identification of system state. Although various methods have been proposed for measuring physiologic complexity, uncorrelated time series are often assigned high values of complexity, erroneously classifying them as complex physiological signals. Here, we propose and discuss a method for complex system analysis based on a generalized statistical formalism and surrogate time series. Sample entropy (SampEn) was rewritten, inspired by Tsallis generalized entropy, as a function of the parameter q (qSampEn). qSDiff curves were calculated, which consist of the differences between the qSampEn of the original and surrogate series. We evaluated qSDiff for 125 real heart rate variability (HRV) dynamics, divided into groups of 70 healthy, 44 congestive heart failure (CHF), and 11 atrial fibrillation (AF) subjects, and for simulated series of stochastic and chaotic processes. The evaluations showed that, for nonperiodic signals, qSDiff curves have a maximum point (qSDiff(max)) for q ≠ 1. Values of q where the maximum point occurs and where qSDiff is zero were also evaluated. Only the qSDiff(max) values were capable of distinguishing the HRV groups (p-values 5.10 × 10^-3, 1.11 × 10^-7, and 5.50 × 10^-7 for healthy vs. CHF, healthy vs. AF, and CHF vs. AF, respectively), consistent with the concept of physiologic complexity, and suggest a potential use for chaotic system analysis. (C) 2012 American Institute of Physics. [http://dx.doi.org/10.1063/1.4758815]
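As a concrete sketch of the quantities involved, standard sample entropy can be written in a few lines of Python; the qSampEn variant below simply swaps the natural logarithm for the Tsallis q-logarithm, which is our assumption about how the generalization is built, not the paper's exact formula.

```python
import math
import numpy as np

def sample_entropy(x, m=2, r_factor=0.2):
    """Standard sample entropy (SampEn): -ln(A/B), where B counts pairs of
    length-m templates within tolerance r (Chebyshev distance, excluding
    self-matches) and A does the same for length m+1."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    r = r_factor * x.std()

    def match_count(length):
        # All overlapping templates of the given length
        templates = np.array([x[i:i + length] for i in range(n - length)])
        count = 0
        for i in range(len(templates) - 1):
            # Chebyshev distance to every later template (no self-matches)
            d = np.max(np.abs(templates[i + 1:] - templates[i]), axis=1)
            count += int(np.sum(d < r))
        return count

    return -math.log(match_count(m + 1) / match_count(m))

def q_sample_entropy(x, q, m=2, r_factor=0.2):
    """Hypothetical qSampEn sketch: replace ln with the Tsallis q-logarithm
    ln_q(z) = (z**(1 - q) - 1) / (1 - q); q -> 1 recovers ordinary SampEn."""
    ratio = math.exp(-sample_entropy(x, m, r_factor))  # the ratio A/B
    if q == 1.0:
        return -math.log(ratio)
    return -(ratio ** (1 - q) - 1) / (1 - q)
```

Uncorrelated noise yields a higher SampEn than a regular oscillation, which is exactly the artefact the surrogate-based qSDiff curves are designed to expose.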

Relevance:

100.00%

Publisher:

Abstract:

Numerous studies recently completed in Europe and overseas have focused attention on the problems caused by freight transport in urban areas and have helped identify possible solutions (city logistics). Urban areas should ideally be places to live and to carry out economic, social and recreational activities. Their suitability for these purposes can be compromised, among other things, by growing freight traffic, which is carried mainly by road because of the short distances to be covered and existing infrastructural shortcomings. Commercial vehicles, except for the latest generation, negatively affect the quality of the urban environment, generating air and noise pollution. The "just in time" policy, which eliminates warehouses for storing goods, increases commercial movements. This thesis deals with some logistical aspects of regulating parking and access for freight vehicles, capable of making goods distribution more efficient, mitigating the problems induced by traffic and thus safeguarding the quality of life in city centres.

Relevance:

100.00%

Publisher:

Abstract:

Among the experimental methods commonly used to characterize the behaviour of a full-scale system, dynamic tests are the most complete and efficient procedures. A dynamic test is an experimental process that defines a set of characteristic parameters of the dynamic behaviour of the system, such as the natural frequencies of the structure, the mode shapes and the associated modal damping values. An assessment of these modal characteristics can be used both to verify the theoretical assumptions of the design and to monitor the performance of the structural system during its operational use. The thesis is structured in the following chapters. The first, introductory chapter recalls some basic notions of structural dynamics, focusing the discussion on systems with multiple degrees of freedom (MDOF), which can represent a generic real system under study when it is excited with a harmonic force or in free vibration. The second chapter is entirely centred on the dynamic identification of a structure subjected to an experimental test in forced vibration. It first describes the construction of the FRF through the classical FFT of the recorded signal. A different method, also in the frequency domain, is subsequently introduced; it allows the FRF to be computed accurately using the geometric characteristics of the ellipse that represents the direct input-output comparison. The two methods are compared, and the attention is then focused on some advantages of the proposed methodology. The third chapter focuses on the study of real structures subjected to experimental tests in which the force is not known, as in an ambient or impact test. In this analysis we decided to use the CWT, which allows a simultaneous investigation in the time and frequency domains of a generic signal x(t). The CWT is first introduced to process free oscillations, with excellent results in terms of frequencies, damping and vibration modes.
The application to the case of ambient vibrations yields accurate modal parameters of the system, although some important observations should be made about the damping. The fourth chapter is still devoted to post-processing the data acquired after a vibration test, this time through the application of the discrete wavelet transform (DWT). In the first part the results obtained by the DWT are compared with those obtained by the application of the CWT. Particular attention is given to the use of the DWT as a tool for filtering the recorded signal; indeed, in the case of ambient vibrations the signals are often affected by a significant level of noise. The fifth chapter focuses on another important aspect of the identification process: model updating. In this chapter, starting from the modal parameters obtained from environmental vibration tests performed on the Humber Bridge in England by the University of Porto in 2008 and by the University of Sheffield, an FE model of the bridge is built, in order to establish which type of model is able to capture more accurately the real dynamic behaviour of the bridge. The sixth chapter outlines the conclusions of the presented research. They concern the application of a frequency-domain method for evaluating the modal parameters of a structure and its advantages, the advantages of a procedure based on wavelet transforms in the identification process for tests with unknown input, and finally the problem of 3D modelling of systems with many degrees of freedom and with different types of uncertainty.
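For the free-vibration case recalled in the introductory chapter, the basic modal quantities can be sketched in a few lines. This is a generic single-degree-of-freedom illustration (a hypothetical helper using the FFT peak and the logarithmic decrement), not the thesis's ellipse-based or wavelet procedures.

```python
import numpy as np

def free_decay_modal_params(x, fs):
    """Estimate natural frequency and damping ratio of a lightly damped
    SDOF free-decay response sampled at fs Hz (illustrative sketch only).

    Frequency comes from the FFT magnitude peak; damping from the
    logarithmic decrement of successive positive peaks."""
    n = len(x)
    # Dominant frequency from the one-sided FFT magnitude peak (skip DC)
    spec = np.abs(np.fft.rfft(x))
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    f_n = freqs[np.argmax(spec[1:]) + 1]

    # Positive peaks: samples larger than both neighbours
    idx = [i for i in range(1, n - 1)
           if x[i] > x[i - 1] and x[i] > x[i + 1] and x[i] > 0]
    # Logarithmic decrement averaged over k successive periods
    k = len(idx) - 1
    delta = np.log(x[idx[0]] / x[idx[-1]]) / k
    zeta = delta / np.sqrt(4 * np.pi ** 2 + delta ** 2)
    return f_n, zeta
```

On a synthetic 5 Hz decay with 2% damping, both estimates come back close to the true values, which is the kind of check the thesis performs (with the CWT) before moving to ambient data.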

Relevance:

100.00%

Publisher:

Abstract:

Over the last 60 years, computers and software have enabled incredible advancements in every field. Nowadays, however, these systems are so complicated that it is difficult, if not impossible, to understand whether they meet some requirement or are able to exhibit some desired behaviour or property. This dissertation introduces a Just-In-Time (JIT) a posteriori approach that performs the conformance check so as to identify any deviation from the desired behaviour as soon as possible, and possibly to apply corrections. The declarative framework that implements our approach, developed entirely on the promising open-source forward-chaining Production Rule System (PRS) named Drools, consists of three components: (1) a monitoring module based on a novel, efficient implementation of Event Calculus (EC); (2) a general-purpose hybrid reasoning module (the first of its kind) merging temporal, semantic, fuzzy and rule-based reasoning; (3) a logic formalism based on the concept of expectations, introducing Event-Condition-Expectation rules (ECE-rules) to assess the global conformance of a system. The framework is also accompanied by an optional module that provides Probabilistic Inductive Logic Programming (PILP). By shifting the conformance check from after execution to just in time, this approach combines the advantages of many a posteriori and a priori methods proposed in the literature. Quite remarkably, if the corrective actions are explicitly given, the reactive nature of this methodology allows any deviation from the desired behaviour to be reconciled as soon as it is detected. In conclusion, the proposed methodology advances the state of conformance checking, helping to fill the gap between humans and increasingly complex technology.
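The intuition behind ECE-rules can be sketched in plain Python. This is an illustrative toy, not the Drools-based framework: the class names and the deadline mechanics are our assumptions; the only idea taken from the abstract is that an event, under a condition, raises an expectation about the future, checked just in time as each new event arrives.

```python
from dataclasses import dataclass

@dataclass
class Expectation:
    expected_event: str   # event type that would fulfil the expectation
    deadline: float       # latest time at which fulfilment still counts

class ECEMonitor:
    """Toy Event-Condition-Expectation monitor (hypothetical sketch)."""

    def __init__(self):
        self.rules = []       # (event_type, condition, expectation factory)
        self.pending = []     # expectations not yet fulfilled
        self.violations = []  # expectations that expired unfulfilled

    def add_rule(self, event_type, condition, factory):
        self.rules.append((event_type, condition, factory))

    def observe(self, event_type, payload, now):
        # Overdue expectations are conformance violations, detected as the
        # stream is processed rather than after the whole execution.
        still_open = []
        for e in self.pending:
            if now > e.deadline:
                self.violations.append(e)
            elif e.expected_event == event_type:
                pass  # fulfilled in time: drop it
            else:
                still_open.append(e)
        self.pending = still_open
        # Fire rules whose event and condition match: raise expectations.
        for etype, cond, factory in self.rules:
            if etype == event_type and cond(payload):
                self.pending.append(factory(payload, now))
```

For example, a rule "when an order with a positive amount is placed, expect a payment within 30 time units" is one `add_rule` call; a payment arriving in time silently discharges the expectation, while a missing payment surfaces as a violation at the first later event.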

Relevance:

100.00%

Publisher:

Abstract:

Background: Increasing attention is being paid to improving undergraduate science, technology, engineering, and mathematics (STEM) education through increased adoption of research-based instructional strategies (RBIS), but high-quality measures of faculty instructional practice do not exist to monitor progress. Purpose/Hypothesis: The measure of how well an implemented intervention follows the original is called fidelity of implementation. This framework was used to address the research questions: What is the fidelity of implementation of selected RBIS in engineering science courses? That is, how closely does engineering science classroom practice reflect the intentions of the original developers? Do the critical components that characterize an RBIS discriminate between engineering science faculty members who claimed use of the RBIS and those who did not? Design/Method: A survey of 387 U.S. faculty teaching engineering science courses (e.g., statics, circuits, thermodynamics) included questions about class time spent on 16 critical components and the use of 11 corresponding RBIS. Fidelity was quantified as the percentage of RBIS users who also spent time on the corresponding critical components. Discrimination between users and nonusers was tested using chi-square. Results: Overall fidelity of the 11 RBIS ranged from 11% to 80% of users spending time on all required components. Fidelity was highest for RBIS with one required component: case-based teaching, just-in-time teaching, and inquiry learning. Thirteen of the 16 critical components discriminated between users and nonusers for all RBIS to which they were mapped. Conclusions: Results were consistent with the initial mapping of critical components to RBIS. Fidelity of implementation is a potentially useful framework for future work in STEM undergraduate education.
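The two quantitative steps, fidelity as a percentage of users and the chi-square discrimination test, can be sketched as follows. All counts here are hypothetical illustrations, not data from the study.

```python
def fidelity(users_with_all_components, total_users):
    """Fidelity as defined in the study: the share (in %) of RBIS users
    who also reported class time on every required critical component."""
    return 100.0 * users_with_all_components / total_users

def chi_square_2x2(a, b, c, d):
    """Pearson chi-square statistic for a 2x2 contingency table
    [[a, b], [c, d]], e.g. RBIS users vs. nonusers (rows) against time
    spent vs. not spent on a critical component (columns)."""
    n = a + b + c + d
    # Expected counts under independence: row total * column total / n
    expected = [(a + b) * (a + c) / n, (a + b) * (b + d) / n,
                (c + d) * (a + c) / n, (c + d) * (b + d) / n]
    return sum((o - e) ** 2 / e for o, e in zip([a, b, c, d], expected))
```

A component discriminates when the statistic exceeds the critical value for 1 degree of freedom (3.84 at the 5% level), which is the form of test the survey analysis applies to each component-RBIS pair.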

Relevance:

100.00%

Publisher:

Abstract:

Back-in-time debuggers are extremely useful tools for identifying the causes of bugs, as they allow us to inspect the past states of objects no longer present in the current execution stack. Unfortunately, the "omniscient" approaches that try to remember all previous states are impractical, because they either consume too much space or are far too slow. Several approaches rely on heuristics to limit these penalties, but they ultimately end up throwing out too much relevant information. In this paper we propose a practical approach to back-in-time debugging that attempts to keep track of only the relevant past data. In contrast to other approaches, we keep object history information together with the regular objects in the application memory. Although seemingly counter-intuitive, this approach has the effect that past data not reachable from current application objects (and hence no longer relevant) is automatically garbage collected. In this paper we describe the technical details of our approach, and we present benchmarks demonstrating that memory consumption stays within practical bounds. Furthermore, since our approach works at the virtual machine level, the performance penalty is significantly lower than with other approaches.
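The core idea, storing an object's history inside the object itself so that ordinary garbage collection discards irrelevant history along with the object, can be sketched at the language level. The paper's approach works at the VM level; this Python class is only an illustration of the reachability argument.

```python
class Tracked:
    """Object that records its own past field values alongside itself.

    Illustrative sketch: because the history lives *inside* the object,
    once the object becomes unreachable the garbage collector reclaims
    the history too; there is no separate, unbounded trace store."""

    def __init__(self, value):
        self._history = []   # past values, kept with the object
        self._value = value

    @property
    def value(self):
        return self._value

    @value.setter
    def value(self, new):
        self._history.append(self._value)  # remember the previous state
        self._value = new

    def past_values(self):
        # Past states of this object, oldest first
        return list(self._history)
```

When the last reference to a `Tracked` instance is dropped, its `_history` list becomes garbage with it, mirroring how unreachable past data vanishes automatically in the VM-level design.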

Relevance:

100.00%

Publisher:

Abstract:

Conventional debugging tools present developers with means to explore the run-time context in which an error has occurred. In many cases this is enough to help the developer discover the faulty source code and correct it. Rather often, however, errors occur due to code that executed in the past, leaving certain objects in an inconsistent state. The actual run-time error only occurs when these inconsistent objects are used later in the program. So-called back-in-time debuggers help developers step back through earlier states of the program and explore execution contexts not available to conventional debuggers. Nevertheless, even back-in-time debuggers do not help answer the question, "Where did this object come from?" The Object-Flow Virtual Machine, which we have proposed in previous work, tracks the flow of objects to answer precisely such questions, but this VM does not provide dedicated debugging support to explore faulty programs. In this paper we present a novel debugger, called Compass, to navigate between conventional run-time stack-oriented control-flow views and object flows. Compass enables a developer to effectively navigate from an object contributing to an error back in time through all the code that has touched the object. We present the design and implementation of Compass, and we demonstrate how flow-centric, back-in-time debugging can be used to effectively locate the source of hard-to-find bugs.
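A flow-centric trace of the kind Compass navigates can be sketched as follows. This is an illustrative toy at the library level, not the Object-Flow VM, and all names are hypothetical; the point is only that each transfer of an object appends to a per-object chain that a debugger can later walk backwards.

```python
class FlowTracked:
    """Minimal sketch of flow-centric tracking: every transfer of the
    object records the location it passed through, so a debugger can
    later walk back through all the code that touched it."""

    def __init__(self, payload, origin):
        self.payload = payload
        self.flow = [origin]   # chain of locations, oldest first

    def transfer(self, location):
        self.flow.append(location)
        return self            # allow chained transfers

def where_did_this_come_from(obj):
    # Walk the recorded flow backwards, newest location first.
    return list(reversed(obj.flow))
```

Starting from an object that contributed to an error, the reversed chain answers "where did this object come from?" one hop at a time, which is the navigation Compass provides on top of the VM-recorded flows.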

Relevance:

100.00%

Publisher:

Abstract:

Within in-house logistics, the order-picking process, particularly with regard to just-in-time deliveries and questions of product liability, represents a central building block of a company's material and information flow. The choice of the picking system is decisive for optimizing the labour- and time-intensive picking operations and thus serves to increase performance while simultaneously reducing the error rate.

Relevance:

100.00%

Publisher:

Abstract:

BACKGROUND Trials assessing the benefit of immediate androgen-deprivation therapy (ADT) for treating prostate cancer (PCa) have often done so based on differences in detectable prostate-specific antigen (PSA) relapse or metastatic disease rates at a specific time after randomization. OBJECTIVE Based on the long-term results of European Organization for Research and Treatment of Cancer (EORTC) trial 30891, we questioned whether differences in time to progression predict survival differences. DESIGN, SETTING, AND PARTICIPANTS EORTC trial 30891 compared immediate ADT (n=492; orchiectomy or a luteinizing hormone-releasing hormone analogue) with deferred ADT (n=493) initiated upon symptomatic disease progression or life-threatening complications in randomly assigned T0-4 N0-2 M0 PCa patients. OUTCOME MEASUREMENTS AND STATISTICAL ANALYSIS Time to first objective progression (documented metastases or ureteric obstruction, not PSA rise) and time to objective castration-resistant progressive disease were compared, as well as PCa mortality and overall survival. RESULTS AND LIMITATIONS After a median of 12.8 yr, 769 of the 985 patients had died (78%), 269 of PCa (27%). For patients receiving deferred ADT, the overall treatment time was 31% of that for patients on immediate ADT. Deferred ADT was significantly worse than immediate ADT for time to first objective disease progression (p<0.0001; 10-yr progression rates 42% vs 30%). However, time to objective castration-resistant disease after deferred ADT did not differ significantly (p=0.42) from that after immediate ADT. In addition, PCa mortality did not differ significantly, except in patients with aggressive PCa resulting in death within 3-5 yr after diagnosis. Deferred ADT was inferior to immediate ADT in terms of overall survival (hazard ratio: 1.21; 95% confidence interval, 1.05-1.39; p [noninferiority] = 0.72, p [difference] = 0.0085).
CONCLUSIONS This study shows that if hormonal manipulation is used at different times during the disease course, differences in time to first disease progression cannot predict differences in disease-specific survival. A deferred ADT policy may substantially reduce the time on treatment, but it is not suitable for patients with rapidly progressing disease.