92 results for Laser damage threshold
Abstract:
The application of lasers to the restoration and cleaning of cultural assets is among the most thriving developments of recent times. Ablative laser systems can clean and protect inestimable works of art exposed to atmospheric agents and degradation over time. This technology, which has been developing for the last forty years, is now available to restorers and has met with significant success all over Europe. An important contribution to this process of laser innovation has been made in Florence by local actors belonging to a creative cluster. The objects of the analysis are the genesis of this innovation in the local Florentine context and the relationships among the main actors who have contributed to it. The study investigates how culture can play a part in the generation of ideas and innovations, and which creative environments can favour it. In this context, laser technologies for the restoration of cultural heritage are analysed as a case study in the various paths taken by the Creative Capacity of the Culture (CCC).
Abstract:
This paper proposes a contemporaneous-threshold multivariate smooth transition autoregressive (C-MSTAR) model in which the regime weights depend on the ex ante probabilities that latent regime-specific variables exceed certain threshold values. A key feature of the model is that the transition function depends on all the parameters of the model as well as on the data. Since the mixing weights are also a function of the regime-specific innovation covariance matrix, the model can account for contemporaneous regime-specific co-movements of the variables. The stability and distributional properties of the proposed model are discussed, as well as issues of estimation, testing and forecasting. The practical usefulness of the C-MSTAR model is illustrated by examining the relationship between US stock prices and interest rates.
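As a rough illustration of the mixing mechanism described above, the sketch below computes regime weights as normalised ex ante probabilities that latent regime-specific variables exceed their thresholds. It is a deliberately simplified univariate two-regime version with hypothetical parameter names, not the paper's actual multivariate specification:

```python
from math import erf, sqrt

def norm_cdf(z):
    # Standard normal CDF via the error function.
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

def cmstar_weights(x, beta1, beta2, c1, c2, s1, s2):
    """Mixing weights for a two-regime model: the ex ante probability
    that each regime's latent variable beta_i * x + e_i (e_i ~ N(0, s_i^2))
    exceeds its threshold c_i, normalised to sum to one."""
    p1 = 1.0 - norm_cdf((c1 - beta1 * x) / s1)
    p2 = 1.0 - norm_cdf((c2 - beta2 * x) / s2)
    total = p1 + p2
    return p1 / total, p2 / total

# Example: regime 1's latent variable sits well above its threshold,
# so it receives the larger weight.
w1, w2 = cmstar_weights(x=1.0, beta1=0.8, beta2=-0.3, c1=0.0, c2=0.0, s1=1.0, s2=1.0)
```

Note how the weights depend on the regime parameters and the data jointly, which is the model's key feature; in the full C-MSTAR case the probabilities also involve the regime-specific innovation covariance matrix.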
Abstract:
In this paper we consider extensions of smooth transition autoregressive (STAR) models to situations where the threshold is a time-varying function of variables that affect the separation of regimes of the time series under consideration. Our specification is motivated by the observation that unusually high/low values for an economic variable may sometimes be best thought of in relative terms. State-dependent logistic STAR and contemporaneous-threshold STAR models are introduced and discussed. These models are also used to investigate the dynamics of U.S. short-term interest rates, where the threshold is allowed to be a function of past output growth and inflation.
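A minimal sketch of the idea, assuming a logistic transition function G(s; gamma, c) and a threshold that is linear in a state variable z (the functional form and parameter names are illustrative, not the paper's exact specification):

```python
from math import exp

def logistic_transition(s, gamma, c):
    # G(s; gamma, c) in [0, 1]: smooth weight on the "high" regime,
    # equal to 0.5 exactly at the threshold s = c.
    return 1.0 / (1.0 + exp(-gamma * (s - c)))

def tv_star_forecast(y_lag, s, z, phi_low, phi_high, a0, a1, gamma):
    """One-step forecast from a STAR(1) whose threshold c_t = a0 + a1*z
    varies with a state variable z (e.g. past output growth or inflation)."""
    c_t = a0 + a1 * z
    g = logistic_transition(s, gamma, c_t)
    return (1.0 - g) * phi_low * y_lag + g * phi_high * y_lag
```

The time-varying threshold captures the abstract's point that "unusually high/low" values are relative: the same level of the transition variable s can fall in different regimes depending on z.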
Abstract:
This project studied and developed a new technique for the detection of gases with range resolution. The technique, called FMCW lidar, adapts the FMCW radar principle to lidar systems. It exploits the spectral absorption lines that arise from the interaction between light and gases: the wavelength of a laser emitter is tuned to one of these lines, and the backscattered light is detected and analysed to obtain gas concentration measurements. The first part of the project analysed the WMS (wavelength modulation spectroscopy) technique for the in-situ measurement of gases. A complete theoretical analysis was performed and experiments were carried out to test the technique and validate its application to an FMCW-modulated system for gas detection. The second part analysed the FMCW lidar technique for solid target detection and its extension to continuous media. The classical form of this technique was analysed for a distributed medium, and a filtering effect was found that prevents accurate acquisition of the medium response. A modification of the technique was proposed and validated through simulations and experiments. Based on these tests, a novel system is proposed for development and testing to perform the indicated gas detection with range resolution.
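In FMCW systems the range information comes from the beat frequency between the transmitted and received sweeps. A minimal sketch of the standard textbook FMCW relations (not the project's actual processing chain):

```python
C = 299_792_458.0  # speed of light, m/s

def beat_frequency(range_m, sweep_bw_hz, sweep_time_s):
    # Under a linear frequency sweep of bandwidth B over time T, the
    # round-trip delay 2R/c maps to a beat frequency f_b = (2R/c) * (B/T).
    return (2.0 * range_m / C) * (sweep_bw_hz / sweep_time_s)

def range_from_beat(f_b_hz, sweep_bw_hz, sweep_time_s):
    # Inverting the relation above recovers the target range.
    return f_b_hz * C * sweep_time_s / (2.0 * sweep_bw_hz)

def range_resolution(sweep_bw_hz):
    # Two scatterers are separable if farther apart than c / (2B).
    return C / (2.0 * sweep_bw_hz)
```

The filtering effect mentioned in the abstract concerns what happens when the single-target beat-frequency picture is extended to a continuously distributed medium, where every range contributes its own beat component.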
Abstract:
"See the abstract at the beginning of the document in the attached file."
Abstract:
Background and Purpose Early prediction of motor outcome is of interest in stroke management. We aimed to determine whether lesion location on diffusion tensor tractography (DTT) is predictive of motor outcome after acute stroke and whether this information improves the predictive accuracy of the clinical scores. Methods We evaluated 60 consecutive patients within 12 hours of middle cerebral artery (MCA) stroke onset. We used DTT to evaluate corticospinal tract (CST) involvement in the motor cortex (MC) and premotor cortex (PMC), centrum semiovale (CS), corona radiata (CR), and posterior limb of the internal capsule (PLIC), and in combinations of these regions, at admission, at day 3, and at day 30. Severity of limb weakness was assessed using the motor items of the NIHSS (m-NIHSS; items 5a, 5b, 6a, 6b). We calculated infarct volumes and fractional anisotropy (FA) values in the CST of the pons. Results Acute damage to the PLIC was the best predictor associated with poor motor outcome, axonal damage, and clinical severity at admission (P<.001). There was no significant correlation between acute infarct volume and motor outcome at day 90 (P=.176, r=0.485). The sensitivity, specificity, and positive and negative predictive values of acute CST involvement at the level of the PLIC for poor motor outcome at day 90 were 73.7%, 100%, 100%, and 89.1%, respectively. In the acute stage, DTT predicted motor outcome at day 90 better than the clinical scores (R2=75.50, F=80.09, P<.001). Conclusions In the acute setting, DTT is promising for stroke mapping to predict motor outcome. Acute CST damage at the level of the PLIC is a significant predictor of unfavorable motor outcome.
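For reference, the reported predictive values follow the standard 2x2 confusion-matrix definitions. The counts used below are hypothetical, back-solved only so as to be consistent with the reported percentages and n = 60; they are not taken from the paper:

```python
def diagnostic_metrics(tp, fp, tn, fn):
    # Standard definitions from a 2x2 confusion matrix
    # (tp/fp/tn/fn = true/false positives/negatives).
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "ppv": tp / (tp + fp) if (tp + fp) else float("nan"),
        "npv": tn / (tn + fn),
    }

# Hypothetical counts consistent with sensitivity 73.7%, specificity 100%,
# PPV 100%, NPV 89.1% and a total of 60 patients (14 + 0 + 41 + 5 = 60).
m = diagnostic_metrics(tp=14, fp=0, tn=41, fn=5)
```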
Abstract:
In this paper we propose a parsimonious regime-switching approach to model the correlations between assets: the threshold conditional correlation (TCC) model. This method allows the dynamics of the correlations to change from one state (or regime) to another as a function of observable transition variables. Our model is similar in spirit to Silvennoinen and Teräsvirta (2009) and Pelletier (2006), but with the appealing feature that it does not suffer from the curse of dimensionality. In particular, estimation of the parameters of the TCC involves a simple grid-search procedure. In addition, it is easy to guarantee a positive definite correlation matrix, because the TCC estimator is given by the sample correlation matrix, which is positive definite by construction. The methodology is illustrated by evaluating the behaviour of international equities, government bonds and major exchange rates, first separately and then jointly. We also test for, and allow, different parts of the correlation matrix to be governed by different transition variables; for this, we estimate a multi-threshold TCC specification. Further, we evaluate the economic performance of the TCC model against a constant conditional correlation (CCC) estimator using a Diebold-Mariano-type test. We conclude that threshold correlation modelling gives rise to a significant reduction in portfolio variance.
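A minimal sketch of a threshold grid search in the spirit described above (a simplified stand-in, not the paper's estimator): for each candidate threshold on the transition variable, compute regime-wise sample correlation matrices and score the split by the in-sample Gaussian log-likelihood, which reduces to -(n/2)·log|R| per regime because the quadratic term is constant in-sample:

```python
import numpy as np

def tcc_grid_search(returns, s, candidates, min_obs=10):
    """Grid search over candidate thresholds on the transition variable s.
    returns: (T, k) array of asset returns; s: (T,) transition variable.
    Scores each split by -0.5 * n * log|R| per regime (the quadratic term
    of the Gaussian likelihood is constant for standardised in-sample data).
    Returns (best_threshold, best_score), or None if no split is feasible."""
    best = None
    for c in candidates:
        score, ok = 0.0, True
        for block in (returns[s <= c], returns[s > c]):
            if len(block) < min_obs:      # need enough data in each regime
                ok = False
                break
            corr = np.corrcoef(block, rowvar=False)   # PSD by construction
            sign, logdet = np.linalg.slogdet(corr)
            score += -0.5 * len(block) * logdet
        if ok and (best is None or score > best[1]):
            best = (c, score)
    return best
```

As the abstract notes, positive definiteness of each regime's correlation matrix comes essentially for free, because the estimator is a sample correlation matrix.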
Abstract:
We present a new unifying framework for investigating throughput-WIP (Work-in-Process) optimal control problems in queueing systems, based on reformulating them as linear programming (LP) problems with special structure: We show that if a throughput-WIP performance pair in a stochastic system satisfies the Threshold Property we introduce in this paper, then we can reformulate the problem of optimizing a linear objective of throughput-WIP performance as a (semi-infinite) LP problem over a polygon with special structure (a threshold polygon). The strong structural properties of such polygons explain the optimality of threshold policies for optimizing linear performance objectives: their vertices correspond to the performance pairs of threshold policies. We analyze in this framework the versatile input-output queueing intensity control model introduced by Chen and Yao (1990), obtaining a variety of new results, including (a) an exact reformulation of the control problem as an LP problem over a threshold polygon; (b) an analytical characterization of the Min WIP function (giving the minimum WIP level required to attain a target throughput level); (c) an LP Value Decomposition Theorem that relates the objective value under an arbitrary policy with that of a given threshold policy (thus revealing the LP interpretation of Chen and Yao's optimality conditions); (d) diminishing returns and invariance properties of throughput-WIP performance, which underlie threshold optimality; (e) a unified treatment of the time-discounted and time-average cases.
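A concrete textbook instance of a threshold policy and its throughput-WIP pair (an illustration of the idea, not the Chen-Yao model itself) is the admission rule "accept arrivals while WIP < K" in an M/M/1 queue, i.e. an M/M/1/K system:

```python
def mm1k_performance(lam, mu, K):
    """Stationary throughput and mean WIP of an M/M/1 queue under the
    threshold policy 'admit while WIP < K' (an M/M/1/K queue)."""
    rho = lam / mu
    # Birth-death stationary probabilities: p_n proportional to rho^n, n = 0..K.
    probs = [rho ** n for n in range(K + 1)]
    z = sum(probs)
    probs = [p / z for p in probs]
    throughput = lam * (1.0 - probs[K])        # rate of accepted arrivals
    wip = sum(n * p for n, p in enumerate(probs))
    return throughput, wip
```

Sweeping K traces out a family of throughput-WIP pairs, which in the paper's framework correspond to the vertices of the threshold polygon; throughput rises in K with diminishing returns, at the cost of ever-higher WIP.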
Abstract:
This paper discusses inference in self-exciting threshold autoregressive (SETAR) models. Of main interest is inference for the threshold parameter. It is well known that the asymptotics of the corresponding estimator depend upon whether the SETAR model is continuous or not. In the continuous case, the limiting distribution is normal and standard inference is possible. In the discontinuous case, the limiting distribution is non-normal and cannot be estimated consistently. We show valid inference can be drawn by the use of the subsampling method. Moreover, the method can even be extended to situations where the (dis)continuity of the model is unknown. In this case, the inference for the regression parameters of the model also becomes difficult, and subsampling can be used advantageously there as well. In addition, we consider a hypothesis test for the continuity of the SETAR model. A simulation study examines small-sample performance.
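A minimal sketch of the two ingredients involved, for a two-regime SETAR(1) without intercepts. This is illustrative only: proper subsampling inference also requires calibrating the convergence rate, which depends on whether the model is continuous, and that step is omitted here:

```python
import random

def fit_threshold(y, candidates):
    """Least-squares threshold estimate for a two-regime SETAR(1):
    y_t = phi_1 * y_{t-1} if y_{t-1} <= r, else phi_2 * y_{t-1}, plus noise.
    Grid search over candidate thresholds, minimising the total SSR."""
    best_r, best_ssr = None, float("inf")
    for r in candidates:
        ssr = 0.0
        for lower in (True, False):
            pairs = [(y[t - 1], y[t]) for t in range(1, len(y))
                     if (y[t - 1] <= r) == lower]
            if len(pairs) < 5:            # regime too small to fit
                ssr = float("inf")
                break
            sxx = sum(x * x for x, _ in pairs)
            sxy = sum(x * v for x, v in pairs)
            phi = sxy / sxx               # OLS slope within the regime
            ssr += sum((v - phi * x) ** 2 for x, v in pairs)
        if ssr < best_ssr:
            best_r, best_ssr = r, ssr
    return best_r

def subsample_thresholds(y, candidates, b, n_sub=200, seed=0):
    # Re-estimate the threshold on blocks of b consecutive observations;
    # the empirical spread of these estimates underlies subsampling
    # intervals (the rate calibration is deliberately omitted here).
    rng = random.Random(seed)
    ests = []
    for _ in range(n_sub):
        start = rng.randrange(len(y) - b)
        est = fit_threshold(y[start:start + b], candidates)
        if est is not None:
            ests.append(est)
    return ests
```

Using consecutive blocks (rather than i.i.d. resampling) preserves the dependence structure of the time series, which is why subsampling remains valid where the bootstrap can fail.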
Abstract:
Most facility location decision models ignore the fact that for a facility to survive it needs a minimum demand level to cover costs. In this paper we present a decision model for a firm that wishes to enter a spatial market where there are several competitors already located. This market is such that for each outlet there is a demand threshold level that has to be achieved in order to survive. The firm wishes to know where to locate its outlets so as to maximize its market share, taking the threshold level into account. It may happen that due to this new entrance, some competitors will not be able to meet the threshold and therefore will disappear. A formulation is presented together with a heuristic solution method and computational experience.
Abstract:
Every year, flash floods cause economic losses and major disruption to daily activity in the Catalonia region (NE Spain), and sometimes catastrophic damage and casualties occur. When a long-term analysis of floods is undertaken, a question arises regarding the changing roles of vulnerability and hazard in the evolution of risk. This paper provides information to address this question, on the basis of an analysis of all the floods that have occurred in Barcelona county (Catalonia) since the 14th century, as well as the flooded area, urban evolution, impacts and the weather conditions for the most severe events. With this objective, historical floods have been identified and classified, and the flash floods among them characterised. In addition, the main meteorological factors associated with recent flash floods in this city and neighbouring regions are identified, rainfall trends that could explain the historical evolution of flood hazard occurrence in this city are analysed, and the influence of urban development on flood vulnerability is examined. Barcelona city was selected thanks to its long continuous data series (daily rainfall since 1854, and one of the longest rainfall-rate series in Europe, since 1921) and the accurate historical archive information available (since the Roman Empire for the urban evolution). The evolution of flood occurrence shows oscillations in the earlier and later modern-age periods that can be attributed to climatic variability, to the evolution of the perception threshold and to changes in vulnerability. A great increase in vulnerability can be assumed for the period 1850–1900.
The analysis of the time evolution of the Barcelona rainfall series (1854–2000) shows that no trend exists, although, due to changes in urban planning, the impact of flash floods has altered over this time. The number of catastrophic flash floods has diminished, although the extraordinary ones have increased.
Abstract:
The interconnected porosity of Cr3C2-NiCr coatings obtained by high-velocity oxy-fuel spraying is detrimental in corrosion and wear resistance applications. Laser treatments allow their surfaces to be sealed through melting and resolidification of a thin superficial layer. A Nd:YAG laser beam was used to irradiate Cr3C2-NiCr coatings either in continuous-wave mode or at different repetition rates in pulsed mode. Results indicated that high peak and low mean laser irradiances are unsuitable, since the samples presented deep grooves and an extensive crack network. At low peak and higher mean laser irradiances the surface was melted, and only a few shallow cracks were observed. The interconnected porosity was completely eliminated in a layer up to 80 μm thick, formed by large Cr7C3 grains embedded in a NiCr matrix.