956 results for Effects Based Operations
Abstract:
In efficiency studies using the stochastic frontier approach, the main focus is on explaining inefficiency in terms of exogenous variables and on computing the marginal effect of each of these determinants. Although inefficiency is estimated by its mean conditional on the composed error term (the Jondrow et al., 1982 estimator), the marginal effects are usually computed from the unconditional mean of inefficiency (Wang, 2002). In this paper we derive the marginal effects based on the Jondrow et al. estimator and use the bootstrap method to compute confidence intervals for the marginal effects.
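For reference, the conditional-mean estimator referred to, written for the standard normal–half-normal frontier model (a common textbook parameterisation; the paper's exact specification may differ):

$$ \varepsilon_i = v_i - u_i,\qquad v_i \sim N(0,\sigma_v^2),\qquad u_i \sim N^{+}(0,\sigma_u^2), $$
$$ E[u_i \mid \varepsilon_i] = \mu_{*i} + \sigma_*\,\frac{\phi(\mu_{*i}/\sigma_*)}{\Phi(\mu_{*i}/\sigma_*)},\qquad \mu_{*i} = -\frac{\sigma_u^2\,\varepsilon_i}{\sigma^2},\qquad \sigma_*^2 = \frac{\sigma_u^2\,\sigma_v^2}{\sigma^2},\qquad \sigma^2 = \sigma_u^2 + \sigma_v^2. $$

A marginal effect of a determinant z_k then differentiates this conditional mean with respect to z_k (which typically enters through σ_u), and resampling the estimation provides the bootstrap confidence intervals.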
Abstract:
The anticipated growth of air traffic worldwide requires enhanced Air Traffic Management (ATM) technologies and procedures to increase system capacity, efficiency, and resilience, while reducing environmental impact and maintaining operational safety. To meet these challenges, new automation and information exchange capabilities are being developed through different modernisation initiatives toward a new global operational concept called Trajectory Based Operations (TBO), in which aircraft trajectory information becomes the cornerstone of advanced ATM applications. This transformation will lead to higher levels of system complexity, requiring enhanced Decision Support Tools (DST) to aid humans in the decision-making process. These will rely on accurate predicted aircraft trajectories, provided by advanced Trajectory Predictors (TP). The trajectory prediction process is subject to stochastic effects that introduce uncertainty into the predictions. Regardless of the assumptions that define the aircraft motion model underpinning the TP, deviations between predicted and actual trajectories are unavoidable. This thesis proposes an innovative method to characterise the uncertainty associated with a trajectory prediction based on the mathematical theory of Polynomial Chaos Expansions (PCE). Assuming univariate PCEs of the trajectory prediction inputs, the method describes how to generate multivariate PCEs of the prediction outputs that quantify their associated uncertainty. Arbitrary PCE (aPCE) was chosen because it allows a higher degree of flexibility in modelling input uncertainty. The obtained polynomial description can be used in subsequent prediction sensitivity analyses thanks to the relationship between polynomial coefficients and Sobol indices. The Sobol indices enable ranking the input parameters according to their influence on trajectory prediction uncertainty. The applicability of the aPCE-based uncertainty quantification detailed herein is analysed through a case study. This case study represents a typical aircraft trajectory prediction problem in ATM, in which uncertain parameters regarding aircraft performance, aircraft intent description, weather forecast, and initial conditions are considered simultaneously. Numerical results are compared to those obtained from a Monte Carlo simulation, demonstrating the advantages of the proposed method. The thesis includes two examples of DSTs (a Demand and Capacity Balancing tool and an Arrival Manager) to illustrate the potential benefits of exploiting the proposed uncertainty quantification method.
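As a minimal illustration of the coefficient–Sobol relationship the abstract mentions: for an orthonormal PCE the output variance decomposes over the polynomial terms, so Sobol indices follow directly from the squared coefficients. The multi-indices and coefficient values below are hypothetical, not the thesis's case-study data.

```python
# Hypothetical orthonormal PCE of a predicted quantity (e.g. time of arrival)
# in three uncertain inputs; keys are multi-indices (degree per input),
# values are the expansion coefficients c_alpha.
pce = {
    (0, 0, 0): 310.0,   # mean term
    (1, 0, 0): 4.2,     # input 1 alone
    (0, 1, 0): -1.5,    # input 2 alone
    (0, 0, 1): 0.9,     # input 3 alone
    (2, 0, 0): 0.6,
    (1, 1, 0): 0.3,     # interaction of inputs 1 and 2
}

# Total variance: sum of squared coefficients over all non-constant terms.
var_total = sum(c**2 for a, c in pce.items() if any(a))

def sobol_first(i):
    """First-order Sobol index: variance from terms involving input i only."""
    num = sum(c**2 for a, c in pce.items()
              if a[i] > 0 and all(d == 0 for j, d in enumerate(a) if j != i))
    return num / var_total

def sobol_total(i):
    """Total Sobol index: variance from every term involving input i."""
    num = sum(c**2 for a, c in pce.items() if a[i] > 0)
    return num / var_total

for i in range(3):
    print(f"input {i}: S1 = {sobol_first(i):.3f}, ST = {sobol_total(i):.3f}")
```

Ranking the inputs by these indices is exactly the sensitivity analysis step the thesis describes; the same bookkeeping scales to any number of inputs and polynomial degrees.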
Abstract:
Analog In-Memory Computing (AIMC) has been proposed in the context of beyond-von Neumann architectures as a valid strategy to reduce the energy consumption and latency of internal data transfers and to improve compute efficiency. The aim of AIMC is to perform computations within the memory unit, typically leveraging the physical features of memory devices. Among resistive Non-Volatile Memories (NVMs), Phase-Change Memory (PCM) has become a promising technology due to its intrinsic capability to store multilevel data. Hence, PCM technology is currently investigated to expand the possibilities and applications of AIMC. This thesis explores the potential of new PCM-based architectures as in-memory computational accelerators. In a first step, a preliminary experimental characterization of PCM devices was carried out from an AIMC perspective. PCM cell non-idealities, such as time drift, noise, and non-linearity, were studied to develop a dedicated multilevel programming algorithm. Measurement-based simulations were then employed to evaluate the feasibility of PCM-based operations in the fields of Deep Neural Networks (DNNs) and Structural Health Monitoring (SHM). Moreover, a first testchip was designed and tested to evaluate the hardware implementation of Multiply-and-Accumulate (MAC) operations employing PCM cells. This prototype experimentally demonstrates the possibility of reaching 95% MAC accuracy with circuit-level compensation of cell time drift and non-linearity. Finally, empirical circuit behavior models were included in simulations to assess the use of this technology in specific DNN applications and to enhance the potential of this innovative computation approach.
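A toy numerical sketch of the kind of analysis described: an analog MAC whose weights are stored as PCM conductances subject to the power-law drift commonly reported for PCM, with a simple reference-cell compensation. The drift exponent and noise level are illustrative assumptions, not the measured device values from the thesis.

```python
import numpy as np

rng = np.random.default_rng(0)

def drifted(g0, t, t0=1.0, nu=0.05):
    """Power-law conductance drift: G(t) = G0 * (t/t0)**-nu."""
    return g0 * (t / t0) ** (-nu)

weights = rng.uniform(0.2, 1.0, size=16)   # target conductances (a.u.)
x = rng.uniform(0.0, 1.0, size=16)         # input activations
t = 1e4                                    # read time after programming (a.u.)

# Conductances at read time: drift plus a small multiplicative read noise.
g = drifted(weights, t) * (1 + 0.02 * rng.standard_normal(16))

# Compensation: a reference cell programmed to a known conductance drifts
# by (approximately) the same factor, so dividing by its relative decay
# rescales the array back toward the programmed values.
g_ref0, g_ref = 1.0, drifted(1.0, t)
g_comp = g * (g_ref0 / g_ref)

ideal = weights @ x
print("uncompensated MAC error: %.1f%%" % (100 * abs(g @ x - ideal) / ideal))
print("compensated MAC error:   %.1f%%" % (100 * abs(g_comp @ x - ideal) / ideal))
```

The residual error after compensation comes only from the read noise, which is the intuition behind the circuit-level drift compensation the prototype demonstrates.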
Abstract:
Identified charged pion, kaon, and proton spectra are used to explore the system-size dependence of bulk freeze-out properties in Cu + Cu collisions at √s_NN = 200 and 62.4 GeV. The data are studied with hydrodynamically motivated blast-wave and statistical model frameworks in order to characterize the freeze-out properties of the system. The dependence of the freeze-out parameters on beam energy and collision centrality is discussed. Using existing results from Au + Au and pp collisions, the dependence of the freeze-out parameters on system size is also explored. This multidimensional systematic study furthers our understanding of the QCD phase diagram, revealing the importance of the initial geometrical overlap of the colliding ions. The analysis of Cu + Cu collisions extends the system-size dependence studies from Au + Au data with detailed measurements in the smaller system. The systematic trends of the bulk freeze-out properties of charged particles are studied with respect to the total charged-particle multiplicity at midrapidity, exploring the influence of initial-state effects.
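For context, the hydrodynamically motivated blast-wave fit referred to is commonly written in the Schnedermann–Sollfrank–Heinz form (a standard parameterisation, assumed here rather than quoted from the paper):

$$ \frac{dN}{m_T\,dm_T} \;\propto\; \int_0^R r\,dr\; m_T\, I_0\!\left(\frac{p_T \sinh\rho}{T_{\rm kin}}\right) K_1\!\left(\frac{m_T \cosh\rho}{T_{\rm kin}}\right),\qquad \rho = \tanh^{-1}\!\left[\beta_s \left(\frac{r}{R}\right)^{n}\right], $$

where the kinetic freeze-out temperature T_kin and the surface expansion velocity β_s are the bulk freeze-out parameters extracted from simultaneous fits to the identified-particle spectra.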
Abstract:
The process of globalization, in the sphere of financial markets, required banking institutions to make strategic investment choices on the international stage. The movement of Portuguese banks establishing themselves abroad accompanied this process, allowing the supply of deposit-taking and financing banking services in the main destination markets for exports and emigration. This dissertation studies the internationalization process of the Portuguese banking sector, centred on the following general research question: "What are the determining factors of the variables that characterize the evolution of the Portuguese banking sector abroad?" This question is developed through the construction of a model explaining the impacts of a set of determinants, selected from the literature review, on the indicators that reflect the dynamics of banking business abroad. In this context, we sought empirical evidence of these effects through a methodology consisting of the estimation of panel data models, using a sample of six banks with relevant levels of investment in the external market over the period from 2004 to 2014. The empirical results suggest the existence of statistically significant relationships between the variables considered in the models. Evidence was found consistently associating the variables emigration, Foreign Direct Investment, Gross Domestic Product in Portugal and in the host countries, banking assets, and inflation with the evolution of banking activity abroad. Additionally, the results reveal that unemployment and the credit-to-assets ratio are statistically significant in their influence on the banks' profitability indicator. We conclude that the significance of the selected factors explains the behaviour of the business indicators abroad for the banks studied and, consequently, supports the validity of the proposed analysis model. However, it cannot be excluded that other explanatory elements not considered in the study also carry explanatory weight in the internationalization process of the banking sector.
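A minimal sketch of the kind of panel estimation described, using the linearmodels package; the variable names mirror the determinants listed in the abstract, and the data file is hypothetical.

```python
import pandas as pd
from linearmodels.panel import PanelOLS

# Hypothetical long-format dataset: one row per (bank, year), 2004-2014.
df = pd.read_csv("banks_panel.csv")
df = df.set_index(["bank", "year"])

# Fixed-effects model of foreign banking activity on the determinants
# suggested by the literature review (variable names are illustrative).
model = PanelOLS.from_formula(
    "foreign_assets ~ 1 + emigration + fdi + gdp_pt + gdp_host"
    " + inflation + EntityEffects",
    data=df,
)
res = model.fit(cov_type="clustered", cluster_entity=True)
print(res.summary)
```

With only six banks the entity-clustered errors are a judgment call; the point of the sketch is the structure (entity fixed effects over a short panel), not the inference details.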
Abstract:
This paper provides a general treatment of the implications of legal uncertainty for welfare. We distinguish legal uncertainty from decision errors: though the former can be influenced by the latter, the latter are neither necessary nor sufficient for the existence of legal uncertainty. We show that an increase in decision errors will always reduce welfare. However, for any given level of decision errors, information structures involving more legal uncertainty can improve welfare. This always holds, even when there is complete legal uncertainty, provided sanctions on socially harmful actions are set at their optimal level. This radically transforms one's perception of the "costs" of legal uncertainty. We also provide general proofs for two results previously established under restrictive assumptions. The first is that Effects-Based enforcement procedures may welfare-dominate Per Se (or object-based) procedures and will always do so when sanctions are optimally set. The second is that optimal sanctions may well be higher under enforcement procedures involving more legal uncertainty.
Abstract:
In this paper we make three contributions to the literature on optimal Competition Law enforcement procedures. The first (which is of general interest beyond competition policy) is to clarify the concept of "legal uncertainty", relating it to ideas in the literature on Law and Economics, but formalising the concept through various information structures which specify the probability that each firm attaches – at the time it takes an action – to the possibility of its being deemed anti-competitive were it to be investigated by a Competition Authority. We show that the existence of Type I and Type II decision errors by competition authorities is neither necessary nor sufficient for the existence of legal uncertainty, and that information structures with legal uncertainty can generate higher welfare than information structures with legal certainty – a result echoing a similar finding obtained in a completely different context and under different assumptions in the earlier Law and Economics literature (Kaplow and Shavell, 1992). Our second contribution is to revisit and significantly generalise the analysis in our previous paper, Katsoulacos and Ulph (2009), involving a welfare comparison of Per Se and Effects-Based legal standards. In that analysis we considered just a single information structure under an Effects-Based standard, and penalties were exogenously fixed. Here we allow for (a) different information structures under an Effects-Based standard and (b) endogenous penalties. We obtain two main results: (i) considering all information structures, a Per Se standard is never better than an Effects-Based standard; (ii) optimal penalties may be higher when there is legal uncertainty than when there is no legal uncertainty.
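A toy numerical illustration of the notion of an "information structure" used in both abstracts (the numbers and payoffs are entirely illustrative, not the papers' model): each firm type attaches a probability to being condemned, acts when its private benefit exceeds the expected penalty, and welfare sums the social payoffs of the actions actually taken.

```python
# Firm types: (name, population share, private benefit, social payoff).
types = [
    ("benign",  0.6, 1.0,  1.0),   # pro-competitive action
    ("harmful", 0.4, 1.5, -2.0),   # anti-competitive action
]

def welfare(p_condemn, fine):
    """Expected welfare under an information structure `p_condemn`,
    which maps each type to the probability it attaches to being
    condemned; a firm acts iff benefit > expected penalty."""
    w = 0.0
    for name, share, benefit, social in types:
        if benefit > p_condemn[name] * fine:
            w += share * social
    return w

structures = {
    "certainty":       {"benign": 0.0, "harmful": 1.0},
    "uncertainty":     {"benign": 0.3, "harmful": 0.7},
    "chilling errors": {"benign": 0.5, "harmful": 0.5},
}
for label, p in structures.items():
    print(f"{label:>15}: W = {welfare(p, fine=2.5):+.2f}")
```

The sketch shows the mechanics only: deterrence of harmful actions and chilling of benign ones both hinge on the perceived condemnation probabilities and the penalty level, which is why more legal uncertainty need not mean lower welfare once sanctions are chosen optimally.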
Abstract:
This paper studies the apparent contradiction between two strands of the literature on the effects of financial intermediation on economic activity. On the one hand, the empirical growth literature finds a positive effect of financial depth as measured by, for instance, private domestic credit and liquid liabilities (e.g., Levine, Loayza, and Beck 2000). On the other hand, the banking and currency crisis literature finds that monetary aggregates, such as domestic credit, are among the best predictors of crises and their related economic downturns (e.g., Kaminsky and Reinhart 1999). The paper accounts for these contrasting effects based on the distinction between the short- and long-run impacts of financial intermediation. Working with a panel of cross-country and time-series observations, the paper estimates an encompassing model of short- and long-run effects using the Pooled Mean Group estimator developed by Pesaran, Shin, and Smith (1999). The conclusion from this analysis is that a positive long-run relationship between financial intermediation and output growth co-exists with a mostly negative short-run relationship. The paper further develops an explanation for these contrasting effects by relating them to recent theoretical models, by linking the estimated short-run effects to measures of financial fragility (namely, banking crises and financial volatility), and by jointly analyzing the effects of financial depth and fragility in classic panel growth regressions.
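For reference, the Pooled Mean Group estimator fits an error-correction model of the standard Pesaran–Shin–Smith form (the general specification; the paper's lag orders may differ):

$$ \Delta y_{it} = \phi_i\!\left(y_{i,t-1} - \theta' x_{it}\right) + \sum_{j=1}^{p-1} \lambda_{ij}\,\Delta y_{i,t-j} + \sum_{j=0}^{q-1} \delta_{ij}'\,\Delta x_{i,t-j} + \mu_i + \varepsilon_{it}, $$

where the long-run coefficients θ are restricted to be common across countries while the adjustment speeds φ_i and the short-run dynamics are country-specific. A positive long-run θ coexisting with mostly negative short-run coefficients is exactly the pattern the paper reports.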
Abstract:
The primary objective of this project was to determine the effect of bridge width on deck cracking in bridges. Other parameters, such as bridge skew, girder spacing and type, abutment type, pier type, and number of bridge spans, were also studied. To achieve these objectives, one bridge was selected for live-load and long-term testing. The data obtained from both field tests were used to calibrate a three-dimensional (3D) finite element model (FEM). Three different types of loading were applied: live loading, thermal loading, and shrinkage loading. The predicted crack pattern from the FEM was compared to the crack pattern from bridge inspection results. A parametric study was conducted using the calibrated FEM. The general conclusions/recommendations are as follows:
-- Longitudinal and diagonal cracking in the deck near the abutment of an integral abutment bridge is due to the temperature differences between the abutment and the deck. Although not likely to induce cracking on its own, shrinkage of the deck concrete may further exacerbate cracks developed from thermal effects.
-- Based upon a limited review of bridges in the Iowa DOT inventory, it appears that, regardless of bridge width, longitudinal and diagonal cracks are prevalent in integral abutment bridges but not in bridges with stub abutments.
-- The parametric study results show that bridge width and skew have minimal effect on the strain in the bridge deck resulting from restrained thermal expansion.
-- Pier type, girder type, girder spacing, and number of spans also appear to have no influence on the level of restrained thermal expansion strain in the deck near the abutment.
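As a rough order-of-magnitude check on the thermal mechanism identified above (typical handbook values for concrete, not data from the study): a fully restrained concrete member subjected to a temperature change develops stress

$$ \sigma = E\,\alpha\,\Delta T \approx (30\ \text{GPa}) \times (10 \times 10^{-6}/^{\circ}\text{C}) \times (20\,^{\circ}\text{C}) = 6\ \text{MPa}, $$

which comfortably exceeds the tensile strength of typical deck concrete (roughly 2–4 MPa). This is consistent with thermally driven cracking near a stiff integral abutment, where the deck is most strongly restrained.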
Abstract:
This study aimed at comparing the efficiency of various sampling materials for the collection and subsequent analysis of organic gunshot residues (OGSR). To the best of our knowledge, this is the first time that sampling devices have been investigated in detail for subsequent quantitation of OGSR by LC-MS. Seven sampling materials, namely two "swab"-type and five "stub"-type collection materials, were tested. The investigation started with the development of a simple and robust LC-MS method able to separate and quantify molecules typically found in gunpowders, such as diphenylamine or ethylcentralite. The sampling materials were then systematically evaluated by first analysing blank extracts of the materials to check for potential interferences and to determine matrix effects. Based on these results, the best four materials, namely cotton buds, polyester swabs, a tape from 3M, and PTFE, were compared in terms of collection efficiency during shooting experiments using a set of 9 mm Luger ammunition. It was found that the tape was capable of recovering the highest amounts of OGSR. As tape-lifting is the technique currently used in routine casework for inorganic GSR (IGSR), OGSR analysis might be implemented without modifying IGSR sampling and analysis procedures.
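The matrix-effect evaluation mentioned is conventionally quantified by comparing the analyte response in a post-extraction spiked blank with the response in neat solvent (the post-extraction addition approach of Matuszewski et al.; the exact protocol used in this study is assumed):

$$ ME(\%) = \frac{A_{\text{matrix}}}{A_{\text{solvent}}} \times 100, $$

where values below 100% indicate ion suppression and values above 100% indicate signal enhancement for the LC-MS response of, e.g., diphenylamine extracted from each candidate sampling material.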
Abstract:
This study examines the concepts describing new, effects-based applications of the art of war, and their content. The study also brings out the reasons and factors that have led to, or enabled, the emergence and development of these new concepts. The subject is examined specifically from the perspective of effects-based thinking. The concepts under examination are EBO (Effects-Based Operations), EBAO (Effects-Based Approach to Operations), SOD (Systemic Operational Design), and the Comprehensive Approach (CA). The central conclusions of the work are the following: structural changes imposed on armed forces have led, in Western militaries, to force reductions and to the creation of new kinds of technological capabilities and operating models. Network-centric warfare, and in particular the common operational picture it enables, acts as an enabler of effects-based concepts and supports their core processes, especially when viewed from a technological standpoint. Network-centric warfare, like today's business world, relies on rapid decision-making, cost minimisation, and the opportunities created by technology in a networked world. Economic thinking, the minimisation of destructive effects, the avoidance of friendly casualties, and so on recur in different contexts under new terms. The common denominator is cost-effectiveness. Nothing, however, has fundamentally changed: everything that warfare is and has been will endure.
Abstract:
Information and communication technologies (ICTs) are the tools that underpin the emerging "Knowledge Society". Exchange of information or knowledge between people and through networks of people has always taken place, but ICT has radically changed the magnitude of this exchange, and thus factors such as timeliness of information and information dissemination patterns have become more important than ever. Since information and knowledge are so vital for all-round human development, libraries and the institutions that manage these resources are indeed invaluable. Library and Information Centres therefore have a key role in the acquisition, processing, preservation, and dissemination of information and knowledge. In the modern context, the library provides services based on different types of documents: manuscript, printed, digital, etc. At the same time, the acquisition, access, processing, and service of these resources have become more complicated than ever before. ICT has been instrumental in extending libraries beyond the physical walls of a building and in providing assistance in navigating and analyzing tremendous amounts of knowledge with a variety of digital tools. Thus, modern libraries are increasingly being re-defined as places to get unrestricted access to information in many formats and from many sources. The research was conducted in the university libraries of Kerala State, India. It was found that even though information resources are flooding in worldwide and several technologies have emerged to manage the situation and provide effective services to their clientele, most of the university libraries in Kerala were unable to exploit these technologies fully. Though the libraries have automated many of their functions, a wide gap prevails between the possible services and the services provided. There are many good examples worldwide of the application of ICTs in libraries for the maximization of services, and many such libraries have adopted the principles of re-engineering and re-defining as a management strategy. Hence this study was designed to examine how effectively modern ICTs have been adopted in these libraries to maximize the efficiency of operations and services, and whether the principles of re-engineering and re-defining can be applied towards this end. Data was collected from library users (students as well as faculty), library professionals, and university librarians, using structured questionnaires. This was supplemented by observation of the working of the libraries, discussions and interviews with the different types of users and staff, a review of the literature, etc. Personal observations were made of the organizational set-up, management practices, functions, facilities, resources, and the utilization of information resources and facilities by the users of the university libraries in Kerala. Statistical techniques such as percentage, mean, weighted mean, standard deviation, correlation, and trend analysis were used to analyse the data. All the libraries could exploit only very few of the possibilities of modern ICTs and hence could not achieve effective Universal Bibliographic Control or the desired efficiency and effectiveness in services. Because of this, the users as well as the professionals are dissatisfied.
Functional effectiveness in the acquisition, access, and processing of information resources in various formats; development and maintenance of OPAC and WebOPAC; digital document delivery to remote users; web-based clearing of library counter services and resources; development of full-text databases, digital libraries, and institutional repositories; consortia-based operations for e-journals and databases; user education and information literacy; professional development with stress on ICTs; network administration and website maintenance; and the marketing of information are the major areas needing special attention to improve the situation. Finance, the level of ICT knowledge among library staff, professional dynamism and leadership, the vision and support of administrators and policy makers, and the prevailing educational set-up and social environment in the state are some of the major hurdles to reaping the full possibilities of ICTs in the university libraries in Kerala. The principles of Business Process Re-engineering are found suitable for re-structuring and re-defining the operations and service systems of the libraries. Most of the conventional departments or divisions in the university libraries were functioning as watertight compartments, and their existing management systems were too rigid to adopt the principles of change management. Hence, a thorough re-structuring of the divisions was indicated. Consortia-based activities and the pooling and sharing of information resources were advocated to meet the varied needs of users on the main and satellite campuses of the universities, in affiliated colleges, and at remote stations. A uniform staff policy similar to that prevailing in CSIR, DRDO, ISRO, etc. has been proposed by the study, not only for the university libraries in Kerala but for the entire country. Restructuring of LIS education and the integrated, planned development of school, college, research, and public library systems were also recommended for reaping the maximum benefits of modern ICTs.
Abstract:
Since first reported in 2005, mononuclear ruthenium water oxidation catalysts have attracted a great deal of attention due to their catalytic performance and synthetic flexibility. In particular, ligands coordinated to the Ru metal centre play an important role in the catalytic mechanisms, exhibiting significant impact on catalyst efficiency, stability, and activity towards water oxidation. This review focuses on finding possible correlations between ligand effects and the activity of mononuclear Ru aqua and non-aqua complexes as water oxidation catalysts. The ligand effects highlighted in the text include the electronic nature of core ligands and their substituents, the trans–cis effect, steric hindrance and the strain effect, the net charge effect, the geometric arrangement of the aqua ligand, and supramolecular effects, e.g., hydrogen bonding and the influence of a pendant base. At the present level of knowledge, the outcome is not always obvious. A deeper understanding of ligand effects, based on new input data, is mandatory for further progress towards the rational development of novel catalysts featuring enhanced activity in water oxidation.
Abstract:
The standard kinetic theory for a nonrelativistic dilute gas is generalized in the spirit of the nonextensive statistics introduced by Tsallis. The new formalism depends on an arbitrary q parameter measuring the degree of nonextensivity. In the limit q = 1, the extensive Maxwell-Boltzmann theory is recovered. Starting from a purely kinetic deduction of the velocity q-distribution function, the Boltzmann H-theorem is generalized to include the possibility of nonextensive out-of-equilibrium effects. Based on this investigation, it is proved that Tsallis' distribution is the necessary and sufficient condition defining a thermodynamic equilibrium state in the nonextensive context. This result follows naturally from the generalized transport equation and also from the extended H-theorem. Two physical applications of the nonextensive effects have been considered. Closed analytic expressions were obtained for the Doppler broadening of spectral lines from an excited gas, as well as for the dispersion relations describing the electrostatic oscillations in a dilute electron plasma. In the latter case, a comparison with the experimental results strongly suggests a Tsallis distribution with the q parameter smaller than unity. A complementary study concerns the thermodynamic behavior of a relativistic imperfect simple fluid. Using nonequilibrium thermodynamics, we show how the basic primary variables, namely the energy-momentum tensor and the particle and entropy fluxes, depend on the several dissipative processes present in the fluid. The temperature variation law for this moving imperfect fluid is also obtained, and the Eckart and Landau-Lifshitz formulations are recovered as particular cases.
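For reference, the velocity q-distribution referred to has the standard Tsallis form (standard in the nonextensive literature; shown as context, not quoted from the thesis):

$$ f_q(v) = B_q \left[ 1 - (1-q)\,\frac{m v^2}{2 k_B T} \right]^{\frac{1}{1-q}}, $$

which reduces to the Maxwell–Boltzmann distribution $f(v) \propto \exp(-m v^2 / 2 k_B T)$ in the limit q → 1, since $\lim_{q\to 1}[1-(1-q)x]^{1/(1-q)} = e^{-x}$. The Doppler line profiles and plasma dispersion relations discussed in the abstract follow from moments of this distribution.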