947 results for Quasi-likelihood estimator
Abstract:
Bahadur representation and its applications have attracted a large number of publications and presentations on a wide variety of problems. Mixing dependency is weak enough to describe the dependence structure of random variables, including observations in time series and longitudinal studies. This note proves the Bahadur representation of sample quantiles for strongly mixing random variables (including ρ-mixing and φ-mixing) under very weak mixing coefficients. As an application, the asymptotic normality is derived. These results greatly improve those recently reported in the literature.
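For context, the classical i.i.d. Bahadur representation that the note extends to mixing sequences can be sketched as follows (a sketch with standard notation assumed here: F the common distribution function with density f, F_n the empirical distribution function, ξ_p = F⁻¹(p) the p-th quantile, and ξ̂_{p,n} the sample p-quantile):

```latex
% Classical Bahadur (1966) representation under i.i.d. sampling;
% the note's contribution is an analogous a.s. statement under strong mixing.
\hat{\xi}_{p,n} = \xi_p + \frac{p - F_n(\xi_p)}{f(\xi_p)} + R_n,
\qquad R_n = O\!\left(n^{-3/4}(\log n)^{1/2}(\log\log n)^{1/4}\right) \ \text{a.s.}
```

The central limit theorem applied to F_n(ξ_p) then yields the asymptotic normality of the sample quantile; under mixing, the limiting variance also involves the autocovariances of the indicators 1{X_i ≤ ξ_p}.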
Abstract:
Doctorate in Management
Abstract:
This thesis studies mobile robotic manipulators, where one or more robot manipulator arms are integrated with a mobile robotic base. The base could be a wheeled or tracked vehicle, or it might be a multi-limbed locomotor. As robots are increasingly deployed in complex and unstructured environments, the need for mobile manipulation increases. Mobile robotic assistants have the potential to revolutionize human lives in a large variety of settings including home, industrial and outdoor environments.
Mobile Manipulation is the use or study of such mobile robots as they interact with physical objects in their environment. As compared to fixed base manipulators, mobile manipulators can take advantage of the base mechanism’s added degrees of freedom in the task planning and execution process. But their use also poses new problems in the analysis and control of base system stability, and the planning of coordinated base and arm motions. For mobile manipulators to be successfully and efficiently used, a thorough understanding of their kinematics, stability, and capabilities is required. Moreover, because mobile manipulators typically possess a large number of actuators, new and efficient methods to coordinate their large numbers of degrees of freedom are needed to make them practically deployable. This thesis develops new kinematic and stability analyses of mobile manipulation, and new algorithms to efficiently plan their motions.
I first develop detailed and novel descriptions of the kinematics governing the operation of multi-limbed legged robots working in the presence of gravity, whose limbs may also be used simultaneously for manipulation. The fundamental stance constraint, which arises from simple assumptions about friction, ground contact, and feasible motions, is derived. Thereafter, a local relationship between joint motions and motions of the robot abdomen and reaching limbs is developed. Based on these relationships, one can define and analyze local kinematic qualities including limberness, wrench resistance, and local dexterity. While previous researchers have noted the similarity between multi-fingered grasping and quasi-static manipulation, this thesis makes explicit connections between these two problems.
The kinematic expressions form the basis for a local motion planning problem that determines the joint motions needed to achieve several simultaneous objectives while maintaining stance stability in the presence of gravity. This problem is translated into a convex quadratic program, entitled the balanced priority solution, whose existence and uniqueness properties are developed. The problem is related in spirit to the classical redundancy-resolution and task-priority approaches. With some simple modifications, this local planning and optimization problem can be extended to handle a large variety of goals and constraints that arise in mobile manipulation. The local planning problem applies readily to other mobile bases, including wheeled and articulated bases. This thesis describes the use of the local planning techniques to generate global plans, as well as their use within a feedback loop. The work in this thesis is motivated in part by many practical tasks involving the Surrogate and RoboSimian robots at NASA/JPL, and a large number of examples involving the two robots, both real and simulated, are provided.
Finally, this thesis provides an analysis of simultaneous force and motion control for multi-limbed legged robots. Starting with a classical linear stiffness relationship, an analysis of this problem for multiple point contacts is described. The local velocity planning problem is extended to include the generation of forces, as well as to maintain stability using force feedback. This thesis also provides a concise, novel definition of static stability, and proves some conditions under which it is satisfied.
Abstract:
This report discusses analytic second-order bias-correction techniques for the maximum likelihood estimates (MLEs, for short) of the unknown parameters of distributions used in quality and reliability analysis. It is well known that MLEs are widely used to estimate the unknown parameters of probability distributions because of their many desirable properties; for example, MLEs are asymptotically unbiased, consistent, and asymptotically normal. However, many of these properties depend on extremely large sample sizes. Properties such as unbiasedness may not be valid for small or even moderate sample sizes, which are more common in real data applications. Therefore, bias-corrected techniques for the MLEs are desired in practice, especially when the sample size is small. Two commonly used techniques to reduce the bias of the MLEs are the ‘preventive’ and ‘corrective’ approaches. Both can reduce the bias of the MLEs to order O(n⁻²), but the ‘preventive’ approach does not have an explicit closed-form expression. Consequently, we mainly focus on the ‘corrective’ approach in this report. To illustrate the importance of bias correction in practice, we apply the bias-corrected method to two popular lifetime distributions: the inverse Lindley distribution and the weighted Lindley distribution. Numerical studies based on the two distributions show that the considered bias-corrected technique is highly recommended over commonly used estimators without bias correction. Therefore, special attention should be paid when estimating the unknown parameters of probability distributions when the sample size is small or moderate.
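As a minimal illustration of the ‘corrective’ idea (not the report's inverse- or weighted-Lindley computations), consider the exponential distribution, where the MLE of the rate is biased upward and a multiplicative factor removes the first-order bias; all function names below are illustrative:

```python
import random

def mle_rate(sample):
    # MLE of the exponential rate: lambda_hat = n / sum(x_i)
    return len(sample) / sum(sample)

def corrected_rate(sample):
    # 'Corrective' adjustment: E[lambda_hat] = n*lambda/(n-1), so the factor
    # (n-1)/n removes the O(1/n) bias (exactly, in this special case;
    # in general the Cox-Snell correction leaves a residual of order O(n^-2)).
    n = len(sample)
    return (n - 1) / n * mle_rate(sample)

def average_bias(estimator, true_rate=2.0, n=10, reps=20000, seed=1):
    # Monte Carlo estimate of the bias of a rate estimator at sample size n.
    rng = random.Random(seed)
    total = 0.0
    for _ in range(reps):
        sample = [rng.expovariate(true_rate) for _ in range(n)]
        total += estimator(sample) - true_rate
    return total / reps
```

With n = 10 and true rate 2.0, the uncorrected MLE overshoots by roughly λ/(n−1) ≈ 0.22 on average, while the corrected estimator's bias is close to zero, illustrating why the correction matters at small sample sizes.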
Abstract:
This thesis presents rigorous evidence on the effectiveness of public policies using experimental and quasi-experimental methodologies. It begins with a comprehensive introduction and a rigorous review of the methodologies used in the subsequent data analysis. The first chapter, "Soft Skills and Technical Skills in Youth Training Programs: Long-Term Experimental Evidence from the Dominican Republic", evaluates the impact of a youth employment program on a series of outcomes of interest. The program offers training in vocational skills and non-cognitive skills to young people at risk of social exclusion. Notably, the methodology used to evaluate the program is a randomized controlled trial, which provides robust evidence of the program's causal effect. While previous studies have analyzed the impact of related youth programs, no earlier study had evaluated effects 4 years after the program's implementation. This is an important contribution, because the short-term gains of several development programs have been shown not to be sustained over time. That is also what this study finds for labor-market outcomes: while the program generates a short-term improvement in employment outcomes for women, this effect dissipates in the long run. However, the program appears to lead to persistent changes in women's labor-market expectations: women who attended the training report a more optimistic view of labor-market prospects up to 4 years after the program...
Abstract:
A number of laws in Canada which uphold rights are referred to as quasi-constitutional by the courts in recognition of their special importance. Quasi-constitutional statutes are enacted through the regular legislative process, yet they are interpreted and applied in a fashion remarkably similar to constitutional law, and therefore have an important effect on other legislation. Quasi-constitutionality has received surprisingly limited scholarly attention, and very few serious attempts at explaining its significance have been made. This dissertation undertakes a comprehensive study of quasi-constitutionality that considers its theoretical basis, its interpretation and legal significance, and its similarities to comparable forms of law in other Commonwealth jurisdictions. Part I examines the theoretical basis of quasi-constitutionality and its relationship to the Constitution. As a statutory and common-law form of fundamental law, quasi-constitutionality is shown to signify an association with the Canadian Constitution and the foundational principles that underpin it. Part II considers the special rules of interpretation applied to quasi-constitutional legislation, the basis of this interpretative approach, and the connection between the interpretation of similar provisions in quasi-constitutional legislation and the Constitution. As a statutory form of fundamental law, quasi-constitutional legislation is given a broad, liberal, and purposive interpretation which significantly expands the rights it protects. The theoretical basis of this approach is found both in the fundamental nature of the rights upheld by quasi-constitutional legislation and in legislative intent. Part III explores how quasi-constitutional statutes affect the interpretation of regular legislation and how they are used for the purposes of judicial review.
Quasi-constitutional legislation has a significant influence on regular statutes in the interpretative exercise, which in some instances results in conflicting statutes being declared inoperable. The basis of this form of judicial review is shown to be rooted in statutory interpretation, and as such it offers an interesting model of rights protection and judicial review that does not rest on constitutional and judicial supremacy.
Abstract:
Numerical techniques such as the Boundary Element Method, the Finite Element Method, and the Finite Difference Time Domain method have been used widely to investigate plane and curved wave-front scattering by rough surfaces. For certain shapes of roughness elements (cylinders, semi-cylinders, and ellipsoids) there are semi-analytical alternatives. Here, we present a theory for multiple scattering by cylinders on a hard surface to investigate the effects of roughness-element shape, vacancies, and variation in roughness-element size on the excess attenuation due to a periodically rough surface.
Abstract:
The objective of this work was to apply fuzzy majority multicriteria group decision-making to determine risk areas for foot-and-mouth disease (FMD) introduction along the border between Brazil and Paraguay. The study was conducted in three municipalities in the state of Mato Grosso do Sul, Brazil, located along the border with Paraguay. Four scenarios were built, applying the following linguistic quantifiers to describe risk factors: few, half, many, and most. The three criteria considered most likely to affect vulnerability to the introduction of FMD, according to experts' opinions, were: the introduction of animals into the farm, the distance from the border, and the type of property settlements. The resulting maps show a strong spatial heterogeneity in the risk of FMD introduction. The methodology provides a new approach that can help policy makers in the combat and eradication of FMD.
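Fuzzy-majority aggregation of this kind is commonly implemented with Yager's quantifier-guided OWA operators, in which a linguistic quantifier such as "most" is turned into aggregation weights. The sketch below uses illustrative textbook parameters for the "most" quantifier, not necessarily the exact membership functions of this study:

```python
def quantifier_most(r):
    # Piecewise-linear membership for the linguistic quantifier "most"
    # (a common textbook choice: 0 below 0.3, 1 above 0.8, linear between).
    a, b = 0.3, 0.8
    if r <= a:
        return 0.0
    if r >= b:
        return 1.0
    return (r - a) / (b - a)

def owa_weights(quantifier, n):
    # Yager's quantifier-guided OWA weights: w_i = Q(i/n) - Q((i-1)/n)
    return [quantifier(i / n) - quantifier((i - 1) / n) for i in range(1, n + 1)]

def owa(scores, quantifier):
    # Ordered weighted averaging: weights are applied to the criterion
    # scores sorted in descending order, not to fixed criteria.
    w = owa_weights(quantifier, len(scores))
    ordered = sorted(scores, reverse=True)
    return sum(wi * si for wi, si in zip(w, ordered))
```

For a farm with three criterion scores in [0, 1] (e.g., animal introduction, border distance, settlement type), `owa([0.9, 0.4, 0.7], quantifier_most)` yields a single risk score lying between the minimum and maximum criterion values.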
Abstract:
The first paper sheds light on the informational content of high-frequency data and daily data. I assess the economic value of the two families of models by comparing their performance in forecasting asset volatility through the Value at Risk metric. In running the comparison, this paper introduces two key assumptions: jumps in prices and a leverage effect in volatility dynamics. Findings suggest that high-frequency data models do not exhibit a superior performance over daily data models. In the second paper, building on Majewski et al. (2015), I propose an affine discrete-time model, labeled VARG-J, characterized by a multifactor volatility specification. In the VARG-J model, volatility experiences periods of extreme movements through a jump factor modeled as an Autoregressive Gamma Zero process. The estimation under the historical measure is done by quasi-maximum likelihood and the Extended Kalman Filter. This strategy allows both volatility factors to be filtered out by introducing a measurement equation that relates realized volatility to latent volatility. The risk-premia parameters are calibrated using call options written on the S&P 500 Index. The results clearly illustrate the important contribution of the jump factor to the pricing performance of options and the economic significance of the volatility jump risk premia. In the third paper, I analyze whether there is empirical evidence of contagion at the bank level, measuring the direction and the size of contagion transmission between European markets. In order to understand and quantify contagion transmission in the banking market, I estimate the econometric model of Aït-Sahalia et al. (2015), in which contagion is defined as the within- and between-country transmission of shocks and asset returns are modeled directly as a Hawkes jump-diffusion process. The empirical analysis indicates clear evidence of contagion from Greece to European countries as well as self-contagion in all countries.
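Contagion in the Aït-Sahalia et al. (2015) framework is driven by mutually exciting jumps. A minimal univariate sketch of the building block, a Hawkes process with exponential kernel simulated by Ogata's thinning, is shown below with hypothetical parameters (the actual model is multivariate and includes a diffusion component):

```python
import math
import random

def simulate_hawkes(mu, alpha, beta, horizon, seed=0):
    # Ogata thinning for a univariate Hawkes process with intensity
    #   lam(t) = mu + sum_{t_i < t} alpha * exp(-beta * (t - t_i)),
    # stationary when alpha < beta.  Between events the intensity only
    # decays, so its value at the current time is a valid upper bound.
    rng = random.Random(seed)
    events, t = [], 0.0
    while True:
        lam_bar = mu + sum(alpha * math.exp(-beta * (t - ti)) for ti in events)
        t += rng.expovariate(lam_bar)      # candidate inter-event time
        if t >= horizon:
            return events
        lam_t = mu + sum(alpha * math.exp(-beta * (t - ti)) for ti in events)
        if rng.random() <= lam_t / lam_bar:
            events.append(t)               # accept: an event (a "shock")
```

With `mu=0.5, alpha=0.8, beta=1.5`, the branching ratio alpha/beta ≈ 0.53, meaning roughly half of all events are excited by previous events, which is the self-contagion mechanism the abstract refers to.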
Abstract:
The objective of this thesis is the investigation of the Mode-I fracture mechanics parameters of quasi-brittle materials, to shed light on the influence of the width and size of the specimen on the fracture response of notched beams. To further the knowledge of the fracture process, 3D digital image correlation (DIC) was employed. A new method is proposed to determine experimentally the critical value of the crack opening, which is then used to determine the size of the fracture process zone (FPZ). In addition, the Mode-I fracture mechanics parameters are compared with the Mode-II interfacial properties of composite materials whose matrices are the quasi-brittle materials studied in Mode-I conditions. To investigate the Mode-II fracture parameters, single-lap direct shear tests are performed. Notched concrete beams with six cross-sections have been tested using a three-point bending (TPB) test set-up (Mode-I fracture mechanics). Two depths and three widths of the beam are considered. In addition to concrete beams, alkali-activated mortar beams (AAMs) that differ by the type and size of the aggregates have been tested using the same TPB set-up. Two dimensions of AAMs are considered. The load-deflection response obtained from DIC is compared with the load-deflection response obtained from the readings of two linear variable displacement transformers (LVDTs). Load responses, peak loads, strain profiles along the ligament from DIC, fracture energy, and failure modes of the TPB tests are discussed. The Mode-II problem is investigated by testing steel-reinforced grout (SRG) composites bonded to masonry and concrete elements under single-lap direct shear tests. Two types of anchorage systems are proposed for SRG-reinforced masonry and concrete elements to study their effectiveness. An indirect method is proposed to find the interfacial properties, to compare them with the Mode-I fracture properties of the matrix, and to model the effect of the anchorage.
Abstract:
This work aims to provide a theoretical examination of three recently created bodies of the United Nations mandated to investigate the alleged international crimes committed in Syria (IIIM), Iraq (UNITAD), and Myanmar (IIMM). Established as a compromise solution amid the paralysis of international criminal jurisdictions, these essentially overlapping entities have been depicted as a ‘new generation’ of UN investigative mechanisms. While non-judicial in nature, they indeed depart from traditional commissions of inquiry in several respects owing to their increased criminal or ‘quasi-prosecutorial’ character. After clarifying their legal basis and different mandating authorities, a comparative institutional analysis is carried out in order to ascertain whether these ‘mechanisms’ can be said to effectively represent a new institutional model. Through an in-depth assessment of their mandates, the thesis also outlines both the strengths and the criticalities of these organs. Given their aim of facilitating criminal proceedings by sharing information and case files, it is suggested that more attention should be paid to the position of the person under investigation. To this end, some proposals are made to enhance the mechanisms' frameworks, especially from the angle of procedural safeguards. As a third aspect, the cooperation with judicial authorities is explored, in order to shed light on the actors involved, the relevant legal instruments, and the possible obstacles, in particular from a human rights perspective. Ultimately, drawing on the detected issues, the thesis seeks to identify lessons learned that could be taken into account should new ad hoc investigative mechanisms, or a permanent institution of this kind, be created.
Abstract:
The Deep Underground Neutrino Experiment (DUNE) is a long-baseline accelerator experiment designed to make a significant contribution to the study of neutrino oscillations with unprecedented sensitivity. The main goal of DUNE is the determination of the neutrino mass ordering and the leptonic CP violation phase, key parameters of the three-neutrino flavor mixing that have yet to be determined. An important component of the DUNE Near Detector complex is the System for on-Axis Neutrino Detection (SAND) apparatus, which will include GRAIN (GRanular Argon for Interactions of Neutrinos), a novel liquid Argon detector aimed at imaging neutrino interactions using only scintillation light. For this purpose, an innovative optical readout system based on Coded Aperture Masks is investigated. This dissertation aims to demonstrate the feasibility of reconstructing particle tracks and the topology of CCQE (Charged Current Quasi Elastic) neutrino events in GRAIN with such a technique. To this end, the development and implementation of a reconstruction algorithm based on Maximum Likelihood Expectation Maximization was carried out to directly obtain a three-dimensional distribution proportional to the energy deposited by charged particles crossing the LAr volume. This study includes the evaluation of the design of several camera configurations and the simulation of a multi-camera optical system in GRAIN.
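The MLEM update used for this kind of reconstruction can be sketched in a few lines. Below is a toy dense-matrix version assuming a Poisson model y ≈ A x with x ≥ 0; in the dissertation, the system matrix would encode the coded-aperture camera geometry and x the 3D energy-deposit distribution:

```python
def mlem(A, y, n_iter=50):
    # Maximum Likelihood Expectation Maximization for y ~ Poisson(A x), x >= 0.
    # Multiplicative update (emission-tomography form):
    #   x_j <- x_j / s_j * sum_i A_ij * y_i / (A x)_i,   s_j = sum_i A_ij
    m, n = len(A), len(A[0])
    x = [1.0] * n                                   # strictly positive start
    sens = [sum(A[i][j] for i in range(m)) for j in range(n)]
    for _ in range(n_iter):
        proj = [sum(A[i][j] * x[j] for j in range(n)) for i in range(m)]
        ratio = [y[i] / proj[i] if proj[i] > 0.0 else 0.0 for i in range(m)]
        x = [x[j] / sens[j] * sum(A[i][j] * ratio[i] for i in range(m))
             for j in range(n)]
    return x
```

The multiplicative form guarantees that the estimate stays non-negative at every iteration, which is why MLEM is a natural fit for imaging quantities such as deposited energy that cannot be negative.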
Abstract:
An emerging technology that Smart Radio Environments rely on to improve wireless link quality is the Reconfigurable Intelligent Surface (RIS). A RIS can generally be understood as a thin layer of EM composite material, typically mounted on the walls or ceilings of buildings, which can be reconfigured even after its deployment in the network. RISs made by composing artificial materials in an engineered way, in order to obtain unconventional characteristics, are called metasurfaces. By programming the RIS, it is possible to control and/or modify the radio waves that impinge on it, thus shaping the radio environment. To overcome the limitations of RISs, the metaprism represents an alternative: it is a passive, non-reconfigurable, frequency-selective metasurface that acts as a metamirror to improve the efficiency of the wireless link. In particular, using OFDM (Orthogonal Frequency-Division Multiplexing) signaling, it is possible to control the reflection of the signal by suitably selecting the sub-carrier assigned to each user, without having to interact with the metaprism or estimate the CSI. This thesis investigates how OFDM signaling and the metaprism can be used for localization purposes, especially to extend the coverage area at low cost, in a scenario where the user is in NLoS (non-line-of-sight) conditions with respect to the base station, both equipped with a single antenna. In particular, the work concerns the design of the analytical model and the corresponding Matlab implementation of a Maximum Likelihood (ML) estimator able to estimate the unknown position, behind an obstacle, from which a generic user transmits to a base station, exploiting the metaprism.
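As a generic illustration of grid-search ML position estimation (a hypothetical range-based Gaussian measurement model, not the thesis's OFDM/metaprism signal model; all names are illustrative):

```python
import math

def ml_position(anchors, measured_ranges, grid, step):
    # Grid-search ML position estimate.  Assumes each measured range is
    #   r_k = ||p - a_k|| + i.i.d. Gaussian noise,
    # so maximizing the likelihood over p is equivalent to minimizing the
    # sum of squared range residuals over a search grid (x0, x1, y0, y1).
    x0, x1, y0, y1 = grid
    best, best_cost = None, float("inf")
    nx = int(round((x1 - x0) / step)) + 1
    ny = int(round((y1 - y0) / step)) + 1
    for i in range(nx):
        for j in range(ny):
            px, py = x0 + i * step, y0 + j * step
            cost = sum((math.hypot(px - ax, py - ay) - r) ** 2
                       for (ax, ay), r in zip(anchors, measured_ranges))
            if cost < best_cost:
                best, best_cost = (px, py), cost
    return best
```

In the thesis's setting the "likelihood surface" would instead be built from the metaprism-reflected OFDM observations, but the search structure, evaluating a likelihood over candidate positions and keeping the maximizer, is the same.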
Abstract:
This is an ecological, analytical, and retrospective study comprising the 645 municipalities in the State of São Paulo, the scope of which was to determine the relationship between socioeconomic and demographic variables, the model of care, and infant mortality rates in the period from 1998 to 2008. The average annual rate of change of each indicator was calculated per coverage stratum. Infant mortality was analyzed according to a model for repeated measures over time, adjusted for the following correction variables: the city's population, the proportion of Family Health Programs (PSFs) deployed, the proportion of Growth Acceleration Programs (PACs) deployed, per capita GDP, and the SPSRI (São Paulo social responsibility index). The analysis was performed with generalized linear models, assuming a gamma distribution. Multiple comparisons were performed with likelihood-ratio tests using an approximate chi-square distribution, at a significance level of 5%. There was a decrease in infant mortality over the years (p < 0.05), with no significant difference from 2004 to 2008 (p > 0.05). The proportion of PSFs deployed (p < 0.0001) and per capita GDP (p < 0.0001) were significant in the model. The decline of infant mortality in this period was influenced by the growth of per capita GDP and of the PSFs.