992 results for Spectral Gap Problems
Abstract:
Double degree
Abstract:
In-Band Full-DupleX (IB-FDX) is defined as the ability of nodes to transmit and receive signals simultaneously on the same channel. Conventional digital wireless networks do not implement it, since a node's own transmission signal interferes with the signal it is trying to receive. However, recent studies attempt to overcome this obstacle, since IB-FDX can potentially double the spectral efficiency of current wireless networks. Several mechanisms exist today that can remove a significant part of the Self-Interference (SI), although specially tuned Medium Access Control (MAC) protocols are required to make the best use of them. One of IB-FDX's biggest problems is that the nodes' interference range is extended, meaning the space rendered unusable for other transmissions and receptions is broader. This dissertation proposes using MultiPacket Reception (MPR) to address this issue and adapts an existing Single-Carrier with Frequency-Domain Equalization (SC-FDE) receiver to IB-FDX. The performance analysis suggests that MPR and IB-FDX have a strong synergy and achieve higher data rates when used together. Analytical models were used to identify the optimal transmission patterns and transmission power, which maximize the channel capacity with minimal energy consumption. These results were used to define a new MAC protocol, named Full-duplex Multipacket reception Medium Access Control (FM-MAC). FM-MAC was designed for a single-hop cellular infrastructure, where the Access Point (AP) and the terminals implement both IB-FDX and MPR. It divides the coverage range of the AP into a closer Full-DupleX (FDX) zone and a farther Half-DupleX (HDX) zone and adds a tunable fairness mechanism to avoid terminal starvation. Simulation results show that the protocol provides efficient support for both HDX and FDX terminals, maximizing its capacity when more FDX terminals are used.
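A minimal sketch of the zone-splitting rule the abstract describes, assuming a hypothetical distance threshold `fdx_radius` below which terminals may operate in full-duplex; the names and the threshold rule are illustrative, not the dissertation's actual protocol logic:

```python
# Sketch of FM-MAC-style zone assignment: terminals close to the AP go
# to the FDX zone, farther ones to the HDX zone. Illustrative only.

def assign_zone(distance_to_ap: float, fdx_radius: float) -> str:
    """Classify a terminal into the closer FDX zone or the farther HDX zone."""
    return "FDX" if distance_to_ap <= fdx_radius else "HDX"

terminals = {"t1": 30.0, "t2": 120.0, "t3": 75.0}  # distances to the AP (m)
zones = {name: assign_zone(d, fdx_radius=80.0) for name, d in terminals.items()}
print(zones)  # {'t1': 'FDX', 't2': 'HDX', 't3': 'FDX'}
```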
Abstract:
The increasing disparity between executive compensation and that of the average worker (the pay gap) has generated a fierce debate about its causes and effects. This paper studies the determinants and performance effects of the pay gap through the prism of Tournament Incentives and Equity Fairness Theory. Results show that the size of the pay gap is driven primarily by the size of the firm and the standards of its industry, and also by the unionization rate and by whether the Chairman is also the CEO. The paper concludes by showing that the pay gap has a positive effect on firm performance in the United States.
Abstract:
According to a recent Eurobarometer survey (2014), 68% of Europeans tend not to trust national governments. As the increasing alienation of citizens from politics endangers democracy and welfare, governments, practitioners and researchers look for innovative means to engage citizens in policy matters. One of the measures intended to overcome the so-called democratic deficit is the promotion of civic participation. Digital media proliferation offers a set of novel characteristics related to interactivity, ubiquitous connectivity, social networking and inclusiveness that enable new forms of society-wide collaboration with a potential impact on leveraging participative democracy. Following this trend, e-Participation is an emerging research area that consists of the use of Information and Communication Technologies to mediate and transform the relations between citizens and governments towards increasing citizens' participation in public decision-making. However, despite the widespread efforts to implement e-Participation through research programs, new technologies and projects, exhaustive studies of the achieved outcomes reveal that it has not yet been successfully incorporated in institutional politics. Given the problems underlying e-Participation implementation, the present research suggested that, rather than project-oriented efforts, the cornerstone for successfully implementing e-Participation in public institutions as a sustainable, value-adding activity is systematic organisational planning, embodying the principles of open governance and open engagement. It further suggested that Business Process Management (BPM), as a management discipline, can act as a catalyst to enable the desired transformations towards value creation throughout the policy-making cycle, including political, organisational and, ultimately, citizen value. Following these findings, the primary objective of this research was to provide an instrumental model to foster e-Participation sustainability across Government and Public Administration towards a participatory, inclusive, collaborative and deliberative democracy. The developed artefact, consisting of an e-Participation Organisational Semantic Model (ePOSM) underpinned by a BPM-steered approach, introduces this vision. This approach to e-Participation was modelled through a semi-formal lightweight ontology stack structured in four sub-ontologies, namely e-Participation Strategy, Organisational Units, Functions and Roles. The ePOSM facilitates e-Participation sustainability by: (1) promoting a common and cross-functional understanding of the concepts underlying e-Participation implementation, and of their articulation, that bridges the gap between technical and non-technical users; (2) providing an organisational model which allows a centralised and consistent roll-out of strategy-driven e-Participation initiatives, supported by operational units dedicated to the execution of transformation projects and participatory processes; (3) providing a standardised organisational structure, goals, functions and roles related to e-Participation processes that enhances process-level interoperability among government agencies; (4) providing a representation usable in software development for business process automation, which allows advanced querying with a reasoner or inference engine to retrieve concrete and specific information about the e-Participation processes in place.
An evaluation of the achieved outcomes, as well as a comparative analysis with existing models, suggested that this innovative approach, tackling the organisational planning dimension, can constitute a stepping stone to harness e-Participation value.
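Point (4) above describes querying the ePOSM with a reasoner or inference engine. A minimal sketch of what such a machine-readable representation might look like, using Python's rdflib; the eposm namespace, class names and instances are invented for illustration, not the actual ePOSM vocabulary:

```python
# Toy illustration of querying a lightweight ontology with SPARQL.
# The eposm: namespace, classes and instances below are hypothetical.
from rdflib import Graph, Literal, Namespace, RDF, RDFS

EPOSM = Namespace("http://example.org/eposm#")
g = Graph()
g.bind("eposm", EPOSM)

# The abstract names four sub-ontologies; sketch one class for each area.
for cls in ("Strategy", "OrganisationalUnit", "Function", "Role"):
    g.add((EPOSM[cls], RDF.type, RDFS.Class))

# A hypothetical operational unit dedicated to participatory processes.
g.add((EPOSM.ParticipationOffice, RDF.type, EPOSM.OrganisationalUnit))
g.add((EPOSM.ParticipationOffice, RDFS.label, Literal("Participation Office")))

# Retrieve every organisational unit and its label.
results = g.query("""
    SELECT ?unit ?label WHERE {
        ?unit a eposm:OrganisationalUnit ;
              rdfs:label ?label .
    }""")
for unit, label in results:
    print(unit, label)
```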
Abstract:
The Electrohysterogram (EHG) is a new instrument for pregnancy monitoring. It measures the uterine muscle electrical signal, which is closely related to uterine contractions. The EHG is described as a viable alternative to, and a more precise instrument than, the external tocogram, currently the most widely used method for describing uterine contractions. The EHG has also been indicated as a promising tool in the assessment of preterm delivery risk. This work intends to contribute towards the characterization of the EHG through an inventory of its components, which are:
• Contractions;
• Labor contractions;
• Alvarez waves;
• Fetal movements;
• Long Duration Low Frequency Waves.
The instruments used for cataloging were parametric and non-parametric spectral analysis, energy estimators, time-frequency methods, and the tocogram annotated by expert physicians. The EHG recordings and respective tocograms were obtained from the Icelandic 16-electrode Electrohysterogram Database, and 288 components were classified. No component database of this type was previously available for consultation. The spectral analysis and power estimation module was added to Uterine Explorer, an EHG analysis software package developed at FCT-UNL. The importance of this component database lies in the need to improve the understanding of the EHG, which is a relatively complex signal, and to contribute towards the detection of preterm birth. Preterm birth accounts for 10% of all births and is one of the most relevant obstetric conditions. Despite the technological and scientific advances in perinatal medicine, prematurity is the major cause of neonatal death in developed countries. Although various risk factors, such as previous preterm births, infection, uterine malformations, multiple gestation and a short uterine cervix in the second trimester, have been associated with this condition, its etiology remains unknown [1][2][3].
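As a concrete illustration of the non-parametric spectral analysis mentioned above, a minimal sketch using SciPy's Welch estimator on a synthetic stand-in signal; the sampling rate, segment length and signal content are illustrative assumptions, not the thesis settings:

```python
# Non-parametric power spectral density estimate (Welch's method) on a
# synthetic signal; fs and nperseg are illustrative, not thesis values.
import numpy as np
from scipy.signal import welch

fs = 20.0                      # sampling frequency in Hz (assumed)
t = np.arange(0, 600, 1 / fs)  # ten minutes of signal
# Synthetic stand-in for an EHG trace: slow oscillation plus noise.
x = np.sin(2 * np.pi * 0.4 * t) + 0.5 * np.random.randn(t.size)

f, pxx = welch(x, fs=fs, nperseg=1024)
peak = f[np.argmax(pxx)]
print(f"dominant frequency: {peak:.2f} Hz")
```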
Abstract:
"The idea that social processes develop in a cyclical manner is somewhat like a 'Lorelei'. Researchers are lured to it because of its theoretical promise, only to become entangled in (if not wrecked by) messy problems of empirical inference. The reasoning leading to hypotheses of some kind of cycle is often elegant enough, yet the data from repeated observations rarely display the supposed cyclical pattern. (...) In addition, various 'schools' seem to exist which frequently arrive at different conclusions on the basis of the same data." (van der Eijk and Weber 1987:271). Much of the empirical controversy around these issues arises because of three distinct problems: the coexistence of cycles of different periodicities, the possibility of transient cycles and the existence of cycles without fixed periodicity. In some cases, there are no reasons to expect any of these phenomena to be relevant. Seasonality caused by Christmas is one such example (Wen 2002). In such cases, researchers mostly rely on spectral analysis and Auto-Regressive Moving-Average (ARMA) models to estimate the periodicity of cycles. However, and this is particularly true in the social sciences, sometimes there are good theoretical reasons to expect irregular cycles. In such cases, "the identification of periodic movement in something like the vote is a daunting task all by itself. When a pendulum swings with an irregular beat (frequency), and the extent of the swing (amplitude) is not constant, mathematical functions like sine-waves are of no use." (Lebo and Norpoth 2007:73). In the past, this difficulty has led to two different approaches. On the one hand, some researchers dismissed these methods altogether, relying on informal alternatives that do not meet rigorous standards of statistical inference; Goldstein (1985 and 1988), studying the severity of Great Power wars, is one such example. On the other hand, there are authors who transfer the assumptions of spectral analysis (and ARMA models) into fundamental assumptions about the nature of social phenomena. This type of argument was produced by Beck (1991) who, in a reply to Goldstein (1988), claimed that only "fixed period models are meaningful models of cyclic phenomena". We argue that wavelet analysis, a mathematical framework developed in the mid-1980s (Grossman and Morlet 1984; Goupillaud et al. 1984), is a very viable alternative for studying cycles in political time-series. It has the advantage of staying close to the frequency-domain approach of spectral analysis while addressing its main limitations. Its principal contribution comes from estimating the spectral characteristics of a time-series as a function of time, thus revealing how its different periodic components may change over time. The rest of the article proceeds as follows. In the section "Time-frequency Analysis", we study in some detail the continuous wavelet transform and compare its time-frequency properties with the more standard tool for that purpose, the windowed Fourier transform. In the section "The British Political Pendulum", we apply wavelet analysis to essentially the same data analyzed by Lebo and Norpoth (2007) and Merrill, Grofman and Brunell (2011) and try to provide a more nuanced answer to the same question discussed by these authors: do British electoral politics exhibit cycles? Finally, in the last section, we present a concise list of future directions.
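A minimal sketch of the continuous wavelet transform this article builds on, using the PyWavelets library and a Morlet wavelet on a synthetic series whose period changes over time; the scale range and all parameter choices are illustrative:

```python
# Continuous wavelet transform of a synthetic series whose period shifts
# halfway through; scales and wavelet choice are illustrative assumptions.
import numpy as np
import pywt

dt = 1.0                         # one observation per time unit
t = np.arange(1024) * dt
# First half oscillates with period ~16, second half with period ~32.
x = np.where(t < 512, np.sin(2 * np.pi * t / 16), np.sin(2 * np.pi * t / 32))

scales = np.arange(1, 128)
coeffs, freqs = pywt.cwt(x, scales, "morl", sampling_period=dt)

# Power as a function of (scale, time): the dominant frequency differs
# between the two halves, which a global Fourier spectrum would blur.
power = np.abs(coeffs) ** 2
print(freqs[power[:, 100].argmax()], freqs[power[:, 900].argmax()])
```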
Abstract:
This paper presents a critical and quantitative analysis of the influence of Power Quality on grid-connected solar photovoltaic microgeneration installations. First, the main regulations and legislation related to solar photovoltaic microgeneration in Portugal and Europe are introduced. Next, Power Quality monitoring results obtained from two residential solar photovoltaic installations located in the north of Portugal are presented, and it is explained how Power Quality events affect the operation of these installations. Afterwards, a methodology is described to estimate the energy production losses and the impact on revenue caused by the abnormal operation of the electrical installation. This is done by comparing the amount of energy that was injected into the power grid with the theoretical amount of energy that could be injected under normal conditions. The performed analysis shows that Power Quality severely affects the operation of solar photovoltaic installations. The revenue losses in the two monitored installations, M1 and M2, are estimated at about 27% and 22%, respectively.
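A minimal sketch of the loss-estimation arithmetic described above; the energy figures and feed-in tariff are made-up illustrative numbers, not the monitored values from M1 or M2:

```python
# Estimate production loss and revenue impact by comparing injected
# energy with the theoretical energy under normal operation.
# All numbers below are illustrative, not the paper's measurements.

e_theoretical_kwh = 4200.0   # energy producible under normal conditions
e_injected_kwh = 3066.0      # energy actually injected into the grid
tariff_eur_per_kwh = 0.25    # assumed feed-in tariff

loss_kwh = e_theoretical_kwh - e_injected_kwh
loss_ratio = loss_kwh / e_theoretical_kwh
lost_revenue = loss_kwh * tariff_eur_per_kwh

print(f"production loss: {loss_ratio:.1%}")       # 27.0%
print(f"lost revenue: {lost_revenue:.2f} EUR")    # 283.50 EUR
```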
Abstract:
During recent decades it has been possible to identify several problems in construction industry project management, related to systematic failures in meeting schedule, cost and quality targets, which highlight the need to evaluate the factors that may cause these failures. It is therefore important to understand how project managers plan their projects, so that performance and results can be improved. It is also important to understand whether areas beyond cost and time management, which several studies identify as the most critical, receive the necessary attention from construction project managers. Although cost and time are the most sensitive areas, there are several other factors that may lead to project failure. This study aims to understand the reasons that may cause deviations in terms of cost, time and quality, from the project management point of view, looking at the knowledge areas defined by the PMI (Project Management Institute).
Abstract:
The artificial fish swarm algorithm has recently emerged in continuous global optimization. It uses points of a population in space to identify the position of fish in the school. Many real-world optimization problems are described by 0-1 multidimensional knapsack problems, which are NP-hard. In recent decades several exact as well as heuristic methods have been proposed for solving these problems. In this paper, a new simplified binary version of the artificial fish swarm algorithm is presented, where a point/fish is represented by a binary string of 0/1 bits. Trial points are created by using crossover and mutation in the different fish behaviors, which are randomly selected using two user-defined probability values. In order to make the points feasible, the presented algorithm uses a random heuristic drop-item procedure followed by an add-item procedure that aims to increase the profit by adding more items to the knapsack. A cyclic reinitialization of 50% of the population, and a simple local search that allows a small percentage of points to progress towards optimality and then refines the best point in the population, greatly improve the quality of the solutions. The presented method is tested on a set of benchmark instances and compared with other methods available in the literature. The comparison shows that the proposed method can be an alternative method for solving these problems.
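A minimal sketch of the drop-item/add-item repair step described above for the 0-1 multidimensional knapsack; the greedy add order and data layout are one reasonable reading, not the paper's exact procedure:

```python
# Repair an infeasible 0-1 multidimensional knapsack solution:
# randomly drop selected items until feasible, then greedily add
# items back while capacity allows. Illustrative, not the paper's code.
import numpy as np

rng = np.random.default_rng(0)

def repair(x, profits, weights, capacities):
    """x: 0/1 vector; weights: (m constraints, n items); capacities: (m,)."""
    x = x.copy()
    # Drop phase: remove random selected items until all constraints hold.
    while np.any(weights @ x > capacities):
        x[rng.choice(np.flatnonzero(x))] = 0
    # Add phase: try items in decreasing profit order, keep if feasible.
    for j in np.argsort(-profits):
        if x[j] == 0 and np.all(weights @ x + weights[:, j] <= capacities):
            x[j] = 1
    return x

profits = np.array([10.0, 7.0, 5.0, 3.0])
weights = np.array([[4.0, 3.0, 2.0, 1.0],    # constraint 1
                    [2.0, 4.0, 3.0, 1.0]])   # constraint 2
capacities = np.array([6.0, 6.0])
print(repair(np.ones(4, dtype=int), profits, weights, capacities))
```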
Abstract:
The Firefly Algorithm is a recent swarm intelligence method, inspired by the social behavior of fireflies and based on their flashing and attraction characteristics [1, 2]. In this paper, we analyze the implementation of a dynamic penalty approach combined with the Firefly Algorithm for solving constrained global optimization problems. In order to assess the applicability and performance of the proposed method, some benchmark problems from engineering design optimization are considered.
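A minimal sketch of a dynamic penalty of the kind the abstract refers to, following the common Joines-and-Houck form in which the penalty weight grows with the iteration counter; the coefficients, exponents and toy problem are illustrative assumptions, not the paper's formulation:

```python
# Dynamic penalty for constrained minimization: the penalized objective
# is f(x) + (C*k)**alpha * sum(max(0, g_i(x))**beta), where k is the
# iteration number, so infeasibility costs more as the search progresses.
# C, alpha, beta and the toy constraint are illustrative assumptions.
import numpy as np

def penalized(f, constraints, x, k, C=0.5, alpha=2.0, beta=2.0):
    violation = sum(max(0.0, g(x)) ** beta for g in constraints)
    return f(x) + (C * k) ** alpha * violation

# Toy problem: minimize x0^2 + x1^2 subject to x0 + x1 >= 1,
# written as g(x) = 1 - x0 - x1 <= 0.
f = lambda x: x[0] ** 2 + x[1] ** 2
g = [lambda x: 1.0 - x[0] - x[1]]

x = np.array([0.2, 0.2])                  # infeasible point
for k in (1, 10, 100):
    print(k, penalized(f, g, x, k))       # penalty grows with k
```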
Abstract:
Master's dissertation in Monetary, Banking and Financial Economics
Abstract:
In longitudinal studies of disease, patients may experience several events over a follow-up period. In these studies, the sequentially ordered events are often of interest and lead to problems that have received much attention recently. Issues of interest include the estimation of the bivariate survival function, of marginal distributions and of the conditional distribution of gap times. In this work we consider the estimation of the survival function conditional on a previous event. Different nonparametric approaches are considered for estimating these quantities, all based on the Kaplan-Meier estimator of the survival function. We explore the finite-sample behavior of the estimators through simulations. The different methods proposed in this article are applied to a data set from a German Breast Cancer Study. The methods are used to obtain predictors of the conditional survival probabilities as well as to study the influence of recurrence on overall survival.
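A minimal sketch of one standard way to build a Kaplan-Meier-based conditional survival estimate, using the identity S(t | T > s) = S(t) / S(s); the hand-rolled estimator and toy data are illustrative, not the article's specific estimators:

```python
# Kaplan-Meier survival estimate and the conditional survival
# S(t | T > s) = S(t) / S(s). Toy data; not the article's estimators.
import numpy as np

def km_survival(times, events, t):
    """Kaplan-Meier estimate of S(t) from (time, event-indicator) pairs."""
    s = 1.0
    for u in np.unique(times[(events == 1) & (times <= t)]):
        d = np.sum((times == u) & (events == 1))   # events at time u
        n = np.sum(times >= u)                     # at risk just before u
        s *= 1.0 - d / n
    return s

times = np.array([2.0, 3.0, 3.0, 5.0, 8.0, 9.0, 12.0, 15.0])
events = np.array([1, 1, 0, 1, 0, 1, 1, 0])  # 1 = event, 0 = censored

s5, s12 = km_survival(times, events, 5.0), km_survival(times, events, 12.0)
print(f"S(12) = {s12:.3f}, S(12 | T > 5) = {s12 / s5:.3f}")
```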
Abstract:
Doctoral thesis in Basic Psychology