996 results for "Negative dimension integration"
Abstract:
In this paper, the expression "neighbourhood policy" of the European Union (EU) is understood in a broad way which includes the members of the European Free Trade Association (EFTA) contracting parties to the European Economic Area (EEA), the EFTA State Switzerland, candidate states, the countries of the European Neighbourhood Policy (ENP), and Russia. The European Court of Justice (ECJ) is the centre of gravity in the judicial dimension of this policy. The innermost circle of integration after the EU itself comprises the EFTA States that are party to the European Economic Area. With the EFTA Court, they have their own common court. The existence of two courts – the ECJ and the EFTA Court – raises the question of homogeneity of the case law. The EEA homogeneity rules resemble those of the Lugano Convention. The EFTA Court is basically obliged to follow or take into account relevant ECJ case law. But even if the ECJ has gone first, there may be constellations where the EFTA Court comes to the conclusion that it must go its own way. Such constellations may arise if there is new scientific evidence, if the ECJ has left certain questions open, if there is relevant case law of the European Court of Human Rights, or if, in light of the specific circumstances of the case, there is room for "creative homogeneity". However, in the majority of its cases the EFTA Court is faced with novel legal questions. In such cases, the ECJ, its Advocates General and the Court of First Instance make reference to the EFTA Court's case law. The question may be posed whether the EEA could serve as a model for other regional associations. For the ENP states, candidate States and Russia this is hard to imagine. Their courts will to varying degrees look to the ECJ when interpreting the relevant agreements. The Swiss Government is – at least for the time being – unwilling to make a second attempt to join the EEA.
The European Commission has therefore proposed that Switzerland dock its sectoral agreements with the EU to the institutions of the EFTA pillar, the EFTA Surveillance Authority (ESA) and the EFTA Court. Switzerland would then negotiate the right to nominate a member of the ESA College and of the EFTA Court. The Swiss Government has, however, opted for another model. Swiss courts would continue to look to the ECJ, as they did in the past, and conflicts would also in the future be resolved by diplomatic means. But the ECJ would play a decisive role in dispute settlement. It would, upon unilateral request of one side, give an "authoritative" interpretation of EU law as incorporated into the relevant bilateral agreement. In a "Non-Paper" drafted by the chief negotiators, the interpretations of the ECJ are even characterised as binding. The decision-making power would, however, remain with the Joint Committees, where Switzerland could say no. The Swiss Government assumes that after a negative decision by the ECJ it would be able to negotiate a compromise solution with the Commission without the ECJ being able to express itself on the outcome. The Government has therefore not tried to emphasise that the ECJ would not be a foreign court. Whether the ECJ would accept its intended role is an open question. And if it did, the Swiss Government would have to explain to its voters that Switzerland retains the freedom to disregard such a binding decision and that for this reason the ECJ is not only no foreign court, but no adjudicating court at all.
Abstract:
The social dimension of the internal market, or of the EU more generally, has recently come under quite fundamental attack. Calls for 'Europe' to be 'more social' have been heard repeatedly. Witness the polarized debates about the services directive, the anxieties concerning several ECJ cases about what limitations of the free movement of workers (posted or not) are justified, or the assertion of a 'neo-liberal agenda' in Brussels disregarding or eroding the social dimension. This BEEP Briefing paper takes an analytical approach to these issues and to the possible 'framing' involved. Such an analysis reveals a very different picture from the one suggested by the negative framing in such debates: there is nothing particularly 'a-social' about the internal market or the EU at large. This overall conclusion is reached in five steps. First, several 'preliminaries' of the social dimension have to be kept in mind (including the two-tier regulatory and expenditure structure of what is too loosely called 'social Europe'), and this is all too rarely done, or done only in partial, hence misleading, ways. Second, the social acquis at EU and Member States' levels is spelled out, broken down into four aspects (social spending; labour market regulation; industrial relations; free movement and establishment). Assessing the EU acquis in the light of the two levels of powers shows clearly that it is the combination of the two levels which matters. Member States and e.g. labour unions do not want the EU level to become deeply involved (with some exceptions), and the actual impact of free movement and establishment is throttled by far-reaching host-country control and the treaty requirement of a 'high level of social protection'. Third, six anxieties about the social dimension of the internal market are discussed, and few arguments are found which are attributable to the EU or its allegedly weakening social dimension.
Fourth, another six anxieties are discussed, emerging from the socio-economic context of the social dimension of the EU at large. The analysis demonstrates that, even if these anxieties ought to be taken seriously, the EU is hardly, if at all, the culprit. Fifth, all this is complemented by a number of other facts and arguments strengthening the case that the EU social dimension is fine.
Abstract:
"April 2000."
Abstract:
"June 2000."
Abstract:
The focus of this study is on governance decisions in a concurrent channels context under uncertainty. The study examines how a firm chooses to deploy its sales force in times of uncertainty, and the subsequent performance outcome of those deployment choices. The theoretical framework is based on multiple theories of governance, including transaction cost analysis (TCA), agency theory, and institutional economics. Three uncertainty variables are investigated in this study. The first two are demand and competitive uncertainty, which are considered industry-level forms of market uncertainty. The third, political uncertainty, is chosen as it is an important dimension of institutional environments, capturing non-economic circumstances such as regulations and political systemic issues. The study employs longitudinal secondary data from a Thai hotel chain, comprising monthly observations from January 2007 to December 2012. This hotel chain operates in four countries – Thailand, the Philippines, the United Arab Emirates (Dubai), and Egypt – all of which experienced substantial demand, competitive, and political uncertainty during the study period. This makes them ideal contexts for this study. Two econometric models, both deploying Newey-West estimations, are employed to test 13 hypotheses. The first model considers the relationship between uncertainty and governance. The second model is a Newey-West variant using an Instrumental Variables (IV) estimator within a Two-Stage Least Squares (2SLS) framework, to test the direct effect of uncertainty on performance and the moderating effect of governance on the relationship between uncertainty and performance. The observed relationship between uncertainty and governance follows a core prediction of TCA: that vertical integration is the preferred choice of governance when uncertainty rises.
As for the subsequent performance outcomes, the results corroborate that uncertainty has a negative effect on performance. Importantly, the findings show that becoming more vertically integrated cannot help moderate the effect of demand and competitive uncertainty, but can significantly moderate the effect of political uncertainty. These findings have significant theoretical and practical implications, significantly extend our knowledge of the impact of uncertainty, and bring an institutional perspective to TCA. Further, they offer managers novel insight into the nature of different types of uncertainty, their impact on performance, and how channel decisions can mitigate these impacts.
Abstract:
Natural and man-made disasters have gained attention at all levels of policy-making in recent years. Emergency management tasks are inherently complex and unpredictable, and often require coordination among multiple organizations across different levels and locations. Effectively managing various knowledge areas and the organizations involved has become a critical emergency management success factor. However, there is a general lack of understanding about how to describe and assess the complex nature of emergency management tasks and how knowledge integration can help managers improve emergency management task performance. The purpose of this exploratory research was, first, to understand how emergency management operations are impacted by tasks that are complex and inter-organizational and, second, to investigate how knowledge integration, as a particular knowledge management strategy, can improve the efficiency and effectiveness of emergency tasks. Three types of specific knowledge were considered: context-specific, technology-specific, and context-and-technology-specific. The research setting was the Miami-Dade Emergency Operations Center (EOC), and the study was based on survey responses from participants in past EOC activations related to their emergency tasks and knowledge areas. The data included task attributes related to complexity, knowledge area, knowledge integration, specificity of knowledge, and task performance. The data were analyzed using multiple linear regressions and path analyses to examine (1) the relationships between task complexity, knowledge integration, and performance, (2) the moderating effects of each type of specific knowledge on the relationship between task complexity and performance, and (3) the mediating role of knowledge integration. In line with theory-based propositions, the results indicated that overall component complexity and interactive complexity tend to have a negative effect on task performance.
But surprisingly, procedural rigidity tended to have a positive effect on performance in emergency management tasks. Also, as expected, knowledge integration had a positive relationship with task performance. Interestingly, the moderating effects of each type of specific knowledge on the relationship between task complexity and performance varied, and the extent of mediation by knowledge integration depended on the dimension of task complexity.
Abstract:
Sickle-cell disease is the most prevalent genetic disease in the Brazilian population. Lower limb ulcers are the most frequent cutaneous complications, affecting 8% to 10% of patients. These ulcers are usually deep and may take many years to heal. Evidence about the effectiveness of systemic or topical treatment of these wounds is limited, apart from stabilization of the anemia. A 28-year-old woman with sickle-cell disease was admitted for treatment of three deep chronic lower leg ulcers. All wounds had tendon exposure and contained firmly adherent fibrin slough. Following surgical debridement and before grafting, the wounds were managed with three different dressings: a rayon and normal saline solution dressing, a calcium alginate dressing covered with gauze, and negative pressure therapy (NPT). All three wounds healed successfully and their grafts showed complete integration; only the rayon-dressed wound required a second debridement. The alginate- and rayon-dressed wounds recurred after 9 months and required additional skin grafts. Research on managing ulcers in patients with sickle-cell disease is minimal, but the results of this case study suggest that topical treatment modalities may affect outcomes. Research to explore the safety and effectiveness of NPT in patients with sickle-cell wounds is warranted.
Abstract:
This study examined the social adaptation of children with mild intellectual disability who were either (a) partially integrated into regular primary school classes or (b) placed full-time in separate classes. All of the children were integrated in sport and play activities with the whole school. Consistent with previous research, children with intellectual disability were less socially accepted than were a matched group of control children. Children in partially integrated classes received more play nominations than those in separate classes, but there was no greater acceptance as a best friend. On teachers' reports, disabled children had higher levels of inappropriate social behaviours, but there was no significant difference in appropriate behaviours. Self-assessments by integrated children were more negative than those by children in separate classes, and their peer-relationship satisfaction was lower. Ratings by disabled children of their satisfaction with peer relationships were associated with ratings of appropriate social skills by themselves and their teachers, and with self-ratings of negative behaviour. The study confirmed that partial integration can have negative consequences for children with an intellectual disability.
Abstract:
A new operational matrix of fractional integration of arbitrary order for generalized Laguerre polynomials is derived. The fractional integration is described in the Riemann-Liouville sense. This operational matrix is applied together with the generalized Laguerre tau method for solving general linear multiterm fractional differential equations (FDEs). The method has the advantage of obtaining the solution in terms of the generalized Laguerre parameter. In addition, only a small dimension of the generalized Laguerre operational matrix is needed to obtain a satisfactory result. Illustrative examples reveal that the proposed method is very effective and convenient for linear multiterm FDEs on a semi-infinite interval.
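For reference, the Riemann-Liouville sense of fractional integration mentioned in this abstract refers to the standard definition of the fractional integral of order α > 0:

```latex
J^{\alpha} f(t) = \frac{1}{\Gamma(\alpha)} \int_{0}^{t} (t-\tau)^{\alpha-1} f(\tau)\, d\tau ,
\qquad \alpha > 0 ,
```

which satisfies the semigroup property \(J^{\alpha} J^{\beta} f = J^{\alpha+\beta} f\) and reduces to ordinary repeated integration when α is a positive integer. The operational matrix described above expresses the action of \(J^{\alpha}\) on a truncated generalized Laguerre expansion as a matrix acting on the expansion coefficients.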
Abstract:
This paper aims to develop a collision prediction model for three-leg junctions located on national roads (NR) in Northern Portugal. The focus is to identify factors that contribute to collision-type crashes at those locations, mainly factors related to road geometric consistency, since the literature on these is scarce, and to research the impact of three modeling methods (generalized estimating equations, random-effects negative binomial models, and random-parameters negative binomial models) on the factors of those models. The database used included data published between 2008 and 2010 for 177 three-leg junctions. It was split into three groups of contributing factors which were tested sequentially for each of the adopted models: first, traffic only; then, traffic and the geometric characteristics of the junctions within their area of influence; and, lastly, factors capturing the difference between the geometric characteristics of the segments bordering the junctions' area of influence and the segment included in that area. The choice of the best modeling technique was supported by the result of a cross-validation performed to ascertain the best model for the three sets of researched contributing factors. The models fitted with random-parameters negative binomial models had the best performance in the process. In the best models obtained for every modeling technique, the characteristics of the road environment, including proxy measures for geometric consistency, along with traffic volume, contribute significantly to the number of collisions. Both the variables concerning junctions and the national highway segments in their area of influence, as well as variations from those characteristics in the roadway segments bordering that area of influence, have proven their relevance; there is therefore a clear need to incorporate the effect of geometric consistency in three-leg junction safety studies.
Abstract:
Significant progress has been made with regard to the quantitative integration of geophysical and hydrological data at the local scale. However, extending corresponding approaches beyond the local scale still represents a major challenge, yet is critically important for the development of reliable groundwater flow and contaminant transport models. To address this issue, I have developed a hydrogeophysical data integration technique based on a two-step Bayesian sequential simulation procedure that is specifically targeted towards larger-scale problems. The objective is to simulate the distribution of a target hydraulic parameter based on spatially exhaustive, but poorly resolved, measurements of a pertinent geophysical parameter and locally highly resolved, but spatially sparse, measurements of the considered geophysical and hydraulic parameters. To this end, my algorithm links the low- and high-resolution geophysical data via a downscaling procedure before relating the downscaled regional-scale geophysical data to the high-resolution hydraulic parameter field. I first illustrate the application of this novel data integration approach to a realistic synthetic database consisting of collocated high-resolution borehole measurements of the hydraulic and electrical conductivities and spatially exhaustive, low-resolution electrical conductivity estimates obtained from electrical resistivity tomography (ERT). The overall viability of this method is tested and verified by performing and comparing flow and transport simulations through the original and simulated hydraulic conductivity fields.
The corresponding results indicate that the proposed data integration procedure does indeed allow for obtaining faithful estimates of the larger-scale hydraulic conductivity structure and reliable predictions of the transport characteristics over medium- to regional-scale distances. The approach is then applied to a corresponding field scenario consisting of collocated high-resolution measurements of the electrical conductivity, as measured using a cone penetrometer testing (CPT) system, and the hydraulic conductivity, as estimated from electromagnetic flowmeter and slug test measurements, in combination with spatially exhaustive low-resolution electrical conductivity estimates obtained from surface-based electrical resistivity tomography (ERT). The corresponding results indicate that the newly developed data integration approach is indeed capable of adequately capturing both the small-scale heterogeneity as well as the larger-scale trend of the prevailing hydraulic conductivity field. The results also indicate that this novel data integration approach is remarkably flexible and robust and hence can be expected to be applicable to a wide range of geophysical and hydrological data at all scale ranges. In the second part of my thesis, I evaluate in detail the viability of sequential geostatistical resampling as a proposal mechanism for Markov Chain Monte Carlo (MCMC) methods applied to high-dimensional geophysical and hydrological inverse problems in order to allow for a more accurate and realistic quantification of the uncertainty associated with the thus inferred models. Focusing on a series of pertinent crosshole georadar tomographic examples, I investigated two classes of geostatistical resampling strategies with regard to their ability to efficiently and accurately generate independent realizations from the Bayesian posterior distribution.
The corresponding results indicate that, despite its popularity, sequential resampling is rather inefficient at drawing independent posterior samples for realistic synthetic case studies, notably for the practically common and important scenario of pronounced spatial correlation between model parameters. To address this issue, I have developed a new gradual-deformation-based perturbation approach, which is flexible with regard to the number of model parameters as well as the perturbation strength. Compared to sequential resampling, this newly proposed approach was proven to be highly effective in decreasing the number of iterations required for drawing independent samples from the Bayesian posterior distribution.
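The gradual-deformation idea mentioned in this abstract can be sketched in a few lines: two independent standard-Gaussian realizations combined with cos/sin weights yield another standard-Gaussian realization, so a Gaussian prior is preserved while the deformation angle controls the perturbation strength. This is a generic illustration of the classical gradual deformation scheme, not the author's exact MCMC implementation.

```python
import numpy as np

def gradual_deformation(z_current, z_proposal, theta):
    """Combine two independent N(0, 1) fields with cos/sin weights.

    Because cos(theta)**2 + sin(theta)**2 == 1, the result is again
    N(0, 1), so the Gaussian prior is preserved for any theta.
    Small theta => a small perturbation of the current model,
    which is what makes this attractive as an MCMC proposal.
    """
    return np.cos(theta) * z_current + np.sin(theta) * z_proposal

rng = np.random.default_rng(0)
z_current = rng.standard_normal(10_000)   # current model realization
z_proposal = rng.standard_normal(10_000)  # independent new realization

# theta = 0.2 keeps the proposal strongly correlated with the
# current state (correlation ~ cos(0.2) ~ 0.98).
z_new = gradual_deformation(z_current, z_proposal, theta=0.2)
```

In an MCMC setting, theta plays the role of a tunable step size: larger angles explore faster but are accepted less often, smaller angles yield highly correlated chains.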
Abstract:
An object's motion relative to an observer can convey ethologically meaningful information. Approaching or looming stimuli can signal threats/collisions to be avoided or prey to be confronted, whereas receding stimuli can signal successful escape or failed pursuit. Using movement detection and subjective ratings, we investigated the multisensory integration of looming and receding auditory and visual information by humans. While prior research has demonstrated a perceptual bias for unisensory and, more recently, multisensory looming stimuli, none has investigated whether there is integration of looming signals between modalities. Our findings reveal selective integration of multisensory looming stimuli. Performance was significantly enhanced for looming stimuli over all other multisensory conditions. Contrasts with static multisensory conditions indicate that only multisensory looming stimuli resulted in facilitation beyond that induced by the sheer presence of auditory-visual stimuli. Controlling for variation in physical energy replicated the advantage for multisensory looming stimuli. Finally, only looming stimuli exhibited a negative linear relationship between enhancement indices for detection speed and for subjective ratings. Maximal detection speed was attained when motion perception was already robust under unisensory conditions. The preferential integration of multisensory looming stimuli highlights that complex ethologically salient stimuli likely require synergistic cooperation between existing principles of multisensory integration. A new conceptualization of the neurophysiologic mechanisms mediating real-world multisensory perception and action is therefore supported.
Abstract:
In the last few years, many researchers have studied the presence of common dimensions of temperament in subjects with symptoms of anxiety. The aim of this study is to examine the association between temperamental dimensions (high negative affect and activity level) and anxiety problems in a clinical sample of preschool children. A total of 38 children, ages 3 to 6 years, from the Infant and Adolescent Mental Health Center of Girona and the Center of Diagnosis and Early Attention of Sabadell and Olot were evaluated by parents and psychologists. Their parents completed several screening scales and, subsequently, clinical child psychopathology professionals carried out diagnostic interviews with children from the sample who presented signs of anxiety. Findings showed that children with high levels of negative affect and low activity level have pronounced symptoms of anxiety. However, children with anxiety disorders do not present temperament styles different from those of their peers without these pathologies.