Abstract:
A substantial research literature exists regarding the psychopathy construct in forensic populations, but more recently, the construct has been extended to non-clinical populations. The purpose of the present dissertation was to investigate the content and the correlates of the psychopathy construct, with a particular focus on addressing gaps and controversies in the literature. In Study 1, the role of low anxiety in psychopathy was investigated, as some authors have proposed that low anxiety is integral to the psychopathy construct. Participants (n = 346) responded to two self-report psychopathy scales, the SRP-III and the PPI-R, as well as measures of temperament, personality, and antisociality. Of particular interest was the PPI-R Stress Immunity subscale, which represents low anxiety content. It was found that Stress Immunity was not correlated with SRP-III psychopathy, nor did it share common personality or temperament correlates or contribute to the prediction of antisociality. From Study 1, it was concluded that it is unlikely that low anxiety is a central feature of the psychopathy construct. In Study 2, the relationship between SRP-III psychopathy and Ability Emotional Intelligence (i.e., Emotional Intelligence measured as an ability, rather than as a self-report personality trait-like characteristic) was investigated, to determine whether psychopathy is best seen as a syndrome characterized by emotional deficits or by the ability to skillfully manipulate and prey upon others' emotions. A negative correlation between the two constructs was found, suggesting that psychopathy is best characterized by deficits in perceiving, facilitating, managing, and understanding emotions. In Study 3, sex differences in the sexual behavior (i.e., promiscuity, age of first sexual behaviors, extradyadic sexual relations) and appearance-related esteem (i.e., body shame, appearance anxiety, self-esteem) correlates of SRP-III psychopathy were investigated. The sexual behavior correlates of psychopathy were quite similar for men and women, but the esteem correlates were very different, such that high psychopathy in men was related to high esteem, whereas high psychopathy in women was generally related to low esteem. This sex difference was difficult to interpret in that it was not mediated by sexual behavior, suggesting that further exploration of this topic is warranted. Together, these three studies contribute to our understanding of non-clinical psychopathy, indicating that low anxiety is likely not part of the construct, that psychopathy is related to low levels of ability in Emotional Intelligence, and that psychopathy is an important predictor of behavior, ability, and beliefs and feelings about the self.
Abstract:
The energy of a graph G is the sum of the absolute values of its eigenvalues. In this paper, we study the energies of some classes of non-regular graphs. The spectra of some non-regular graphs and their complements are also discussed.
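For reference, the definition used here can be stated explicitly: if λ1, λ2, …, λn are the eigenvalues of the adjacency matrix of G, then

```latex
E(G) = \sum_{i=1}^{n} |\lambda_i|
```

As a quick worked example, the complete graph K_n has eigenvalues n-1 (once) and -1 (n-1 times), giving E(K_n) = 2(n-1).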
Abstract:
Non-destructive testing (NDT) is the use of non-invasive techniques to determine the integrity of a material, component, or structure. Engineers and scientists use NDT in a variety of applications, including medical imaging, materials analysis, and process control. The photothermal beam deflection technique is one of the most promising NDT technologies. Tremendous R&D effort has been made to improve the efficiency and simplicity of this technique. It is a popular technique because it can probe surfaces irrespective of the size of the sample and its surroundings. This technique has been used to characterize several semiconductor materials because of its non-destructive and non-contact evaluation strategy. Its application further extends to the analysis of a wide variety of materials. Instrumentation of an NDT technique is crucial for any material analysis. Chapter two explores the various excitation sources, source modulation techniques, and detection and signal processing schemes currently practised. The features of the experimental arrangement, including the steps for alignment, automation, data acquisition and data analysis, are explained giving due importance to details. Theoretical studies form the backbone of photothermal techniques, and the outcome of a theoretical work is the foundation of an application. The reliability of the theoretical model developed and used is established from studies on crystalline samples. The technique is applied to the analysis of transport properties such as thermal diffusivity, mobility, surface recombination velocity and minority carrier lifetime, and to thermal imaging of solar cell absorber layer materials like CuInS2, CuInSe2 and SnS thin films, as well as to the analysis of In2S3 thin films, which are used as buffer layer material in solar cells. The various influences of film composition and of chlorine and silver incorporation in this material are brought out from the measurement of transport properties and the analysis of sub-band-gap levels. The application of the photothermal deflection technique for characterization of solar cells is a relatively new area that requires considerable attention. Chapter six thus elucidates the theoretical aspects of applying photothermal techniques to solar cell analysis. The experimental design and method for determination of solar cell efficiency, optimum load resistance and series resistance, with results from the analysis of a CuInS2/In2S3 based solar cell, form the skeleton of this chapter.
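The abstract quotes no formulas, but a standard relation behind photothermal depth profiling is worth recalling for context: the thermal diffusion length μ, which sets the depth probed by a thermal wave of modulation frequency f in a material of thermal diffusivity α, is commonly written as

```latex
\mu = \sqrt{\frac{\alpha}{\pi f}}
```

Sweeping the modulation frequency therefore changes the probed depth, which is one way such measurements extract thermal diffusivity from the deflection signal.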
Abstract:
The metals present in surface sediments are in high demand from a global perspective, and the main reservoir of these elements is believed to be the ocean floor. Numerous studies worldwide are devoted to their quantification and exploitation. Although some preliminary attempts have been made in selected areas to study metals quantitatively in the western continental shelf of India, no comprehensive work has been reported so far. The importance of this study also lies in the fact that there has not been a proper evaluation of the impact of the Great Tsunami of 2004 on the coastal areas of south India. In view of this, an attempt has been made to address the seasonal distribution, behavior and mechanisms which control the deposition of metals in the sediments of the western continental shelf and the Cochin Estuary, an annex to this coastal marine region. Surface sediment samples were collected seasonally from two sub-environments of the southwest coast of India (the continental shelf of Kerala and the Cochin estuarine system) to estimate the seasonal distribution and geochemical behavior of non-transition, transition and rare-earth elements, Th and U. Bottom water samples were also taken from each station and analysed for temperature, salinity and dissolved oxygen, so that the response of redox-sensitive elements to the oxygen minimum zone could be addressed. In addition, other sedimentary parameters such as sand, silt and clay fractions, CaCO3 and organic carbon content were estimated to evaluate the factors controlling the levels of metals present in the sediment. The study used different environmental data analysis techniques to evaluate the distribution and behavior of elements during different seasons, including elemental normalisation, enrichment factor, element excess, cerium and europium anomalies and authigenic uranium.
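The geochemical indices named above have standard formulations; one common choice (an assumption here, since the thesis may use a different normalizer or reference) uses Al and average shale as references:

```latex
\mathrm{EF}_{M} = \frac{(M/\mathrm{Al})_{\text{sediment}}}{(M/\mathrm{Al})_{\text{shale}}},
\qquad
\mathrm{Eu}/\mathrm{Eu}^{*} = \frac{\mathrm{Eu}_{N}}{\sqrt{\mathrm{Sm}_{N}\,\mathrm{Gd}_{N}}}
```

where the subscript N denotes shale-normalized concentrations; EF values well above 1 indicate enrichment over the lithogenic background, and Eu/Eu* below 1 a negative europium anomaly.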
Abstract:
The thesis mainly focuses on material characterization in different environments: freely available samples taken in planar form, biological samples available in small quantities, and buried objects. The free-space method finds many applications in the fields of industry, medicine and communication. As it is a non-contact method, it can be employed for monitoring the electrical properties of materials moving along a conveyor belt in real time; measurement on such systems at high temperature is also possible. NID theory can be applied to the characterization of thin films: dielectric properties of thin films deposited on any dielectric substrate can be determined. In the chemical industry, the stages of a chemical reaction can be monitored online. Online monitoring is more efficient, as it saves time and avoids the risk of sample collection. Dielectric contrast is one of the main factors that decides the detectability of a system. It could be noted that two dielectric objects of the same dielectric constant 3.2 (εr of a plastic mine) placed in a medium of dielectric constant 2.56 (εr of sand) could even be detected employing time-domain analysis of the reflected signal. This type of detection is of strategic importance, as it provides a solution to the problem of clearance of non-metallic mines, whose demining using conventional techniques has proved futile. The studies on the detection of voids and leakage in pipes also find many applications. The determined electrical properties of tissues can be used for numerical modeling of cells, microwave imaging, SAR tests etc. All these techniques need the accurate determination of the dielectric constant. In the modern world, the use of cellular and other wireless communication systems is booming. At the same time, people are concerned about the hazardous effects of microwaves on living cells. The effect is usually studied on human phantom models, whose construction requires knowledge of the dielectric parameters of the various body tissues. It is in this context that the present study gains significance. The case study on biological samples shows that the properties of normal and infected body tissues are different. Even though the change in the dielectric properties of infected samples from those of normal ones may not be clear evidence of an ailment, it is an indication of some disorder. In the medical field, the free-space method may be adapted for imaging biological samples. This method can also be used in wireless technology: evaluation of the electrical properties and attenuation of obstacles in the path of RF waves can be done using free waves, and an intelligent system for controlling the power output or frequency depending on the fed-back attenuation values may be developed. The simulation employed in GPR can be extended to explore the effects of factors such as different proportions of water content in the soil, and the level and roughness of the soil, on the reflected signal. This may find applications in geological explorations. In the detection of mines, a state-of-the-art technique for scanning and imaging an active mine field can be developed using GPR: the probing antenna can be attached to a robotic arm capable of three degrees of rotation, and the whole detecting system can be housed in a military vehicle. In industry, a system based on the GPR principle can be developed for monitoring liquid or gas through a pipe, as a pipe with and without the sample gives different reflection responses.
It may also be implemented for the online monitoring of different stages of extraction and purification of crude petroleum in a plant. Since biological samples show fluctuations in their dielectric nature with time and other physiological conditions, more investigation in this direction should be done. Infected cells at various stages of advancement and normal cells should be analysed; the results from these comparative studies can be utilized for the detection of the onset of such diseases. By studying the properties of infected tissues at different stages, the threshold of detectability of infected cells can be determined.
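The detectability of the plastic mine in sand quoted above can be made concrete with the normal-incidence Fresnel reflection coefficient (a textbook relation, used here only to illustrate the size of the dielectric contrast, not taken from the thesis):

```latex
\Gamma = \frac{\sqrt{\varepsilon_{1}} - \sqrt{\varepsilon_{2}}}{\sqrt{\varepsilon_{1}} + \sqrt{\varepsilon_{2}}}
\approx \frac{1.60 - 1.79}{1.60 + 1.79} \approx -0.056
```

with ε1 = 2.56 (sand) and ε2 = 3.2 (plastic mine): only about 6% of the incident field is reflected, which is why a time-domain analysis of the weak echo is needed.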
Abstract:
During recent years, quantum information processing and the study of N-qubit quantum systems have attracted a lot of interest, both in theory and experiment. Apart from the promise of performing efficient quantum information protocols, such as quantum key distribution, teleportation or quantum computation, however, these investigations have also revealed a great deal of difficulties which still need to be resolved in practice. Quantum information protocols rely on the application of unitary and non-unitary quantum operations that act on a given set of quantum mechanical two-state systems (qubits) to form (entangled) states, in which the information is encoded. The overall system of qubits is often referred to as a quantum register. Today the entanglement in a quantum register is known as the key resource for many protocols of quantum computation and quantum information theory. However, despite the successful demonstration of several protocols, such as teleportation or quantum key distribution, there are still many open questions of how entanglement affects the efficiency of quantum algorithms or how it can be protected against noisy environments. To facilitate the simulation of such N-qubit quantum systems and the analysis of their entanglement properties, we have developed the Feynman program. The program package provides all necessary tools to define and deal with quantum registers, quantum gates and quantum operations. Using an interactive and easily extendible design within the framework of the computer algebra system Maple, the Feynman program is a powerful toolbox not only for teaching the basic and more advanced concepts of quantum information but also for studying their physical realization in the future. To this end, the Feynman program implements a selection of algebraic separability criteria for bipartite and multipartite mixed states as well as the most frequently used entanglement measures from the literature. Additionally, the program supports work with quantum operations and their associated (Jamiolkowski) dual states. Based on the implementation of several popular decoherence models, we provide tools especially for the quantitative analysis of quantum operations. As an application of the developed tools we further present two case studies in which the entanglement in two atomic processes is investigated. In particular, we have studied the change of the electron-ion spin entanglement in atomic photoionization and the photon-photon polarization entanglement in the two-photon decay of hydrogen. The results show that both processes are, in principle, suitable for the creation and control of entanglement. Apart from process-specific parameters like initial atom polarization, it is mainly the process geometry which offers a simple and effective instrument to adjust the final-state entanglement. Finally, for the case of the two-photon decay of hydrogenlike systems, we study the difference between nonlocal quantum correlations, as given by the violation of the Bell inequality, and the concurrence as a true entanglement measure.
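For context, the concurrence invoked as the "true entanglement measure" is, in its standard two-qubit (Wootters) form, quoted here for reference rather than from the thesis itself:

```latex
C(\rho) = \max\{0,\ \lambda_1 - \lambda_2 - \lambda_3 - \lambda_4\},
\qquad
\tilde{\rho} = (\sigma_y \otimes \sigma_y)\,\rho^{*}\,(\sigma_y \otimes \sigma_y)
```

where λ1 ≥ λ2 ≥ λ3 ≥ λ4 are the square roots of the eigenvalues of ρρ̃ and ρ* is the complex conjugate in the computational basis.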
Abstract:
Web services from different partners can be combined into applications that realize a more complex business goal. Such applications, built as Web service compositions, define how interactions between Web services take place in order to implement the business logic. Web service compositions not only have to provide the desired functionality but also have to comply with certain Quality of Service (QoS) levels. Maximizing the users' satisfaction, also reflected as Quality of Experience (QoE), is a primary goal to be achieved in a Service-Oriented Architecture (SOA). Unfortunately, in a dynamic environment like SOA unforeseen situations may arise, such as services not being available or not responding within the desired time frame. In such situations, appropriate actions need to be triggered in order to avoid the violation of QoS and QoE constraints. In this thesis, solutions are developed to manage Web services and Web service compositions with regard to QoS and QoE requirements. The Business Process Rules Language (BPRules) was developed to manage Web service compositions when undesired QoS or QoE values are detected. BPRules provides a rich set of management actions that may be triggered for controlling the service composition and for improving its quality behavior. Regarding the quality properties, BPRules distinguishes between the QoS values as they are promised by the service providers, the QoE values that were assigned by end-users, the monitored QoS as measured by our BPR framework, and the predicted QoS and QoE values. BPRules facilitates the specification of certain user groups characterized by different context properties and allows triggering a personalized, context-aware service selection tailored to the specified user groups. In a service market where a multitude of services with the same functionality but different quality values are available, the right services need to be selected for realizing the service composition. We developed new and efficient heuristic algorithms that are applied to choose high-quality services for the composition; BPRules offers the possibility to integrate multiple service selection algorithms, and the selection algorithms are also applicable to non-linear objective functions and constraints. The BPR framework includes new approaches for context-aware service selection and quality property prediction. We consider the location information of users and services as a context dimension for the prediction of response time and throughput. The BPR framework combines all new features and contributions into a comprehensive management solution. Furthermore, it facilitates flexible monitoring of QoS properties without having to modify the description of the service composition. We show how the different modules of the BPR framework work together in order to execute the management rules, and we evaluate how our selection algorithms outperform a genetic algorithm from related research. The evaluation also reveals how context data can be used for a personalized prediction of response time and throughput.
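Since the abstract describes (but does not print) heuristic service selection under QoS constraints, a minimal sketch of the underlying optimization problem may help. Everything below — task names, QoS attributes, weights, and the brute-force search standing in for a heuristic — is an illustrative assumption, not the BPR framework's actual algorithm:

```python
# Hypothetical sketch of QoS-aware service selection for a composition.
# NOT the thesis' heuristic: names, weights and the exhaustive search
# are illustrative assumptions only.
from itertools import product

# Candidate services per abstract task: response time (ms), cost, reliability.
candidates = {
    "payment":  [{"rt": 120, "cost": 0.05, "rel": 0.99},
                 {"rt": 300, "cost": 0.01, "rel": 0.95}],
    "shipping": [{"rt": 200, "cost": 0.02, "rel": 0.97},
                 {"rt":  80, "cost": 0.08, "rel": 0.99}],
}

MAX_RT = 400.0  # end-to-end response-time constraint for a sequential flow

def utility(selection):
    """Scalarized quality score: low latency and cost, high reliability."""
    rt = sum(s["rt"] for s in selection)        # additive along a sequence
    cost = sum(s["cost"] for s in selection)
    rel = 1.0
    for s in selection:
        rel *= s["rel"]                         # reliability multiplies
    return -0.5 * rt / MAX_RT - 0.2 * cost + 0.3 * rel

best, best_u = None, float("-inf")
for combo in product(*candidates.values()):     # tractable only for tiny examples;
    if sum(s["rt"] for s in combo) > MAX_RT:    # a heuristic would prune instead
        continue
    u = utility(combo)
    if u > best_u:
        best, best_u = combo, u

print(best_u, best)
```

For realistic numbers of tasks and candidates the search space explodes combinatorially, which is precisely why heuristic algorithms of the kind the thesis develops replace the exhaustive loop.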
Abstract:
In this article I provide a critical account of the 'placing' of England's M1 motorway. I start by critiquing Marc Augé's anthropological writings on 'non-places' which have provided a common point of reference for academics discussing spaces of travel, consumption and exchange in the contemporary world. I argue that Augé's ethnology of supermodernity results in a rather partial account of these sites, that he overstates the novelty of contemporary experiences of these spaces, and that he fails to acknowledge the heterogeneity and materiality of the social networks bound up with the production of non-places/places. I suggest that, rather than focusing on the presences and absences associated with the polarities of place and non-place, academics should examine the multiple, partial, dynamic and relational 'placings' which arise through the diverse performances and movements associated with travel, consumption and exchange. I then trace the topologies of England's M1 motorway, examining some of the different ways in which the motorway has been assembled, performed and placed over the past 45 years.
Abstract:
Encapsulated cocoa (Theobroma cacao L.) somatic embryos subjected to 0.08-1.25 M sucrose treatments were analyzed for embryo soluble sugar content, non-freezable water content, moisture level after desiccation, and viability after desiccation and freezing. Results indicated that the higher the sucrose concentration in the treatment medium, the greater was the extent of sucrose accumulation in the embryos. Sucrose treatment greatly assisted embryo post-desiccation recovery, since only 40% of the control embryos survived desiccation, whereas a survival rate of 60-95% was recorded for embryos exposed to 0.5-1.25 M sucrose. The non-freezable water content of the embryos was estimated at between 0.26 and 0.61 g H2O g-1 dw depending on the sucrose treatment, and no obvious relationship could be found between the endogenous sucrose level and the amount of non-freezable water in the embryos. Cocoa somatic embryos could withstand the loss of a fraction of their non-freezable water without losing viability following desiccation. Nevertheless, the complete removal of potentially freezable water was not sufficient for most embryos to survive freezing.
Abstract:
Quality control on fruits requires reliable methods, able to assess with reasonable accuracy, and possibly in a non-destructive way, their physical and chemical characteristics. More specifically, a decreased firmness indicates the presence of damage or defects in the fruit, or else that the fruit has exceeded its “best before” date, becoming unsuitable for consumption. In high-value exotic fruits, such as mangoes, where firmness cannot be easily assessed from a simple observation of texture, colour changes and unevenness of the fruit surface, the use of non-destructive techniques is highly advisable. In particular, laser vibrometry, based on the Doppler effect, a non-contact technique sensitive to displacement differences smaller than a nanometre, appears ideal for possible on-line control of food. Previous results indicated that a phase shift can be repeatably associated with the presence of damage on the fruit, whilst a decreased firmness results in significant differences in the displacement of the fruits under the same excitation signal. In this work, frequency ranges for quality control via the application of a sound chirp are suggested, based on the measurement of the signal coherence. The variation of the average vibration spectrum over a grid of points, or of the point-by-point signal velocity, allows the go/no-go recognition of “firm” and “over-ripe” fruits, with notable success in the particular case of mangoes. The future exploitation of this work will include the application of this method to allow on-line control during conveyor-belt distribution of fruits.
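A minimal sketch of the coherence-based band selection described above may be helpful. The code below uses synthetic stand-in data; the sampling rate, sweep range and coherence threshold are illustrative assumptions, not the authors' experimental settings:

```python
# Minimal sketch: coherence between a chirp excitation and a measured
# vibrometer signal, to identify frequency bands usable for quality
# control. Synthetic data; all parameters are illustrative assumptions.
import numpy as np
from scipy.signal import chirp, coherence

fs = 20_000                                    # sampling rate (Hz), assumed
t = np.arange(0, 1.0, 1 / fs)
excitation = chirp(t, f0=50, t1=1.0, f1=2000)  # 50 Hz -> 2 kHz sweep

# Stand-in for the measured velocity: attenuated, noisy copy of the input.
rng = np.random.default_rng(0)
response = 0.3 * excitation + 0.05 * rng.standard_normal(t.size)

f, Cxy = coherence(excitation, response, fs=fs, nperseg=2048)
usable = f[Cxy > 0.8]                          # bands with a trustworthy measurement
if usable.size:
    print(f"coherent band: {usable.min():.0f}-{usable.max():.0f} Hz")
else:
    print("no band passes the coherence threshold")
```

The idea is that only frequency bands with high excitation-response coherence give repeatable phase and amplitude readings, so those are the bands worth using for the go/no-go firmness classification.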
Abstract:
Background: High rates of co-morbidity between Generalized Social Phobia (GSP) and Generalized Anxiety Disorder (GAD) have been documented. The reason for this is unclear. Family studies are one means of clarifying the nature of co-morbidity between two disorders. Methods: Six models of co-morbidity between GSP and GAD were investigated in a family aggregation study of 403 first-degree relatives of non-clinical probands: 37 with GSP, 22 with GAD, 15 with co-morbid GSP/GAD, and 41 controls with no history of GSP or GAD. Psychiatric data were collected for probands and relatives. Mixed methods (direct and family history interviews) were utilised. Results: Primary contrasts (against controls) found an increased rate of pure GSP in the relatives of both GSP probands and co-morbid GSP/GAD probands, and found relatives of co-morbid GSP/GAD probands to have an increased rate of both pure GAD and co-morbid GSP/GAD. Secondary contrasts found (i) increased GSP in the relatives of GSP-only probands compared to the relatives of GAD-only probands; and (ii) increased GAD in the relatives of co-morbid GSP/GAD probands compared to the relatives of GSP-only probands. Limitations: The study did not directly interview all relatives, although the reliability of family history data was assessed. The study was based on an all-female proband sample. The implications of both these limitations are discussed. Conclusions: The results were most consistent with a co-morbidity model indicating independent familial transmission of GSP and GAD. This has clinical implications for the treatment of patients with both disorders. (C) 2006 Elsevier B.V. All rights reserved.
Abstract:
Studies on exposure of non-targets to anticoagulant rodenticides have largely focussed on predatory birds and mammals; insectivores have rarely been studied. We investigated the exposure of 120 European hedgehogs (Erinaceus europaeus) from throughout Britain to first- and second-generation anticoagulant rodenticides (FGARs and SGARs) using high-performance liquid chromatography coupled with fluorescence detection (HPLC) and liquid chromatography-mass spectrometry (LCMS). The proportion of hedgehogs with liver SGAR concentrations detected by HPLC was 3-13% per compound, 23% overall. LCMS identified a much higher prevalence for difenacoum and bromadiolone, mainly because of its greater ability to detect low-level contamination. The overall proportion of hedgehogs with LCMS-detected residues was 57.5% (SGARs alone) and 66.7% (FGARs and SGARs combined); 27 (22.5%) hedgehogs contained more than one rodenticide. Exposure of insectivores and predators to anticoagulant rodenticides appears to be similar. The greater sensitivity of LCMS suggests that exposure of non-targets is likely to have hitherto been under-estimated by HPLC techniques.
Abstract:
The climate belongs to the class of non-equilibrium forced and dissipative systems, for which most results of quasi-equilibrium statistical mechanics, including the fluctuation-dissipation theorem, do not apply. In this paper we show for the first time how the Ruelle linear response theory, developed for studying rigorously the impact of perturbations on general observables of non-equilibrium statistical mechanical systems, can be applied with great success to analyze the climatic response to general forcings. The crucial value of the Ruelle theory lies in the fact that it allows one to compute the response of the system in terms of expectation values of explicit and computable functions of the phase space, averaged over the invariant measure of the unperturbed state. We choose as test bed a classical version of the Lorenz 96 model, which, in spite of its simplicity, has a well-recognized prototypical value, as it is a spatially extended one-dimensional model and presents the basic ingredients, such as dissipation, advection and the presence of an external forcing, of the actual atmosphere. We recapitulate the main aspects of the general response theory and propose some new general results. We then analyze the frequency dependence of the response of both local and global observables to perturbations having localized as well as global spatial patterns. We derive analytically several properties of the corresponding susceptibilities, such as asymptotic behavior, validity of Kramers-Kronig relations, and sum rules, whose main ingredient is the causality principle. We show that all the coefficients of the leading asymptotic expansions as well as the integral constraints can be written as linear functions of parameters that describe the unperturbed properties of the system, such as its average energy. Some newly obtained empirical closure equations for such parameters allow one to define these properties as explicit functions of the unperturbed forcing parameter alone for a general class of chaotic Lorenz 96 models. We then verify the theoretical predictions from the outputs of the simulations up to a high degree of precision. The theory is used to explain differences in the response of local and global observables, to define the intensive properties of the system, which do not depend on the spatial resolution of the Lorenz 96 model, and to generalize the concept of climate sensitivity to all time scales. We also show how to reconstruct the linear Green function, which maps perturbations of general time patterns into changes in the expectation value of the considered observable for finite as well as infinite time. Finally, we propose a simple yet general methodology to study general climate change problems on virtually any time scale, by resorting to only well-selected simulations and by taking full advantage of ensemble methods. The specific case of the globally averaged surface temperature response to a general pattern of change of the CO2 concentration is discussed. We believe that the proposed approach may constitute a mathematically rigorous and practically very effective way to approach the problem of climate sensitivity, climate prediction, and climate change from a radically new perspective.
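For orientation, the causal structure exploited here can be stated compactly (standard forms, quoted for context rather than from the paper): the linear change of an observable under a forcing with time pattern f(t) is a convolution with a causal Green function, and causality makes the susceptibility obey Kramers-Kronig relations:

```latex
\delta\langle A \rangle(t) = \int_{-\infty}^{t} G_{A}(t-\tau)\, f(\tau)\, d\tau,
\qquad
\mathrm{Re}\,\chi_{A}(\omega) = \frac{1}{\pi}\,\mathcal{P}\!\!\int_{-\infty}^{\infty}
\frac{\mathrm{Im}\,\chi_{A}(\omega')}{\omega' - \omega}\, d\omega'
```

with χ_A(ω) the Fourier transform of G_A; the sum rules mentioned above follow from combining these relations with the asymptotic behavior of χ_A.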
Abstract:
We report preliminary results from studies of biological effects induced by non-thermal levels of non-ionizing electromagnetic radiation. Exponentially growing Saccharomyces cerevisiae yeast cells grown on dry media were exposed to electromagnetic fields in the 200–350 GHz frequency range at low power density to observe possible non-thermal effects on microcolony growth. Exposure to the electromagnetic field was conducted over 2.5 h. The data from exposure and control experiments were grouped into large-, medium- or small-sized microcolonies to assist in the accurate assessment of growth. The three groups showed significant differences in growth between exposed and control microcolonies. A statistically significant enhanced growth rate was observed at 341 GHz. Growth rate was assessed every 30 min via time-lapse photography. Possible interaction mechanisms are discussed, taking into account Fröhlich's hypothesis.
Abstract:
The principal aim of this research is to elucidate the factors driving the total rate of return of non-listed funds, using a panel data analytical framework. In line with previous results, we find that core funds exhibit lower yet more stable returns than value-added and, in particular, opportunistic funds, both cross-sectionally and over time. After taking into account overall market exposure, as measured by weighted market returns, the excess returns of value-added and opportunity funds are likely to stem from high leverage, high exposure to development, active asset management and investment in specialized property sectors. A random effects estimation of the panel data model largely confirms the findings obtained from the fixed effects model. Again, the country and sector property effect shows the strongest significance in explaining total returns. The stock market variable is negative, which hints at switching effects between competing asset classes. For opportunity funds, on average, the returns attributable to gearing are three times higher than those for value-added funds and over five times higher than for core funds. Overall, there is relatively strong evidence indicating that country and sector allocation, style, gearing and fund size combinations impact the performance of unlisted real estate funds.
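The specification described reads, in generic form (illustrative only; the exact regressor set belongs to the paper):

```latex
R_{it} = \alpha + \beta^{\prime} x_{it} + u_{i} + \varepsilon_{it}
```

where R_it is the total return of fund i in period t, x_it stacks the covariates discussed above (style, gearing, size, country and sector property returns, stock market returns), u_i is a fund-specific effect treated as fixed or random, and ε_it is the idiosyncratic error; the fixed versus random effects comparison reported above is precisely the choice of how to treat u_i.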