942 results for CALL


Relevance: 10.00%

Abstract:

The thermally driven structural phase transition in the organic-inorganic hybrid perovskite (CnH2n+1NH3)2PbI4 has been investigated using molecular dynamics (MD) simulations. This system consists of positively charged alkyl-amine chains anchored to a rigid, negatively charged PbI4 sheet, with the chains organized as bilayers in a herringbone arrangement. Atomistic simulations were performed using an isothermal-isobaric ensemble over a wide temperature range from 65 to 665 K for different alkyl chain lengths, n = 12, 14, 16, and 18. The simulations are able to reproduce the essential features of the experimental observations of this system, including the existence of a transition, the linear variation of the transition temperature with alkyl chain length, and the expansion of the bilayer thickness at the transition. By use of the distance fluctuation criterion, it is shown that the transition is associated with a melting of the alkyl chains of the anchored bilayer. An analysis of the conformation of the alkyl chains shows increased disorder in the form of gauche defects above the melting transition. Simulations also show that the melting transition is characterized by the complete disappearance of all-trans alkyl chains in the anchored bilayer, in agreement with experimental observations. A conformationally disordered chain has a larger effective cross-sectional area, and above the transition a uniformly tilted arrangement of the anchored chains can no longer be sustained. At the melt, the angular distribution of the orientation of the chains is no longer uniform; the chains are splayed, allowing increased space for individual chains of the anchored bilayer. This is reflected in a sharp rise in the ratio of the mean head-to-head to tail-to-tail distance of the chains of the bilayer at the transition, resulting in an expansion of the bilayer thickness. The present MD simulations provide a simple explanation of how changes in the conformation of individual alkyl chains give rise to the observed increase in the interlayer lattice spacing of (CnH2n+1NH3)2PbI4 at the melting transition.
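For readers unfamiliar with the criterion mentioned above, the following is a minimal sketch (not the paper's actual code) of a distance-fluctuation melting indicator of the Berry/Lindemann type, computed from MD trajectory positions; the array shapes, the toy trajectory, and the rough 0.1 melting threshold are illustrative assumptions.

import numpy as np

def distance_fluctuation(positions):
    """Berry parameter: mean relative fluctuation of all pair distances.

    positions: trajectory array of shape (n_frames, n_atoms, 3).
    """
    n_frames, n_atoms, _ = positions.shape
    i, j = np.triu_indices(n_atoms, k=1)
    # Pair distances in every frame, shape (n_frames, n_pairs).
    d = np.linalg.norm(positions[:, i, :] - positions[:, j, :], axis=-1)
    mean_d = d.mean(axis=0)
    fluct = np.sqrt((d**2).mean(axis=0) - mean_d**2) / mean_d
    return fluct.mean()

# Toy trajectory: 200 frames of 20 atoms jittering around fixed sites.
rng = np.random.default_rng(1)
sites = rng.uniform(0.0, 10.0, (20, 3))
traj = sites + rng.normal(0.0, 0.05, (200, 20, 3))
print(f"delta = {distance_fluctuation(traj):.3f}")
# Values above roughly 0.1 are commonly read as melting; a solid-like
# system such as this toy one stays well below that.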

Relevance: 10.00%

Abstract:

Active Fiber Composites (AFC) possess desirable characteristics for a wide range of smart structure applications, such as vibration, shape, and flow control, as well as structural health monitoring. This type of material, capable of collocated actuation and sensing, can be used in smart structures with self-sensing circuits. This paper proposes four novel applications of AFC structures undergoing torsion (sensors and actuators shaped as strips and tubes) and concludes with a preliminary failure analysis. To enable this, a powerful mathematical technique, the Variational Asymptotic Method (VAM), was used to perform cross-sectional analyses of thin, generally anisotropic AFC beams. The resulting closed-form expressions have been utilized in the applications presented herein.

Relevance: 10.00%

Abstract:

"We thank MrGilder for his considered comments and suggestions for alternative analyses of our data. We also appreciate Mr Gilder’s support of our call for larger studies to contribute to the evidence base for preoperative loading with high-carbohydrate fluids..."

Relevance: 10.00%

Abstract:

"Body and Iron: Essays on the Socialness of Objects" focuses on the bodily-material interaction of human subjects and technical objects. It poses a question, how is it possible that objects have an impact on their human users and examines the preconditions of active efficacy of objects. In this theoretical task the work relies on various discussions drawing from realistic ontology, phenomenology of body, neurophysiology of Antonio Damasio and psychoanalysis to establish both objects and bodies as material entities related in a causal interaction with each other. Out of material interaction emerge a symbolic field, psyche and culture that produce representations of interactions with material world they remain dependent on and conditioned by. Interaction with objects informs the human body via its somatosensory systems: interoseptive and proprioseptive (or kinesthetic) systems provide information to central nervous system of the internal state of the body and muscle tensions and motor activity of the limbs. Capability to control the movements of one's body by the internal "feel" of being a body turns out to be a precondition to the ability to control artificial extensions of the body. Motor activity of the body is involved in every perception of environment as the feel of one's own body is constitutive of any perception of external objects. Perception of an object cause changes in the internal milieu of the body and these changes in the organism form a bodily representation of an external object. Via these "muscle images" the subject can develop a feel for an instrument. Bodily feel for an object is pre-conceptual, practical knowledge that resists articulation but allows sensing the world through the object. This is what I would call sensual knowledge. Technical objects intervene between body and environment, transforming the relation of perception and motor activity. Once connected to a vehicle, human subject has to calibrate visual information of his or her position and movement in space to the bodily actions controlling the machine. It is the machine that mediates the relation of human actions to the relation of her body to its environment. Learning to use the machine necessarily means adjusting his or her bodily actions to the responses of the machine in relation to environmental changes it causes. Responsiveness of the machine to human touch "teaches" its subject by providing feedback of the "correctitude" of his or her bodily actions. Correct actions form a body technique of handling the object. This is the way of socialness of objects. While responding to human actions they generate their subjects. Learning to handle a machine means accepting the position of the user in the program of action materialized in the construction of the object. Objects mediate, channel and transform the relation of the body to its environment and via environment to the body itself according to their material and technical construction. Objects are sensory media: they channel signals and information from the environment thus constituting a representation of environment, a virtual or artificial reality. They also feed the body directly with their powers equipping their user with means of regulating somatic and psychic states of her self. For these reasons humans look for the company of objects. Keywords: material objects, material culture, sociology of technology, sociology of body, mobility, driving

Relevance: 10.00%

Abstract:

Automatic identification of software faults has enormous practical significance. This requires characterizing program execution behavior and the use of appropriate data mining techniques on the chosen representation. In this paper, we use the sequence of system calls to characterize program execution. The data mining tasks addressed are learning to map system call streams to fault labels and automatic identification of fault causes. Spectrum kernels and SVMs are used for the former, while latent semantic analysis is used for the latter. The techniques are demonstrated on the intrusion dataset containing system call traces. The results show that the kernel techniques are as accurate as the best available results but are faster by orders of magnitude. We also show that latent semantic indexing is capable of revealing fault-specific features.
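As a concrete illustration of the k-spectrum representation described above (a hedged sketch, not the paper's pipeline; the toy traces and the choice of k are assumptions), the kernel between two system-call traces can be computed as the inner product of their k-gram count vectors:

from collections import Counter

def spectrum_features(calls, k=3):
    """Count every contiguous k-gram of system calls in a trace."""
    return Counter(tuple(calls[i:i + k]) for i in range(len(calls) - k + 1))

def spectrum_kernel(trace_a, trace_b, k=3):
    """k-spectrum kernel: inner product of the k-gram count vectors."""
    fa, fb = spectrum_features(trace_a, k), spectrum_features(trace_b, k)
    return sum(count * fb[gram] for gram, count in fa.items())

# Toy traces: a normal-looking run vs. one with a suspicious repeated motif.
normal = ["open", "read", "write", "close", "open", "read", "write", "close"]
faulty = ["open", "read", "exec", "exec", "exec", "read", "write", "close"]

print(spectrum_kernel(normal, normal))  # self-similarity (here: 10)
print(spectrum_kernel(normal, faulty))  # cross-similarity is lower (here: 2)

The kernel matrix over a set of traces is what an SVM would consume for the stream-to-fault-label task; the latent-semantic-analysis step for identifying fault causes is not reproduced here.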

Relevance: 10.00%

Abstract:

The future of civic engagement is characterised by both technological innovation and new technological user practices, fuelled by trends towards mobile personal devices; broadband connectivity; open data; urban interfaces; and cloud computing. These technology trends are progressing at a rapid pace and have led global technology vendors to package and sell the "Smart City" as a centralised service delivery platform predicted to optimise and enhance cities' key performance indicators and to generate a profitable market. The top-down deployment of these large, proprietary technology platforms has helped sectors such as energy, transport, and healthcare to increase efficiencies. However, an increasing number of scholars and commentators warn of another "IT bubble" emerging. Along with some city leaders, they argue that the top-down approach does not fit the governance dynamics and values of a liberal democracy when applied across sectors. A thorough understanding is required of the socio-cultural nuances of how people work, live, and play across different environments, and of how they employ social media and mobile devices to interact with, engage in, and constitute public realms. Although the term "slacktivism" is sometimes used to denote a watered-down version of civic engagement and activism, reduced to clicking a "Like" button and signing online petitions, we believe that we are far from witnessing another Biedermeier period that saw people focus on the domestic and the non-political. There is plenty of evidence to the contrary, such as the post-election violence in Kenya in 2008, the Occupy movements in New York, Hong Kong, and elsewhere, the Arab Spring, Stuttgart 21, Fukushima, Taksim Gezi Park in Istanbul, and the Vinegar Movement in Brazil in 2013. These examples of civic action shape the dynamics of governments and, in turn, call for new processes to be incorporated into governance structures. Participatory research into these new processes across the triad of people, place, and technology is a significant and timely investment to foster productive, sustainable, and liveable human habitats. With this article, we want to reframe the current debates in academia and the priorities of industry and government to allow citizens and civic actors to take their rightful place at the centre of civic movements. This calls for new participatory approaches to co-inquiry and co-design. It is an evolving process with an explicit agenda to facilitate change, and we propose participatory action research (PAR) as an indispensable component in the journey to develop new governance infrastructures and practices for civic engagement. We do not limit our definition of civic technologies to tools specifically designed to enhance government and governance, such as renewing your car registration online or casting your vote electronically on election day. Rather, we are interested in civic media and technologies that foster citizen engagement in the widest sense, and particularly in the participatory design of civic technologies that strive to involve citizens in political debate and action and to question conventional approaches to political issues. The rationale for this approach is an alternative to smart cities in a "perpetual tomorrow," based on the many weak and strong signals of civic action revolving around technology seen today. It seeks to emphasise and direct attention to active citizenry over passive consumerism, human actors over human factors, culture over infrastructure, and prosperity over efficiency. First, we will look at some fundamental issues that arise from applying simplistic smart city visions to the kind of problem a city poses. We focus on the touch points between "the city" and its civic body, the citizens. In order to provide for meaningful civic engagement, the city must provide appropriate interfaces.

Relevance: 10.00%

Abstract:

The study seeks to find out whether the real burden of personal taxation has increased or decreased. In order to determine this, we investigate how the same real income has been taxed in different years. Whenever the taxes on the same real income are higher in a given year than in the base year, the real tax burden has increased; if they are lower, the real tax burden has decreased. The study thus seeks to estimate how changes in tax regulations affect the real tax burden. It should be kept in mind that the progression in the central government income tax schedule ensures that a real change in income will bring about a change in the tax ratio. Inflation will also increase the real tax burden when tax schedules are kept nominally the same. In the calculations of the study it is assumed that real income remains constant, so that we obtain an unbiased measure of the effects of governmental actions in real terms. The main factors influencing the amount of income taxes an individual must pay are as follows:
- Gross income (income subject to central and local government taxes).
- Deductions from gross income and from taxes calculated according to tax schedules.
- The central government income tax schedule (progressive income taxation).
- The rates for local taxes and for social security payments (proportional taxation).
In the study we investigate how much a certain group of taxpayers would have paid in taxes under the actual tax regulations prevailing in different years if their income were kept constant in real terms. Other factors affecting tax liability are kept strictly unchanged (as constants). The resulting taxes, expressed in fixed prices, are then compared to the taxes levied in the base year (hypothetical taxation). The question we are addressing is thus how much tax a certain group of taxpayers with the same socioeconomic characteristics would have paid on the same real income under the actual tax regulations prevailing in different years. This has been suggested as the main way to measure real changes in taxation, although there are several alternative measures with essentially the same aim. Next, an aggregate indicator of changes in income tax rates is constructed. It is designed to show how much the taxation of income has, on average, increased or decreased from one year to the next. The main question remains: how should aggregation over all income levels be performed? In order to determine the average real changes in the tax scales, the difference functions (differences between the actual and hypothetical taxation functions) were aggregated using taxable income as weights. Besides the difference functions, the relative changes in real taxes can be used as indicators of change. In this case the ratio between the taxes computed according to the new and the old situation indicates whether taxation has become heavier or lighter. The relative changes in tax scales can be described in a way similar to that used in describing the cost of living, or by means of price indices. For example, we can use Laspeyres' price index formula to compute the ratio between the taxes determined by the new tax scales and those determined by the old tax scales. The formula answers the question: how much more or less will be paid in taxes according to the new tax scales than according to the old ones when the real income situation corresponds to the old situation? In real terms, the central government tax burden experienced a steady decline from its high post-war level until the mid-1950s. The real tax burden then drifted upwards until the mid-1970s; the real level of taxation in 1975 was twice that of 1961. In the 1980s there was a steady phase due to the inflation corrections of tax schedules. In 1989 the tax schedule was cut drastically, and since the mid-1990s changes in tax schedules have decreased the real tax burden significantly. Local tax rates have risen continuously, from 10 percent in 1948 to nearly 19 percent in 2008. Deductions have lowered the real tax burden, especially in recent years. The aggregate figures indicate how the tax ratio for the same real income has changed over the years according to the prevailing tax regulations. We call the tax ratio calculated in this manner the real income tax ratio. A change in the real income tax ratio depicts an increase or decrease in the real tax burden. The real income tax ratio declined for some years after the war. From the beginning of the 1960s to the mid-1970s it nearly doubled. Since the mid-1990s the real income tax ratio has fallen by about 35%.
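One plausible rendering of the Laspeyres-style comparison described above, in LaTeX (the notation and the exact placement of the taxable-income weights are my assumptions, not the study's):

\[
I_L = \frac{\sum_i y_i \, T_{\mathrm{new}}(y_i)}{\sum_i y_i \, T_{\mathrm{old}}(y_i)},
\]

where \(T_{\mathrm{old}}\) and \(T_{\mathrm{new}}\) are the tax functions of the compared years and the \(y_i\) are base-year real taxable incomes. \(I_L > 1\) indicates that taxation of the same real income has become heavier, and \(I_L < 1\) that it has become lighter.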

Relevance: 10.00%

Abstract:

This thesis examines the media debate on pensions. The case analysed is the debate that sharpened after the Finnish government decided to raise the retirement age. The analysed data consist of articles published in the printed media during the month after the decision was made on 24 February 2009. The aim of the study is to describe how the decision is argued over by speakers in different positions, and how retirement is justified from the perspective of the individual. Furthermore, the purpose is to discover different ways of discussing the pensioner. The theoretical frame for this study is social constructivism, which understands reality as socially constructed through language. From this perspective, media texts can be seen as one form of shaping reality. The data are analysed using several methods: thematisation is used to discover the key topics, and quantification is used to examine the prevalence of different arguments. The method in which speakers' ways of speaking are analysed across different participant categories I call "speaker position analysis". The debate around the decision to raise the retirement age highlights the power struggle both between the government and the opposition and between the government and the employee unions. The one thing all discussants agree on is the need to raise the retirement age. From the individual's perspective, retirement is justified mostly by hard working conditions and failing health. The pensioner's image appears gloomy in most discourses: prevailing discourses see the pensioner either as sick and tired or as someone who is unfit for work and has lost his dignity. The debate around the decision is intertwined with the concepts of the welfare state and the individual's well-being. In postmodern society, human preferences are individualised: the welfare state means different things to different people, and each individual's subjective perception of well-being is unique. These two aspects are what raise the tension in the analysed media debate.

Relevance: 10.00%

Abstract:

The question at issue in this dissertation is the epistemic role played by ecological generalizations and models. I investigate and analyze such properties of generalizations as lawlikeness, invariance, and stability, and I ask which of these properties are relevant in the context of scientific explanations. I claim that there are generalizable and reliable causal explanations in ecology based on generalizations that are invariant and stable. An invariant generalization continues to hold or be valid under a special change, called an intervention, that changes the value of its variables. Whether a generalization remains invariant under interventions is the criterion that determines whether it is explanatory, and a generalization can be invariant and explanatory regardless of its lawlike status. Stability concerns a different kind of generality, namely whether a generalization continues to hold across possible background conditions: the more stable a generalization, the less its truth depends on background conditions. Although it is invariance rather than stability that furnishes us with explanatory generalizations, stability has an important function in the context of explanations: it furnishes the extrapolability and reliability of scientific explanations. I also discuss non-empirical investigations of models that I call robustness and sensitivity analyses. I call sensitivity analyses those investigations in which one model is studied with regard to its stability conditions by making changes and variations to the values of the model's parameters. As a general definition of robustness analyses I propose investigations of variations in the modeling assumptions of different models of the same phenomenon, in which the focus is on whether they produce similar or convergent results. Robustness and sensitivity analyses are powerful tools for studying the conditions and assumptions under which models break down, and they are especially powerful in pointing out the reasons why they do so: they show which conditions or assumptions the results of models depend on. Key words: ecology, generalizations, invariance, lawlikeness, philosophy of science, robustness, explanation, models, stability
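To make the distinction concrete, here is a minimal sensitivity-analysis sketch in the sense defined above (the logistic growth model, parameter range, and all numbers are illustrative assumptions, not from the dissertation): one model is held fixed while the value of a single parameter is varied, and one checks where the results remain stable and where the model's behaviour changes.

import numpy as np

def logistic_final(r, K=100.0, n0=5.0, steps=50):
    """Discrete logistic growth: n <- n + r * n * (1 - n / K)."""
    n = n0
    for _ in range(steps):
        n = n + r * n * (1.0 - n / K)
    return n

# Vary the growth rate r while holding everything else fixed and watch
# whether the population settles at the carrying capacity K (stable)
# or starts oscillating around it (conditions where the model's
# qualitative behaviour breaks down).
for r in np.linspace(0.5, 2.5, 5):
    print(f"r = {r:.2f} -> population after 50 steps = {logistic_final(r):7.2f}")

A robustness analysis, by contrast, would compare structurally different models of the same phenomenon and ask whether their results converge.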

Relevance: 10.00%

Abstract:

Atmospheric particles affect the radiation balance of the Earth and thus the climate. New particle formation by nucleation has been observed in diverse atmospheric conditions, but the actual formation path is still unknown. The prevailing conditions can be exploited to evaluate proposed formation mechanisms. This study aims to improve our understanding of new particle formation from the point of view of atmospheric conditions. The role of atmospheric conditions in particle formation was studied through atmospheric measurements, theoretical model simulations, and simulations based on observations. Two separate column models were further developed for aerosol and chemical simulations. The model simulations allowed us to extend the study from local conditions to the varying conditions of the atmospheric boundary layer, while the long-term measurements characterised the mean conditions associated with new particle formation. The observations show a statistically significant difference in meteorological and background aerosol conditions between observed event and non-event days. New particle formation above the boreal forest is associated with strong convective activity, low humidity, and a low condensation sink. The probability of a particle formation event is predicted by an equation formulated for upper boundary layer conditions. The model simulations call into question whether kinetic sulphuric-acid-induced nucleation is the primary particle formation mechanism in the presence of organic vapours. At the same time, the simulations show that ignoring spatial and temporal variation in new particle formation studies may lead to faulty conclusions. On the other hand, the theoretical simulations indicate that short-scale variations in temperature and humidity are unlikely to have a significant effect on the mean binary water-sulphuric acid nucleation rate. The study emphasizes the significance of mixing and fluxes in particle formation studies, especially in the atmospheric boundary layer. The further developed models will allow extensive aerosol physical and chemical studies in the future.

Relevance: 10.00%

Abstract:

The Caucasus region is a biodiversity hotspot and one of the few areas in the Northern Hemisphere that harbor Pleistocene glacial refugia. The region encompasses Armenia, Azerbaijan, Georgia, the southernmost part of European Russia, NE Turkey, and northern Iran. The fungal composition of the Caucasus region, and its connection and possible contribution to the present mycota of Europe, has largely escaped empirical scrutiny. Using taxonomic surveys, phylogenetic reconstruction methods, haplotype analysis, and similarity tests, this study aims to (1) summarize the knowledge on the occurrence of corticioids and polypores in the Caucasus region, (2) resolve the phylogenetic relationships of selected resupinate wood-inhabiting basidiomycetes for which the Caucasus region is currently the sole, or one of the few noteworthy, areas of distribution, and (3) assess the similarity of Caucasian corticioid fungi to those of Europe and other important areas in the Northern Hemisphere, and examine the significance of the Caucasus region as a glacial refugium for these fungi. This study provides the first catalogue of the corticioids and polypores (635 species) occurring in the Caucasus region. The phylogeny and systematics of the Caucasian resupinate taxa in focus have been resolved and the usefulness of some morphological characters re-evaluated. In this context, four new genera and two new species were described, and five new combinations were proposed, two of which were supplemented with modern descriptions. The species composition of corticioids in the Caucasus region is distinctly more similar to that of Europe and North America than to that of East Asia and India. The highest molecular diversity and within-population pairwise distance for Peniophorella praetermissa were detected in the Caucasus and East Asia, with the isolates of the latter area being highly divergent from the European ones. This, and the assignment of the root haplotype to the Caucasian isolates in a haplotype network for Phlebia tuberucalta and P. livida, call attention to the role of the Caucasus region in shaping the current mycota of Europe.

Relevance: 10.00%

Abstract:

Modeling and forecasting implied volatility (IV) is important to both practitioners and academics, especially in trading, pricing, hedging, and risk management activities, all of which require accurate volatility estimates. The task has, however, become challenging since the 1987 stock market crash, as implied volatilities (IVs) recovered from stock index options exhibit two patterns, the volatility smirk (skew) and the volatility term structure, which, examined together, form a rich implied volatility surface (IVS). This implies that the assumptions behind the Black-Scholes (1973) model do not hold empirically, as asset prices are influenced by many underlying risk factors. This thesis, which consists of four essays, models and forecasts implied volatility in the presence of these empirical regularities of the options markets. The first essay models the dynamics of the IVS, extending the Dumas, Fleming and Whaley (DFW) (1998) framework: using moneyness in the implied forward price and OTM put-call options on the FTSE100 index, nonlinear optimization is used to estimate different models and thereby produce rich, smooth IVSs. Here, the constant-volatility model fails to explain the variation in the rich IVS. Next, it is found that three factors can explain about 69-88% of the variance in the IVS; of this, on average, 56% is explained by the level factor, 15% by the term-structure factor, and a further 7% by the jump-fear factor. The second essay proposes a quantile regression model for the contemporaneous asymmetric return-volatility relationship, generalizing the Hibbert et al. (2008) model. The results show a strong negative asymmetric return-volatility relationship at various quantiles of the IV distributions, which increases monotonically when moving from the median quantile to the uppermost quantile (i.e., 95%); OLS therefore underestimates this relationship at the upper quantiles. Additionally, the asymmetric relationship is more pronounced for the smirk-adjusted (skew-adjusted) volatility index measure than for the old volatility index measure. The volatility indices are ranked in terms of asymmetric volatility as follows: VIX, VSTOXX, VDAX, and VXN. The third essay examines the information content of the new VDAX volatility index for forecasting daily Value-at-Risk (VaR) estimates and compares its VaR forecasts with those of Filtered Historical Simulation and RiskMetrics. All daily VaR models are backtested over 1992-2009 using unconditional coverage, independence, conditional coverage, and quadratic-score tests. It is found that the VDAX subsumes almost all the information required for daily VaR forecasts for a portfolio on the DAX30 index; the implied-VaR models outperform all other VaR models. The fourth essay models the risk factors driving swaption IVs. It is found that three factors can explain 94-97% of the variation in each of the EUR, USD, and GBP swaption IVs. There are significant linkages across factors, and bi-directional causality is at work between the factors implied by EUR and USD swaption IVs. Furthermore, the factors implied by EUR and USD IVs respond to each other's shocks; surprisingly, however, GBP does not affect them. Finally, string market model calibration results show that the model can efficiently reproduce (or forecast) the volatility surface for each of the swaption markets.
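As a hedged illustration of the quantile-regression approach described for the second essay (a sketch of a common Hibbert et al. (2008)-style specification on synthetic data; the variable names, coefficients, and noise structure are assumptions, not the thesis's exact model or data):

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 2000
ret = rng.normal(0.0, 0.01, n)        # daily index returns (toy data)
neg_ret = np.minimum(ret, 0.0)        # asymmetry term: negative part of returns
# Toy IV changes: volatility reacts more strongly to losses, with noise
# that widens after large losses so the slopes differ across quantiles.
d_iv = (-2.0 * ret - 3.0 * neg_ret
        + rng.normal(0.0, 1.0, n) * 0.005 * (1.0 - 100.0 * neg_ret))
df = pd.DataFrame({"d_iv": d_iv, "ret": ret, "neg_ret": neg_ret})

# Estimate the return-volatility relationship at several quantiles of the
# IV-change distribution; neg_ret captures the asymmetric (loss) effect.
for q in (0.50, 0.75, 0.95):
    fit = smf.quantreg("d_iv ~ ret + neg_ret", df).fit(q=q)
    print(f"q={q:.2f}  ret={fit.params['ret']:+.2f}  "
          f"neg_ret={fit.params['neg_ret']:+.2f}")

With noise that widens after large losses, the loss coefficient strengthens towards the upper quantiles, which is precisely the pattern a single OLS fit averages away.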