842 results for "Theoretical model and wind action"


Relevance: 100.00%

Abstract:

In this paper, the pattern classification problem in tool wear monitoring is solved using nature-inspired techniques, namely Genetic Programming (GP) and Ant-Miner (AM). The main advantage of GP and AM is their ability to learn the underlying data relationships and express them in the form of mathematical equations or simple rules. The knowledge extracted from the training data set using GP and AM takes the form of a Genetic Programming Classifier Expression (GPCE) and rules, respectively. The GPCE and the AM-extracted rules are then applied to the data in the testing/validation set to obtain the classification accuracy. A major attraction of GP-evolved GPCEs and AM-based classification is the possibility of obtaining expert-system-like rules that can subsequently be applied directly by the user in his/her application. The performance of the data classification using GP and AM is as good as the classification accuracy obtained in the earlier study.
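
To illustrate the kind of expert-system-like rule set such techniques can produce, here is a minimal Python sketch that applies hand-written rules of the sort Ant-Miner might extract; the feature names, thresholds, and class labels are invented for illustration and are not the rules or GPCEs reported in the paper.

```python
# Hypothetical illustration of rule-based tool-wear classification.
# Feature names, thresholds, and class labels are invented for this sketch;
# they are not the rules or GPCEs reported in the paper.

def classify_tool_wear(cutting_force, vibration_rms):
    """Return a wear class from two sensor features using simple rules."""
    if cutting_force > 250.0 and vibration_rms > 1.5:
        return "severe wear"
    if cutting_force > 180.0:
        return "moderate wear"
    return "fresh tool"

# Evaluate the rules on a small synthetic test set and report accuracy.
test_set = [
    ((150.0, 0.8), "fresh tool"),
    ((200.0, 1.1), "moderate wear"),
    ((300.0, 2.0), "severe wear"),
]
correct = sum(classify_tool_wear(*x) == y for x, y in test_set)
print(f"classification accuracy: {correct / len(test_set):.2f}")
```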

Relevance: 100.00%

Abstract:

Pivaloyl-L-Pro-Aib-N-methylamide has been shown, by ¹H NMR methods, to possess one intramolecular hydrogen bond in (CD₃)₂SO solution, suggesting the existence of β-turns with Pro-Aib as the corner residues. Theoretical conformational analysis suggests that Type II β-turn conformations are about 2 kcal mol⁻¹ more stable than Type III structures. A crystallographic study has established the Type II β-turn in the solid state. The molecule crystallizes in the space group P2₁ with a = 5.865 Å, b = 11.421 Å, c = 12.966 Å, β = 97.55°, and Z = 2. The structure has been refined to a final R value of 0.061. The Type II β-turn conformation is stabilized by an intramolecular 4→1 hydrogen bond between the methylamide NH and the pivaloyl CO group. The conformational angles are φ(Pro) = -57.8°, ψ(Pro) = 139.3°, φ(Aib) = 61.4°, and ψ(Aib) = 25.1°. The Type II β-turn conformation for Pro-Aib in this peptide is compared with the Type III structures observed for the same segment in larger peptides.

Relevance: 100.00%

Abstract:

Extensible Markup Language (XML) has emerged as a medium for interoperability over the Internet. As the number of documents published in the form of XML increases, there is a need for selective dissemination of XML documents based on user interests. In the proposed technique, a combination of Adaptive Genetic Algorithms and a multi-class Support Vector Machine (SVM) is used to learn a user model. Based on feedback from the users, the system automatically adapts to the user's preferences and interests. The user model and a similarity metric are used for selective dissemination of a continuous stream of XML documents. Experimental evaluations performed over a wide range of XML documents indicate that the proposed approach significantly improves the performance of the selective dissemination task with respect to accuracy and efficiency.
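
As a rough sketch of the multi-class SVM component only, assuming scikit-learn, a crude bag-of-words view of the raw XML text, and invented documents and interest labels; the adaptive genetic algorithm and the similarity metric described in the abstract are not reproduced here.

```python
# Minimal sketch: multi-class SVM over a bag-of-words view of XML documents.
# The adaptive genetic algorithm and the similarity metric of the paper are
# omitted; the document texts and interest labels below are invented.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.svm import SVC

docs = [
    "<article><topic>sports</topic><body>match result</body></article>",
    "<article><topic>finance</topic><body>stock market report</body></article>",
    "<article><topic>sports</topic><body>league table update</body></article>",
]
labels = ["sports", "finance", "sports"]   # user-interest classes

vectorizer = TfidfVectorizer()             # crude text features from the raw XML
X = vectorizer.fit_transform(docs)
clf = SVC(kernel="linear").fit(X, labels)  # multi-class handled internally

new_doc = ["<article><topic>finance</topic><body>bond yields</body></article>"]
print(clf.predict(vectorizer.transform(new_doc)))  # decide whether to disseminate
```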

Relevance: 100.00%

Abstract:

This study offers a reconstruction and critical evaluation of globalization theory, a perspective that has been central for sociology and cultural studies in recent decades, from the viewpoint of media and communications. As the study shows, sociological and cultural globalization theorists rely heavily on arguments concerning media and communications, especially the so-called new information and communication technologies, in the construction of their frameworks. Together with deepening the understanding of globalization theory, the study gives new critical knowledge of the problematic consequences that follow from such strong investment in media and communications in contemporary theory. The book is divided into four parts. The first part presents the research problem, the approach and the theoretical contexts of the study. Following the introduction in Chapter 1, I identify the core elements of globalization theory in Chapter 2. At the heart of globalization theory is the claim that recent decades have witnessed massive changes in the spatio-temporal constitution of society, caused by new media and communications in particular, and that these changes necessitate the rethinking of the foundations of social theory as a whole. Chapter 3 introduces three paradigms of media research (the political economy of media, cultural studies and medium theory), the discussion of which will make it easier to understand the key issues and controversies that emerge in academic globalization theorists' treatment of media and communications. The next two parts offer a close reading of four theorists whose works I use as entry points into academic debates on globalization. I argue that we can make sense of mainstream positions on globalization by dividing them into two paradigms: on the one hand, media-technological explanations of globalization and, on the other, cultural globalization theory. As examples of the former, I discuss the works of Manuel Castells (Chapter 4) and Scott Lash (Chapter 5). I maintain that their analyses of globalization processes are overtly media-centric and result in an unhistorical and uncritical understanding of social power in an era of capitalist globalization. A related evaluation of the second paradigm (cultural globalization theory), as exemplified by Arjun Appadurai and John Tomlinson, is presented in Chapter 6. I argue that, due to their rejection of the importance of nation states and the notion of cultural imperialism for cultural analysis, and their replacement with a framework of media-generated deterritorializations and flows, these theorists underplay the importance of the neoliberalization of cultures throughout the world. The fourth part (Chapter 7) presents a central research finding of this study, namely that the media-centrism of globalization theory can be understood in the context of the emergence of neoliberalism. I find it problematic that, at the same time as capitalist dynamics have been strengthened in social and cultural life, advocates of globalization theory have directed attention to media-technological changes and their sweeping socio-cultural consequences, instead of analyzing the powerful material forces that shape society and culture. I further argue that this shift serves not only analytical but also utopian functions, that is, the longing for a better world in times when such longing is otherwise considered impracticable.

Relevance: 100.00%

Abstract:

The purpose of this study is to examine how transformation is defining feminist bioethics and to determine the nature of this transformation. Behind the quest for transformation is core feminism and its political implications, namely, that women and other marginalized groups have been given unequal consideration in society and the sciences and that this situation is unacceptable and should be remedied. The goal of the dissertation is to determine how feminist bioethicists integrate the transformation into their respective fields and how they apply the potential of feminism to bioethical theories and practice. On a theoretical level, feminist bioethicists wish to reveal how current ways of knowing are based on inequality. Feminist bioethicists pay special attention to communal and political contexts and to the power relations endorsed by each community. In addition, feminist bioethicists endorse relational ethics, a relational account of the self in which the interconnectedness of persons is important. On the conceptual level, feminist bioethicists work with beliefs, concepts, and practices that give us our world. As an example, I examine how feminist bioethicists have criticized and redefined the concept of autonomy. Feminist bioethicists emphasize relational autonomy, which is based on the conviction that social relationships shape moral identities and values. On the practical level, I discuss stem cell research as a test case for feminist bioethics and its ability to employ its methodologies. Analyzing these perspectives allowed me, first, to compare non-feminist and feminist accounts of stem cell ethics and, second, to analyze feminist perspectives on this novel biotechnology. Along with offering a critical evaluation of the stem cell debate, the study shows that sustainable stem cell policies should be grounded in empirical knowledge about how donors perceive stem cell research and the donation process. The study indicates that feminist bioethics should develop the use of empirical bioethics, which takes the nature of ethics seriously: ethical decisions are provisional and open for further consideration. In addition, the study shows that there is another area of development in feminist bioethics: the understanding of (moral) agency. I argue that agency should be understood to mean that actions create desires.

Relevance: 100.00%

Abstract:

Executive compensation and managerial behavior have received an increasing amount of attention in the financial economics literature since the mid-1970s. The purpose of this thesis is to extend our understanding of managerial compensation, especially how stock option compensation is linked to the actions undertaken by the management. Furthermore, managerial compensation is continuously and heatedly debated in the media, and an emerging consensus from this discussion seems to be that there still exist gaps in our knowledge of optimal contracting. In Finland, the first executive stock options were introduced in the 1980s, and over the last 15 years it has become increasingly popular for Finnish listed firms to use this type of managerial compensation. The empirical work in the thesis is conducted using data from Finland, in contrast to most previous studies that predominantly use U.S. data. Using Finnish data provides insight into how market conditions affect compensation and managerial action and provides an opportunity to explore what parts of the U.S. evidence can be generalized to other markets. The thesis consists of four essays. The first essay investigates the exercise policy of executive stock option holders in Finland. In summary, Essay 1 contributes to our understanding of exercise policies by examining both the determinants of the exercise decision and the market's reaction to the actual exercises. The second essay analyzes the factors driving stock option grants using data for Finnish publicly listed firms. Several agency theory based variables are found to have explanatory power on the likelihood of a stock option grant. Essay 2 also contributes to our understanding of behavioral factors, such as prior stock return, as determinants of stock option compensation. The third essay investigates the tax and stock option motives for share repurchases and dividend distributions. We document strong support for the tax motive for share repurchases. Furthermore, we also analyze the dividend distribution decision in companies with stock options and find a significant difference between companies with and without dividend-protected options. We thus document that the cutting of dividends found in previous U.S. studies can be avoided by dividend protection. In the fourth essay, we approach the puzzle of negative skewness in stock returns from an altogether different angle than previous studies. We suggest that negative skewness in stock returns is generated by management disclosure practices and find evidence for this. More specifically, we find that negative skewness in daily returns is induced by returns for days on which non-scheduled firm-specific news is disclosed.

Relevance: 100.00%

Abstract:

In this two-part series of papers, a generalized non-orthogonal amplify-and-forward (GNAF) protocol, which generalizes several known cooperative diversity protocols, is proposed. Transmission in the GNAF protocol comprises two phases: the broadcast phase and the cooperation phase. In the broadcast phase, the source broadcasts its information to the relays as well as to the destination. In the cooperation phase, the source and the relays together transmit a space-time code in a distributed fashion. The GNAF protocol relaxes the constraints imposed by the protocol of Jing and Hassibi on the code structure. In Part I of this paper, a code design criterion is obtained and it is shown that the GNAF protocol is both delay efficient and coding gain efficient. Moreover, the GNAF protocol enables the use of sphere decoders at the destination with a non-exponential maximum-likelihood (ML) decoding complexity. In Part II, several low-decoding-complexity code constructions are studied and a lower bound on the diversity-multiplexing gain tradeoff of the GNAF protocol is obtained.
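
As a toy numerical sketch of the two-phase idea for a single relay, the snippet below runs plain amplify-and-forward over complex Gaussian channels; it is not the GNAF protocol or its distributed space-time code, and all channel and power parameters are illustrative.

```python
# Toy two-phase amplify-and-forward sketch with one relay (illustrative only;
# this is plain AF, not the GNAF distributed space-time code of the paper).
import numpy as np

rng = np.random.default_rng(0)
snr = 100.0                                    # assumed transmit SNR (linear)
s = rng.choice([1 + 1j, -1 - 1j])              # one QPSK-like source symbol


def noise():
    return (rng.standard_normal() + 1j * rng.standard_normal()) / np.sqrt(2)


# Broadcast phase: source -> relay and source -> destination.
h_sr, h_sd, h_rd = (rng.standard_normal(3) + 1j * rng.standard_normal(3)) / np.sqrt(2)
y_r = np.sqrt(snr) * h_sr * s + noise()        # received at the relay
y_d1 = np.sqrt(snr) * h_sd * s + noise()       # received at the destination

# Cooperation phase: the relay scales its observation to meet its power budget.
g = np.sqrt(snr / (snr * abs(h_sr) ** 2 + 1))  # amplification factor
y_d2 = h_rd * g * y_r + noise()                # relayed copy at the destination

# Destination combines both observations (maximum-ratio style weights).
w1 = np.conj(np.sqrt(snr) * h_sd)
w2 = np.conj(np.sqrt(snr) * h_rd * g * h_sr)
s_hat = w1 * y_d1 + w2 * y_d2
print("decided symbol sign:", np.sign(s_hat.real), np.sign(s_hat.imag))
```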

Relevance: 100.00%

Abstract:

The recent spurt of research activity on the Entity-Relationship approach to databases calls for close scrutiny of the semantics of the underlying Entity-Relationship models, data manipulation languages, data definition languages, etc. For reasons well known, it is very desirable and sometimes imperative to give a formal description of the semantics. In this paper, we consider a specific ER model, the generalized Entity-Relationship model (without attributes on relationships), and give a denotational semantics for the model as well as a simple ER algebra based on the model. Our formalism is based on the meta-language of the Vienna Development Method (VDM). We also discuss the salient features of the given semantics in detail and suggest directions for further work.
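
As a loose illustration of representing an ER instance and one algebra-style operation in code, consider the sketch below; the representation, entity names, and operation are invented for this sketch and do not follow the paper's VDM formalization or its ER algebra.

```python
# Sketch of an ER instance and a selection-style operation over a relationship
# set. Names and structure are illustrative, not the paper's formal semantics.
from dataclasses import dataclass


@dataclass
class Entity:
    entity_set: str
    attributes: dict


@dataclass
class Relationship:
    name: str
    members: tuple  # pair of participating entities


# A tiny ER instance: employees related to a department.
alice = Entity("Employee", {"name": "Alice"})
bob = Entity("Employee", {"name": "Bob"})
sales = Entity("Department", {"dept": "Sales"})
works_in = [Relationship("WorksIn", (alice, sales)),
            Relationship("WorksIn", (bob, sales))]


def select_related(rels, rel_name, target):
    """Selection over a relationship set: entities related to `target`."""
    return [a for r in rels if r.name == rel_name
            for a, b in [r.members] if b is target]


print([e.attributes["name"] for e in select_related(works_in, "WorksIn", sales)])
```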

Relevance: 100.00%

Abstract:

We have compared the total and fine-mode aerosol optical depths (τ and τ_fine) retrieved by the Moderate Resolution Imaging Spectroradiometer (MODIS) onboard Terra and Aqua (2001-2005) with the equivalent parameters derived by the Aerosol Robotic Network (AERONET) at Kanpur (26.45°N, 80.35°E), northern India. MODIS Collection 005 (C005) derived τ at 0.55 μm was found to be in good agreement with the AERONET measurements. The τ_fine and η (= τ_fine/τ) were, however, significantly biased low in most matched cases. A new set of retrievals using an absorbing aerosol model (SSA ≈ 0.87) with increased visible surface reflectance provided improved τ and τ_fine at Kanpur. The new derivation of η also compares well qualitatively with an independent set of in situ measurements of accumulation mass fraction over much of southern India. This suggests that, although the MODIS land algorithm has limited information with which to derive the size properties of aerosols over land, more accurate parameterization of aerosol and surface properties within the existing C005 algorithm may improve the accuracy of size-resolved aerosol optical properties. The results presented in this paper indicate that the surface parameterization and the assumed aerosol properties in the MODIS C005 algorithm need to be reconsidered over the Indian region in order to retrieve more accurate aerosol optical and size properties, which are essential to quantify the impact of human-made aerosols on climate.
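
The fine-mode fraction η used above is simply the ratio τ_fine/τ. The short snippet below computes it, along with a simple mean-bias statistic between two retrievals, for invented values that are not the MODIS or AERONET data discussed here.

```python
# Illustrative computation of the fine-mode fraction eta = tau_fine / tau and a
# simple bias statistic between two retrievals. All numbers are invented and
# are not the MODIS or AERONET values discussed in the abstract.
import numpy as np

tau_modis = np.array([0.62, 0.48, 0.55])        # total AOD at 0.55 um (assumed)
tau_fine_modis = np.array([0.25, 0.20, 0.22])   # fine-mode AOD (assumed)
tau_fine_aeronet = np.array([0.35, 0.28, 0.30]) # reference fine-mode AOD (assumed)

eta_modis = tau_fine_modis / tau_modis
bias = np.mean(tau_fine_modis - tau_fine_aeronet)
print("eta (MODIS):", np.round(eta_modis, 2))
print("mean fine-mode bias vs AERONET:", round(float(bias), 3))  # negative = low bias
```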

Relevance: 100.00%

Abstract:

The granular flow down an inclined plane is simulated using the discrete element (DE) technique to examine the extent to which the dynamics of an unconfined dense granular flow can be well described by a hard particle model. First, we examine the average coordination number for the particles in the flow down an inclined plane using the DE technique with the linear contact model, with and without friction, and with the Hertzian contact model with friction. The simulations show that the average coordination number decreases below 1 for values of the spring stiffness corresponding to real materials, such as sand and glass, even when the angle of inclination is only 1° larger than the angle of repose. Additional measures of correlations in the system, such as the fraction of particles with multibody contact, the force ratio (the average ratio of the magnitudes of the largest and the second largest force on a particle), and the angle between the two largest forces on a particle, show no evidence of force chains or other correlated motions in the system. An analysis of the bond-orientational order parameter indicates that the flow is in the random state, as in event-driven (ED) simulations [V. Kumaran, J. Fluid Mech. 632, 107 (2009); J. Fluid Mech. 632, 145 (2009)]. The results of the two simulation techniques for the Bagnold coefficients (the ratio of the stress and the square of the strain rate) and the granular temperature (the mean square of the fluctuating velocity) are compared with the theory [V. Kumaran, J. Fluid Mech. 632, 107 (2009); J. Fluid Mech. 632, 145 (2009)] and are found to be in quantitative agreement. In addition, we also compare the collision frequency and the distribution of the precollisional relative velocities of particles in contact. The strong correlation effects exhibited by these two quantities in event-driven simulations [V. Kumaran, J. Fluid Mech. 632, 145 (2009)] are also found in the DE simulations. (C) 2010 American Institute of Physics. [doi:10.1063/1.3504660]
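
As a side illustration of the Bagnold scaling mentioned above (stress proportional to the square of the strain rate), this short sketch fits a Bagnold coefficient to invented stress/strain-rate data; it is not output from, or part of, the DE or ED simulations.

```python
# Illustration of Bagnold scaling: stress ~ B * (strain rate)^2.
# The data points are invented; this is not output from the DE simulations.
import numpy as np

gamma_dot = np.array([5.0, 10.0, 20.0, 40.0])  # strain rates (assumed units)
stress = 0.8 * gamma_dot ** 2                   # synthetic Bagnold-scaled stress

# Least-squares fit of the Bagnold coefficient B in stress = B * gamma_dot^2.
B = np.sum(stress * gamma_dot ** 2) / np.sum(gamma_dot ** 4)
print("fitted Bagnold coefficient:", round(float(B), 3))
```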

Relevance: 100.00%

Abstract:

It is well known that fatigue in concrete causes excessive deformations and cracking, leading to structural failures. Due to the quasi-brittle nature of concrete and the formation of a fracture process zone, the rate of fatigue crack growth depends on a number of parameters, such as the tensile strength, the fracture toughness, the loading ratio and, most importantly, the structural size. In this work, an analytical model is proposed for estimating fatigue crack growth in concrete by using the concepts of dimensional analysis and including the above parameters. Knowing the governed and governing parameters of the physical problem and using the concepts of self-similarity, a relationship is obtained between the different parameters involved. It is shown that the proposed fatigue law is able to capture the size effect in plain concrete and agrees well with different experimental results. Through a sensitivity analysis, it is shown that the structural size plays a dominant role in fatigue crack propagation, followed by the loading ratio and the initial crack length. (C) 2010 Elsevier Ltd. All rights reserved.
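
By way of illustration only, here is a sketch that integrates a generic Paris-type crack-growth law, da/dN = C (ΔK)^m, with the crack length tracked as a fraction of a structural size D; the constants, exponent, stress-intensity estimate, and stopping rule are assumptions and are not the size-dependent fatigue law proposed in the paper.

```python
# Generic Paris-type fatigue crack-growth integration (illustrative only; the
# constants, exponent, and stopping rule are assumptions, not the paper's law).
import math

C, m = 1.0e-4, 3.0    # assumed Paris constants (a in m, delta_K in MPa*sqrt(m))
D = 0.3               # assumed structural size (m)
sigma_range = 3.0     # assumed stress range (MPa)
a = 0.01              # initial crack length (m)

for cycle in range(1, 200_001):
    delta_K = sigma_range * math.sqrt(math.pi * a)  # crude SIF range estimate
    a += C * delta_K ** m                           # Paris-law increment per cycle
    if a >= 0.5 * D:                                # stop when the crack spans half of D
        print(f"crack reached 0.5*D after {cycle} cycles")
        break
else:
    print(f"crack length after 200000 cycles: {a:.4f} m")
```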

Relevance: 100.00%

Abstract:

Beavers are often found to be in conflict with human interests by creating nuisances such as building dams on flowing water (leading to flooding), blocking irrigation canals, cutting down timber, etc. At the same time, they contribute to raising water tables, increasing vegetation, etc. Consequently, maintaining an optimal beaver population is beneficial. Because of their diffusion externality (due to their migratory nature), strategies based on lumped parameter models are often ineffective. Using a distributed parameter model for the beaver population that accounts for their spatial and temporal behavior, an optimal control (trapping) strategy is presented in this paper that leads to a desired distribution of the animal density in a region in the long run. The optimal control solution presented embeds the solutions for a large number of initial conditions (i.e., it has a feedback form), which are otherwise nontrivial to obtain. The solution obtained can be used in real time by a nonexpert in control theory, since it involves only the use of neural networks trained offline. Proper orthogonal decomposition based basis-function design, followed by the use of these basis functions in a Galerkin projection, has been incorporated in the solution process as a model reduction technique. Optimal solutions are obtained through a "single network adaptive critic" (SNAC) neural-network architecture.
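
A minimal sketch of the proper orthogonal decomposition step only (extracting a reduced basis from snapshot data via the SVD), assuming synthetic snapshots; the beaver population PDE, the Galerkin projection of its dynamics, and the SNAC controller are not implemented here.

```python
# Minimal POD sketch: extract dominant spatial modes from snapshot data via SVD.
# The snapshot field is synthetic; the PDE model, the Galerkin projection of its
# dynamics, and the SNAC neural-network controller are not implemented here.
import numpy as np

rng = np.random.default_rng(1)
x = np.linspace(0.0, 1.0, 100)                 # 1-D spatial grid (assumed)
snapshots = np.column_stack([                  # columns = population snapshots
    np.sin(np.pi * x) + 0.05 * rng.standard_normal(x.size) for _ in range(20)
])

U, s, _ = np.linalg.svd(snapshots, full_matrices=False)
energy = np.cumsum(s ** 2) / np.sum(s ** 2)
r = int(np.searchsorted(energy, 0.99)) + 1     # modes capturing 99% of the energy
basis = U[:, :r]                               # reduced POD basis

# Galerkin-style reduction of one snapshot: project onto the basis and reconstruct.
coeffs = basis.T @ snapshots[:, 0]
print("retained modes:", r, "reconstruction error:",
      float(np.linalg.norm(basis @ coeffs - snapshots[:, 0])))
```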