Abstract:
Classification of large datasets is a challenging task in Data Mining. In the current work, we propose a novel method that compresses the data and classifies the test data directly in its compressed form. The work forms a hybrid learning approach that integrates data abstraction, frequent-item generation, compression, classification, and the use of rough sets.
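The abstract leaves the pipeline unspecified; below is a minimal sketch of the general idea, assuming frequent itemsets act as a compression dictionary and a test record is classified by similarity over its compressed encoding. Both assumptions are illustrative, not the paper's actual method.

```python
# Minimal sketch of classifying data in compressed form (illustrative only).
# Assumption: frequent itemsets act as the dictionary; each record is
# "compressed" to the set of dictionary patterns it contains, and a test
# record is classified by nearest neighbour over that compressed encoding.
from collections import Counter

def frequent_itemsets(records, min_support):
    """Single-item 'itemsets' kept for brevity; a real miner (e.g. Apriori)
    would also generate larger sets."""
    counts = Counter(item for rec in records for item in set(rec))
    return [frozenset([it]) for it, c in counts.items() if c >= min_support]

def compress(record, patterns):
    """Encode a record as the indices of the patterns it contains."""
    items = set(record)
    return frozenset(i for i, p in enumerate(patterns) if p <= items)

def classify(test_record, train_compressed, labels, patterns):
    """1-NN over compressed encodings using Jaccard similarity."""
    code = compress(test_record, patterns)
    def jaccard(a, b):
        return len(a & b) / len(a | b) if a | b else 0.0
    best = max(range(len(train_compressed)),
               key=lambda i: jaccard(code, train_compressed[i]))
    return labels[best]

train = [["a", "b", "c"], ["a", "b"], ["c", "d"], ["d", "e"]]
labels = ["pos", "pos", "neg", "neg"]
patterns = frequent_itemsets(train, min_support=2)
compressed = [compress(r, patterns) for r in train]
print(classify(["a", "c"], compressed, labels, patterns))  # -> 'pos'
```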
Abstract:
Methods of diagnosis in biomedical applications can be broadly divided into contact-based and non-contact methods. So far, ultrasound-based methods have been found to be the most favorable for non-contact, non-invasive diagnosis, especially in the case of tissue stiffness analysis. We report here the fabrication and characterization details of a new contact-based transducer system for the qualitative determination of the stiffnesses of non-piezoelectric substrates using the phenomenon of Surface Acoustic Waves (SAW). Preliminary trials to study the functionality of this system were carried out on various metallic and non-metallic substrates, and the results were found to be satisfactory. To confirm the suitability of this system for biomedical applications, similar trials have been conducted on tissue-mimicking phantoms with varying degrees of stiffness.
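The physical basis for inferring stiffness from SAW measurements can be illustrated with the standard Rayleigh-wave approximation; the sketch below uses Viktorov's well-known formula, not the paper's own transducer calibration, and the material values are illustrative.

```python
# Why SAW velocity indicates stiffness: for a Rayleigh surface wave the speed
# is tied to the shear modulus G and density rho. The approximation below
# (Viktorov's formula) is standard; the paper's calibration for tissue
# phantoms is not reproduced here.
import math

def rayleigh_speed(G, rho, nu):
    """Approximate Rayleigh wave speed (m/s) from shear modulus G (Pa),
    density rho (kg/m^3), and Poisson's ratio nu."""
    c_shear = math.sqrt(G / rho)
    return c_shear * (0.87 + 1.12 * nu) / (1 + nu)

# Stiffer substrate -> faster surface wave (illustrative values).
print(rayleigh_speed(G=25e9, rho=2700, nu=0.33))  # aluminium-like, ~2.8 km/s
print(rayleigh_speed(G=10e3, rho=1000, nu=0.49))  # soft-tissue-like, ~3 m/s
```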
Abstract:
This thesis studies binary time series models and their applications in empirical macroeconomics and finance. In addition to previously suggested models, new dynamic extensions are proposed to the static probit model commonly used in the previous literature. In particular, we are interested in probit models with an autoregressive model structure. In Chapter 2, the main objective is to compare the predictive performance of the static and dynamic probit models in forecasting the U.S. and German business cycle recession periods. Financial variables, such as interest rates and stock market returns, are used as predictive variables. The empirical results suggest that the recession periods are predictable and that dynamic probit models, especially models with the autoregressive structure, outperform the static model. Chapter 3 proposes a Lagrange Multiplier (LM) test for the usefulness of the autoregressive structure of the probit model. The finite sample properties of the LM test are considered with simulation experiments. Results indicate that the two alternative LM test statistics have reasonable size and power in large samples. In small samples, a parametric bootstrap method is suggested to obtain approximately correct size. In Chapter 4, the predictive power of dynamic probit models in predicting the direction of stock market returns is examined. The novel idea is to use the recession forecast (see Chapter 2) as a predictor of the stock return sign. The evidence suggests that the signs of the U.S. excess stock returns over the risk-free return are predictable both in and out of sample. The new "error correction" probit model yields the best forecasts, and it also outperforms other predictive models, such as ARMAX models, in terms of statistical and economic goodness-of-fit measures. Chapter 5 generalizes the analysis of the univariate models considered in Chapters 2-4 to the case of a bivariate model. A new bivariate autoregressive probit model is applied to predict the current state of the U.S. business cycle and growth rate cycle periods. Evidence of predictability of both cycle indicators is obtained, and the bivariate model is found to outperform the univariate models in terms of predictive power.
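A minimal sketch of how an autoregressive probit recursion can generate recession probabilities follows, assuming the common specification pi_t = omega + alpha * pi_{t-1} + beta'x_t with P(y_t = 1) = Phi(pi_t); the thesis's exact equations and coefficient estimates are not given in the abstract, so everything below is hypothetical.

```python
# Sketch of a recession-probability forecast from an autoregressive probit
# model (hypothetical coefficients; the thesis's actual estimates are not
# reproduced here). The index follows pi_t = omega + alpha*pi_{t-1} + beta'x_t
# and P(recession_t) = Phi(pi_t), where Phi is the standard normal CDF.
import numpy as np
from scipy.stats import norm

def autoregressive_probit_forecast(X, omega, alpha, beta, pi0=0.0):
    """Return the sequence of fitted recession probabilities Phi(pi_t)."""
    pi = pi0
    probs = []
    for x_t in X:
        pi = omega + alpha * pi + float(np.dot(beta, x_t))
        probs.append(norm.cdf(pi))
    return np.array(probs)

# Illustrative predictors: term spread and lagged stock return (made-up data).
X = np.array([[1.5, 0.02], [0.3, -0.05], [-0.4, -0.08], [-0.7, 0.01]])
probs = autoregressive_probit_forecast(
    X, omega=-0.8, alpha=0.6, beta=np.array([-0.9, -3.0]))
print(np.round(probs, 3))  # probability rises once the spread turns negative
```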
Abstract:
This paper presents a glowworm-swarm-based algorithm that finds solutions to the optimization of continuous functions with multiple optima. The algorithm is a variant of the well-known ant-colony optimization (ACO) technique, but with several significant modifications. Similar to how each moving region in the ACO technique is associated with a pheromone value, the agents in our algorithm carry a luminescence quantity along with them. Agents are thought of as glowworms that emit a light whose intensity is proportional to the associated luminescence, and they have a circular sensor range. The glowworms depend on a local-decision domain to compute their movements. Simulations demonstrate the efficacy of the proposed glowworm-based algorithm in capturing multiple optima of a multimodal function. This optimization scenario applies to problems where a collection of autonomous robots is used to form a mobile sensor network. In particular, we address the problem of detecting multiple sources of a general nutrient profile distributed spatially over a two-dimensional workspace using multiple robots.
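A minimal sketch of the glowworm scheme described above follows; the update rules and parameters are illustrative, and the paper's adaptive local-decision domain is simplified here to a fixed circular sensor range.

```python
# Minimal sketch of the glowworm-style search described above (parameter
# names and values are illustrative, not the paper's exact update rules).
import math, random

def peaks(x, y):
    """A multimodal test function with several local maxima."""
    return math.sin(x) * math.sin(y) + 1.5 * math.exp(-((x - 2)**2 + (y - 2)**2))

def gso(n_agents=50, steps=200, rho=0.4, gamma=0.6, step_size=0.03, r_sense=1.0):
    agents = [[random.uniform(-3, 5), random.uniform(-3, 5)] for _ in range(n_agents)]
    lum = [5.0] * n_agents                       # luminescence carried by each agent
    for _ in range(steps):
        # Luminescence update: decay plus reinforcement from local fitness.
        lum = [(1 - rho) * l + gamma * peaks(x, y) for l, (x, y) in zip(lum, agents)]
        for i, (xi, yi) in enumerate(agents):
            # Neighbours: brighter agents inside the circular sensor range.
            nbrs = [j for j, (xj, yj) in enumerate(agents)
                    if j != i and lum[j] > lum[i]
                    and math.hypot(xj - xi, yj - yi) < r_sense]
            if not nbrs:
                continue
            # Move one step towards a probabilistically chosen brighter neighbour.
            weights = [lum[j] - lum[i] for j in nbrs]
            j = random.choices(nbrs, weights=weights)[0]
            dx, dy = agents[j][0] - xi, agents[j][1] - yi
            d = math.hypot(dx, dy) or 1.0
            agents[i] = [xi + step_size * dx / d, yi + step_size * dy / d]
    return agents

final = gso()
print(len(final), "agents; clusters should sit near the maxima of `peaks`")
```

Because agents only react to brighter neighbours within sensing range, the swarm naturally splits into subgroups, one per optimum, rather than collapsing onto the single global maximum.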
Abstract:
Bandwidth allocation for multimedia applications in the case of network congestion and failure poses technical challenges due to the bursty and delay-sensitive nature of the applications. The growth of multimedia services on the Internet and the development of agent technology have led us to investigate new techniques for resolving bandwidth issues in multimedia communications. Agent technology is emerging as a flexible, promising solution for network resource management and QoS (Quality of Service) control in a distributed environment. In this paper, we propose an adaptive bandwidth allocation scheme for multimedia applications that deploys static and mobile agents. It is a run-time allocation scheme that functions at the network nodes. The technique adaptively finds an alternate patch-up route for every congested/failed link and reallocates the bandwidth for the affected multimedia applications. The designed method has been tested (analytically and through simulation) with various network sizes and conditions. The results are presented to assess the performance and effectiveness of the approach. This work also demonstrates some of the benefits of agent-based schemes in providing flexibility, adaptability, software reusability, and maintainability. (C) 2004 Elsevier Inc. All rights reserved.
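A minimal sketch of the patch-up step follows, assuming the reroute reduces to finding any path around the congested/failed link with enough residual bandwidth; the static/mobile agent machinery itself is not modeled, and the topology and figures are made up.

```python
# Sketch of the patch-up idea: when a link fails or congests, search for an
# alternate route between its endpoints with enough residual bandwidth and
# move the affected flow onto it. Topology, capacities, and the demand are
# illustrative; the paper's agent machinery is reduced to a plain search.
from collections import deque

def patchup_route(links, failed, demand):
    """BFS over links with residual capacity >= demand, avoiding `failed`.
    `links` maps (u, v) -> residual bandwidth; returns a node path or None."""
    src, dst = failed
    adj = {}
    for (u, v), cap in links.items():
        if (u, v) != failed and cap >= demand:
            adj.setdefault(u, []).append(v)
            adj.setdefault(v, []).append(u)
    parent, queue = {src: None}, deque([src])
    while queue:
        u = queue.popleft()
        if u == dst:
            path = []
            while u is not None:
                path.append(u); u = parent[u]
            return path[::-1]
        for v in adj.get(u, []):
            if v not in parent:
                parent[v] = u; queue.append(v)
    return None

links = {("A", "B"): 10, ("B", "C"): 4, ("A", "D"): 8, ("D", "C"): 8, ("B", "D"): 6}
print(patchup_route(links, failed=("B", "C"), demand=5))  # -> ['B', 'D', 'C']
```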
Abstract:
Examines the symbolic significance of major events and their security provision in the historical and contemporary context of the European Code of Police Ethics. Stresses the potential of major events to set new practical standards in policing and security technology and, in doing so, to necessitate the maintenance of professional ethical standards for policing in Europe.
Abstract:
Nanotechnology is a new technology that is generating a lot of interest among academics, practitioners, and scientists. Critical research is being carried out in this area all over the world. Governments are creating policy initiatives to promote developments in nanoscale science and technology. Private investment is also on a rising trend. A large number of academic institutions and national laboratories have set up research centers working on the multiple applications of nanotechnology. A wide range of applications is claimed for nanotechnology, from materials, chemicals, textiles, and semiconductors to wonder drug delivery systems and diagnostics. Nanotechnology is considered to be the next big wave of technology after information technology and biotechnology. In fact, nanotechnology holds the promise of advances that exceed those achieved in recent decades in computers and biotechnology. Much of the interest in nanotechnology may also stem from the fact that enormous monetary benefits are expected from nanotechnology-based products. According to the NSF, revenues from nanotechnology could touch $1 trillion by 2015. However, much of these benefits are projections. Realizing the claimed benefits requires the successful development of nanoscience and nanotechnology research efforts; that is, the journey from invention to innovation has to be completed. For this to happen, the technology has to flow from laboratory to market. Nanoscience and nanotechnology research efforts have to come out in the form of new products, new processes, and new platforms. India has also started its Nanoscience and Nanotechnology development program under its 10th Five Year Plan, and funds worth Rs. one billion have been allocated for Nanoscience and Nanotechnology research and development. The aim of this paper is to assess Nanoscience and Nanotechnology initiatives in India. We propose a conceptual model derived from the resource-based view of innovation. We developed a structured questionnaire to measure the constructs in the conceptual model. Responses have been collected from 115 scientists and engineers working in the field of Nanoscience and Nanotechnology. The responses have been further analyzed using Principal Component Analysis, Cluster Analysis, and Regression Analysis.
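A minimal sketch of the reported analysis pipeline follows, run on synthetic stand-ins for the 115 survey responses; the items, constructs, and outcome variable below are placeholders, not the paper's instrument.

```python
# Sketch of the reported analysis pipeline (PCA -> clustering -> regression)
# on synthetic stand-ins for the 115 survey responses; the constructs and
# the outcome variable are placeholders, not the paper's questionnaire.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
X = rng.normal(size=(115, 12))          # 12 Likert-style items (synthetic)
y = X[:, :3].sum(axis=1) + rng.normal(scale=0.5, size=115)  # outcome proxy

components = PCA(n_components=3).fit_transform(X)   # reduce items to constructs
clusters = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(components)
model = LinearRegression().fit(components, y)       # constructs -> outcome

print("component scores shape:", components.shape)
print("cluster sizes:", np.bincount(clusters))
print("R^2:", round(model.score(components, y), 3))
```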
Abstract:
This study addresses the issue of multilingualism in EU law. More specifically, it explores the implications of multilingualism for conceptualising legal certainty, a central principle of law in both domestic and EU legal systems. The main question addressed is how multilingualism and legal certainty may be reconciled in the EU legal system. The study begins with a discussion of the role of translation in drafting EU legislation and its implications for interpreting EU law at the European Court of Justice (ECJ). Uncertainty regarding the meaning of multilingual EU law and the interrelationship between multilingualism and ECJ methods of interpretation are explored. This analysis leads to questioning the importance of linguistic-semantic methods of interpretation, especially the role of comparing language versions for clarifying meaning and the ordinary meaning thesis, and to placing emphasis on other methods, especially the teleological, purpose-oriented method of interpretation. As regards the principle of legal certainty, the starting point is a two-dimensional concept consisting of both formal and substantive elements: of predictability and acceptability. Formal legal certainty implies that laws and adjudication, in particular, must be predictable. Substantive legal certainty is related to the rational acceptability of judicial decision-making, placing emphasis on its acceptability to the legal community in question. Contrary to predictability, which one might intuitively relate to linguistic-semantic methods of interpretation, the study suggests a new conception of legal certainty in which purpose, telos, and other dynamic methods of interpretation are of particular significance for meaning construction in multilingual EU law. Accordingly, the importance of purposive, teleological interpretation as the standard doctrine of interpretation in a multilingual legal system is highlighted. The focus on rational, substantive acceptability results in emphasising discourse among legal actors within the EU legal community and stressing the need to give reasons in favour of a proposed meaning in accordance with dynamic methods of interpretation, including considerations related to purposes, aims, objectives, and consequences. In this context, the role of ideal discourse situations and communicative action, taking the form of interaction within the EU legal community in an ongoing dialogue, especially in the preliminary ruling procedure, is brought into focus. In order for this dialogue to function, the ECJ must give persuasive, convincing, and acceptable reasons in justifying its decisions. This necessitates transparency, sincerity, and dialogue with the relevant audience.
Abstract:
This Master's Thesis defines the debt policy of the current European Union Member States towards the developing nations. Since no official EU policy for debt exists, it is defined here to include debt practices (loans and debt relief in development cooperation) and debt within the EU development policy framework. This study (1) describes how the issue of external debt appears in the development policy framework, (2) compares EU Member States' loans and debt relief to grants for the developing nations (1960s to the 2000s), and (3) measures the current orientation of each EU Member State's ODA between grant aid and loan aid using the Grant-Loan Index (GLI). Theoretical aspects include reasons for selecting between loans (Bouchet 1987) and grants (Odedokun 2004, O'Brien and Williams 2007), the policy context of the EU (Van Reisen 2007), and the meaning of external debt in the set-up between the North and the South. In terms of history, the events and impact of the colonial period (where the loans originated) are reviewed and compared in light of today's policies. Development assistance statistics are derived from the OECD DAC statistics portal, and EU development policy framework documents from the EU portal. Methodologically, the structure of this study comes from policy analysis (Barrien 1999, Hill 2008, Berndtson 2008), but it has been modified to fit the needs of studying a non-official policy. EU Member States are divided into three groups following Carbone (2007a), the Big-3, Northern, and Southern donors, based on common development assistance characteristics. The Grant-Loan Index is used to compare Carbone's model, which measures the quality of aid, to the GLI measuring the structure of aid. Results indicate that the EU-15 countries (active in debt practices) differ in terms of the timing, stability, and equality of debt practices in the long term (1960s to the 2000s). In terms of current practices (2000-2008), there lies a disparity between the actual practices and the way in which external debt is represented in the development policy framework: although debt practices form a relevant portion of total ODA for many EU-15 Member States, the issue itself plays a minor role in development policy documents. Carbone's group division applies well to the Grant-Loan Index's results, indicating that countries with similar development policy behaviour have similarities in debt policy behaviour, with one exception: Greece. On the basis of this study, it is concluded that the EU development policy framework content on external debt and the actual debt practices are not congruent. Understanding this disparity between the policy outline and the differences in long-term practices is relevant both to reaching the UN's Millennium Development Goals and to the actual process of developing development aid.
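A minimal sketch of a grant-versus-loan orientation measure follows. The GLI's exact formula is not given in the abstract, so the share-based form below is an assumption introduced purely for illustration, with made-up figures.

```python
# Sketch of a grant-versus-loan orientation measure. The thesis's exact
# Grant-Loan Index formula is not given in the abstract; this assumes the
# simple share-based form GLI = (grants - loans) / (grants + loans), which
# runs from +1 (pure grant aid) to -1 (pure loan aid). Figures are made up.
def grant_loan_index(grants, loans):
    total = grants + loans
    return (grants - loans) / total if total else 0.0

oda = {"DonorA": (900.0, 100.0), "DonorB": (400.0, 600.0)}  # USD millions
for donor, (g, l) in oda.items():
    print(f"{donor}: GLI = {grant_loan_index(g, l):+.2f}")
# DonorA: +0.80 (grant-oriented); DonorB: -0.20 (loan-leaning)
```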
Abstract:
The purpose of this study is to examine how transformation is defining feminist bioethics and to determine the nature of this transformation. Behind the quest for transformation is core feminism and its political implications, namely, that women and other marginalized groups have been given unequal consideration in society and the sciences and that this situation is unacceptable and should be remedied. The goal of the dissertation is to determine how feminist bioethicists integrate the transformation into their respective fields and how they apply the potential of feminism to bioethical theories and practice. On a theoretical level, feminist bioethicists wish to reveal how current ways of knowing are based on inequality. Feminists pay special attention to communal and political contexts and to the power relations endorsed by each community. In addition, feminist bioethicists endorse relational ethics, a relational account of the self in which the interconnectedness of persons is important. On the conceptual level, feminist bioethicists work with the beliefs, concepts, and practices that give us our world. As an example, I examine how feminist bioethicists have criticized and redefined the concept of autonomy. Feminist bioethicists emphasize relational autonomy, which is based on the conviction that social relationships shape moral identities and values. On the practical level, I discuss stem cell research as a test case for feminist bioethics and its ability to employ its methodologies. Analyzing these perspectives allowed me, first, to compare non-feminist and feminist accounts of stem cell ethics and, second, to analyze feminist perspectives on this novel biotechnology. Along with offering a critical evaluation of the stem cell debate, the study shows that sustainable stem cell policies should be grounded on empirical knowledge about how donors perceive stem cell research and the donation process. The study indicates that feminist bioethics should develop the use of empirical bioethics, which takes the nature of ethics seriously: ethical decisions are provisional and open to further consideration. In addition, the study shows that there is another area of development in feminist bioethics: the understanding of (moral) agency. I argue that agency should be understood to mean that actions create desires.
Abstract:
Let $X(t)$ be a right continuous temporally homogeneous Markov process, $T_t$ the corresponding semigroup and $A$ the weak infinitesimal generator. Let $g(t)$ be absolutely continuous and $\tau$ a stopping time satisfying $E_x\left(\int_0^\tau |g(t)|\,dt\right) < \infty$ and $E_x\left(\int_0^\tau |g'(t)|\,dt\right) < \infty$. Then for $f \in \mathcal{D}(A)$ with $f(X(t))$ right continuous, the identity
$$E_x\left[g(\tau)\,f(X(\tau))\right] - g(0)\,f(x) = E_x\left(\int_0^\tau g'(s)\,f(X(s))\,ds\right) + E_x\left(\int_0^\tau g(s)\,A f(X(s))\,ds\right)$$
is a simple generalization of Dynkin's identity ($g(t) \equiv 1$). With further restrictions on $f$ and $\tau$, the following identity is obtained as a corollary:
$$E_x\left(f(X(\tau))\right) = f(x) + \sum_{k=1}^{n-1} \frac{(-1)^{k+1}}{k!}\, E_x\left(\tau^k A^k f(X(\tau))\right) + \frac{(-1)^{n+1}}{(n-1)!}\, E_x\left(\int_0^\tau u^{n-1} A^n f(X(u))\,du\right).$$
These identities are applied to processes with stationary independent increments to obtain a number of new and known results relating the moments of stopping times to the moments of the stopped processes.
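For reference, setting $g(t) \equiv 1$ in the first identity (so that $g' \equiv 0$ and the first integral vanishes) recovers Dynkin's identity:

```latex
% Dynkin's identity: the special case g(t) \equiv 1 of the identity above.
\[
  E_x\bigl[f(X(\tau))\bigr] - f(x)
  = E_x\!\left(\int_0^{\tau} A f(X(s))\,ds\right)
\]
```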
Abstract:
As the virtual world grows more complex, finding a standard way of storing data becomes increasingly important. Ideally, each data item would be brought into the computer system only once. References to data items need to be cryptographically verifiable, so the data can maintain its identity while being passed around. This way there will be only one copy of the user's family photo album, while the user can use multiple tools to show or manipulate the album. Copies of the user's data could be stored on some of his family members' computers, on some of his own computers, but also at some online services he uses. When all actors operate over one replicated copy of the data, the system automatically avoids a single point of failure: the data will not disappear with one computer breaking or one service provider going out of business. One shared copy also makes it possible to delete a piece of data from all systems at once, at the user's request. In our research we tried to find a model that would make data manageable to users and make it possible to have the same data stored at various locations. We studied three systems, Persona, Freenet, and GNUnet, that suggest different models for protecting user data. The main application areas of the systems studied include securing online social networks, providing anonymous web access, and preventing censorship in file sharing. Each of the systems studied stores user data on machines belonging to third parties. The systems differ in the measures they take to protect their users from data loss, forged information, censorship, and monitoring. All of the systems use cryptography to secure the names used for the content and to protect the data from outsiders. Based on the knowledge gained, we built a prototype platform called Peerscape, which stores user data in a synchronized, protected database. Data items themselves are protected with cryptography against forgery but not encrypted, as the focus has been on disseminating the data directly among family and friends instead of letting third parties store the information. We turned the synchronizing database into a peer-to-peer web by exposing its contents through an integrated HTTP server. The REST-like HTTP API supports development of applications in JavaScript. To evaluate the platform's suitability for application development we wrote some simple applications, including a public chat room, a BitTorrent site, and a flower-growing game. During our early tests we came to the conclusion that using the platform for simple applications works well. As web standards develop further, writing applications for the platform should become easier. Any system this complex will have its problems, and we do not expect our platform to replace the existing web, but we are fairly impressed with the results and consider our work important from the perspective of managing user data.
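A minimal sketch of the self-verifying reference idea follows, using a content hash as the identifier and an HMAC tag standing in for whatever signature scheme Peerscape actually used; the key handling and field names below are placeholders, not the platform's wire format.

```python
# Sketch of the core idea: a data item is referenced by a cryptographic hash
# of its content, so any holder of the reference can verify a received copy.
# Note the body itself stays unencrypted, matching the design above; the HMAC
# tag is a stand-in for a real signature scheme and assumes a shared key.
import hashlib, hmac, json

def make_item(payload: dict, author_key: bytes):
    body = json.dumps(payload, sort_keys=True).encode()
    ref = hashlib.sha256(body).hexdigest()          # self-verifying reference
    tag = hmac.new(author_key, body, hashlib.sha256).hexdigest()
    return {"ref": ref, "body": body, "tag": tag}

def verify_item(item, author_key: bytes) -> bool:
    ok_ref = hashlib.sha256(item["body"]).hexdigest() == item["ref"]
    ok_tag = hmac.compare_digest(
        hmac.new(author_key, item["body"], hashlib.sha256).hexdigest(),
        item["tag"])
    return ok_ref and ok_tag                        # unmodified and unforged

key = b"family-shared-secret"                       # placeholder key material
item = make_item({"album": "summer", "photos": 42}, key)
print(verify_item(item, key))                       # True
item["body"] = item["body"].replace(b"42", b"43")
print(verify_item(item, key))                       # False: copy was altered
```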
Abstract:
This article describes recent developments in the design and implementation of various strategies towards the development of novel therapeutics using first principles from biology and chemistry. Strategies for multi-target therapeutics and network analysis, with a focus on cancer and HIV, are discussed. Methods for gene and siRNA delivery are presented, along with challenges and opportunities for siRNA therapeutics. Advances in protein design methodology and screening are described, with a focus on their application to the design of antibody-based therapeutics. Future advances in this area relevant to vaccine design are also mentioned.