870 results for Combines


Relevance:

10.00%

Abstract:

This article combines international practice and analytical contributions into a systematic and synthetic presentation of the evolution of peace operations from their modern inception in 1948 to the present. It seeks to serve a didactic purpose by proposing a basic structure for Brazilian scholars' burgeoning debate on peace operations and intervention, rather than a definitive characterization of blue-helmet practice. The progression of peace operations is traced through five analytical “generations,” each adding a crucial factor distinguishing it from its predecessors. Each generation is placed in relation to changes in the nature of conflict and in the interpretation of the foundational principles of peace operations, and links to broader theoretical issues in International Relations are made explicit at each stage.

Relevance:

10.00%

Abstract:

Within the development of motor vehicles, crash safety (e.g. occupant protection, pedestrian protection, low-speed damageability) is one of the most important attributes. In order to fulfil increasing requirements under shorter cycle times and rising pressure to reduce costs, car manufacturers keep intensifying the use of virtual development tools such as those in the domain of Computer Aided Engineering (CAE). For crash simulations, the explicit finite element method (FEM) is applied. The accuracy of the simulation process is highly dependent on the accuracy of the simulation model, including the midplane mesh. One of the roughest approximations typically made concerns the actual part thickness, which in reality can vary locally; however, for reasons of complexity, a constant thickness value is almost always defined throughout the entire part. On the other hand, correct consideration of thickness is a key enabler for precise fracture analysis within FEM. Thus, the availability of per-element thickness information, which does not exist explicitly in the FEM model, can significantly contribute to improved crash simulation quality, especially regarding fracture prediction. Even though the thickness is not explicitly available from the FEM model, it can be inferred from the original CAD geometric model through geometric calculations. This paper proposes and compares two thickness estimation algorithms, based on ray tracing and on nearest neighbour 3D range searches. A systematic quantitative analysis of the accuracy of both algorithms is presented, together with a thorough identification of the particular geometric arrangements under which their accuracy can be compared. These results enable the identification of each technique’s weaknesses and hint towards a new, integrated approach to the problem that linearly combines the estimates produced by each algorithm.
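
A minimal sketch of how the two estimators and their linear combination could look, assuming a midplane point with a unit normal and a triangulated outer CAD surface; the function names, the point sampling of the surface, and the fixed blend weight are illustrative assumptions, not the paper's implementation.

```python
import numpy as np
from scipy.spatial import cKDTree

def ray_triangle(origin, direction, tri):
    """Moller-Trumbore ray/triangle intersection; returns the hit
    distance along `direction`, or None if the ray misses."""
    v0, v1, v2 = tri
    e1, e2 = v1 - v0, v2 - v0
    h = np.cross(direction, e2)
    a = np.dot(e1, h)
    if abs(a) < 1e-12:          # ray parallel to the triangle plane
        return None
    f = 1.0 / a
    s = origin - v0
    u = f * np.dot(s, h)
    if u < 0.0 or u > 1.0:
        return None
    q = np.cross(s, e1)
    v = f * np.dot(direction, q)
    if v < 0.0 or u + v > 1.0:
        return None
    t = f * np.dot(e2, q)
    return t if t > 1e-9 else None

def thickness_ray(p, n, surface_tris):
    """Ray-based estimate: distance to the outer surface along +n
    plus the distance along -n."""
    def nearest_hit(direction):
        hits = [t for tri in surface_tris
                if (t := ray_triangle(p, direction, tri)) is not None]
        return min(hits) if hits else None
    up, down = nearest_hit(n), nearest_hit(-n)
    return up + down if up is not None and down is not None else None

def thickness_nn(p, surface_tree):
    """Nearest-neighbour estimate: twice the distance from the
    midplane point to the nearest sampled surface point."""
    d, _ = surface_tree.query(p)
    return 2.0 * d

def thickness_combined(p, n, surface_tris, surface_points, w=0.5):
    """Linear combination of both estimates (fixed weight for brevity)."""
    t_ray = thickness_ray(p, n, surface_tris)
    t_nn = thickness_nn(p, cKDTree(surface_points))
    return t_nn if t_ray is None else w * t_ray + (1.0 - w) * t_nn
```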

Relevance:

10.00%

Abstract:

More and more current software systems rely on non-trivial coordination logic to combine autonomous services, typically running on different platforms and often owned by different organizations. Often, however, coordination data is deeply entangled in the code and therefore difficult to isolate and analyse separately. COORDINSPECTOR is a software tool which combines slicing and program analysis techniques to isolate all coordination elements from the source code of an existing application. Such a reverse engineering process provides a clear view of the services actually invoked as well as of the orchestration patterns which bind them together. The tool analyses Common Intermediate Language (CIL) code, the native language of the Microsoft .Net Framework; therefore, the scope of application of COORDINSPECTOR is quite large, covering potentially any piece of code developed in any of the programming languages which compile to the .Net Framework. The tool generates graphical representations of the coordination layer and identifies the underlying business process orchestrations, rendering them as Orc specifications.
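
The abstract does not detail the analysis itself; as a toy illustration of the kind of first pass such a tool might perform, the sketch below scans ildasm-style CIL text for call instructions into namespaces commonly associated with coordination. This is a plain keyword scan for illustration only, not COORDINSPECTOR's slicing-based analysis, and the namespace list is an assumption.

```python
import re

# Namespaces whose calls we treat as candidate coordination
# primitives (an illustrative, not exhaustive, list).
COORDINATION_NAMESPACES = (
    "System.Threading",
    "System.Runtime.Remoting",
    "System.ServiceModel",
    "System.Net.Sockets",
)

# Matches "call"/"callvirt" instructions in disassembled CIL.
CALL_RE = re.compile(r"\b(?:call|callvirt)\s+.*?\[[\w.]+\]([\w.]+)::(\w+)")

def candidate_coordination_calls(cil_text):
    """Yield (type, method) pairs for calls into coordination namespaces."""
    for line in cil_text.splitlines():
        m = CALL_RE.search(line)
        if m and m.group(1).startswith(COORDINATION_NAMESPACES):
            yield m.group(1), m.group(2)

example = """
  IL_0001: call void [mscorlib]System.Threading.Monitor::Enter(object)
  IL_0010: callvirt instance void [mscorlib]System.IO.Stream::Close()
"""
print(list(candidate_coordination_calls(example)))
# [('System.Threading.Monitor', 'Enter')]
```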

Relevance:

10.00%

Abstract:

In developing countries, initiatives have often been undertaken to fight social and environmental problems. Since the 1990s, an increase can be seen in corporate social responsibility actions, as well as increasingly strong activity by civil society organizations. Twenty years ago, companies and civil society organizations stood far apart from each other, with often conflicting agendas and resistance to mutual collaboration. This reality has changed significantly. Besides the phenomenon of cross-sector partnerships, we can also observe the expansion of a particular organization type, the social business, which combines two objectives that were previously seen as incompatible: financial sustainability and the generation of social value. This article aims to discuss the factors that influence the results of a social business operating in three countries: Botswana, Brazil and Jordan. The results shed light on the challenges involved in constructing social businesses in developing countries, as well as on the very nature of those businesses, considering the social realities in which they operate.

Relevance:

10.00%

Abstract:

We present a novel data analysis strategy which, combined with subcellular fractionation and liquid chromatography-mass spectrometry (LC-MS) based proteomics, provides a simple and effective workflow for global drug profiling. Five subcellular fractions were obtained by differential centrifugation, followed by high-resolution LC-MS and complete functional regulation analysis. The methodology combines functional regulation and enrichment analysis into a single visual summary, enabling improved insight into the perturbations caused by drugs. We provide a statistical argument to demonstrate that even crude subcellular fractions lead to improved functional characterization. We demonstrate this data analysis strategy on data obtained in an MS-based global drug profiling study; however, the strategy can also be applied to other types of large-scale biological data.
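
Enrichment analysis of the kind combined here is commonly implemented as a hypergeometric over-representation test; the sketch below is a minimal, generic version of that test (the function name, inputs and toy numbers are assumptions, not the paper's code or data).

```python
from scipy.stats import hypergeom

def enrichment_pvalue(n_category_total, n_category_hits,
                      n_background, n_selected):
    """One-sided over-representation p-value for a functional category.

    n_background      -- all proteins quantified in the experiment
    n_category_total  -- background proteins annotated to the category
    n_selected        -- proteins regulated by the drug treatment
    n_category_hits   -- regulated proteins annotated to the category
    """
    # P(X >= n_category_hits) under the hypergeometric null
    return hypergeom.sf(n_category_hits - 1, n_background,
                        n_category_total, n_selected)

# Toy numbers: 40 of 200 regulated proteins fall in a category that
# covers 500 of 5000 background proteins.
print(enrichment_pvalue(500, 40, 5000, 200))
```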

Relevance:

10.00%

Abstract:

Nowadays there is ever more audiovisual information, and multimedia streams or files can be shared easily and efficiently. However, the tampering of video content, such as financial information, news, or videoconference sessions used in court, can have serious consequences given the importance of this type of information. Hence the need to ensure the authenticity and integrity of audiovisual information. This dissertation proposes an authentication system for H.264/Advanced Video Coding (AVC) video, called Autenticação de Fluxos utilizando Projecções Aleatórias (AFPA, stream authentication using random projections), whose authentication procedures are carried out at the level of each video frame. This scheme allows a more flexible kind of authentication, since it makes it possible to define a maximum limit on the modifications allowed between two frames. Authentication relies on a new image authentication technique that combines random projections with an error-correction mechanism applied to the data. Each video frame can thus be authenticated with a reduced set of parity bits of the respective random projection. Since video information is typically transported by unreliable protocols, it may suffer packet losses. To reduce the effect of packet losses on video quality and on the authentication rate, Unequal Error Protection (UEP) is used. For validation and comparison of the results, a classical system was implemented that authenticates video streams in the typical way, i.e., using digital signatures and hash codes. Both schemes were evaluated with respect to the overhead introduced and the authentication rate. The results show that, for a high-quality video, the AFPA system reduces the authentication overhead by a factor of four compared with the scheme based on digital signatures and hash codes.
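
A minimal sketch of the random-projection idea underlying AFPA, assuming a projection seed shared between sender and verifier; the parity-bit/error-correction stage and the UEP described above are omitted, and all names and thresholds are illustrative assumptions.

```python
import numpy as np

def frame_signature(frame, seed, k=128):
    """Project a frame onto k seeded random directions (sketch only;
    AFPA additionally derives parity bits from this projection)."""
    rng = np.random.default_rng(seed)
    x = frame.astype(np.float64).ravel()
    R = rng.standard_normal((k, x.size)) / np.sqrt(x.size)
    return R @ x

def authenticate(frame, reference_signature, seed, tolerance=5.0):
    """Accept the frame if its projection stays within `tolerance` of
    the reference, allowing a bounded amount of modification."""
    distance = np.linalg.norm(frame_signature(frame, seed)
                              - reference_signature)
    return distance <= tolerance

# Toy usage with an 8-bit grayscale frame.
frame = np.random.default_rng(0).integers(0, 256, size=(64, 64))
sig = frame_signature(frame, seed=42)
print(authenticate(frame, sig, seed=42))      # True: unmodified
tampered = frame.copy()
tampered[:8, :8] = 255
print(authenticate(tampered, sig, seed=42))   # likely False: tampered
```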

Relevance:

10.00%

Abstract:

In the aftermath of a large-scale disaster, agents' decisions derive from self-interested (e.g. survival), common-good (e.g. rescue of victims) and teamwork (e.g. fire extinction) motivations. However, current decision-theoretic models are either purely individual or purely collective, and find it difficult to deal with motivational attitudes; on the other hand, mental-state based models find it difficult to deal with uncertainty. We propose a hybrid approach, CvI-JI, that combines: i) collective 'versus' individual (CvI) decisions, founded on the Markov decision process (MDP) quantitative evaluation of joint actions, and ii) a joint-intentions (JI) formulation of teamwork, founded on the belief-desire-intention (BDI) architecture of general mental-state based reasoning. The CvI-JI evaluation explores the resulting performance improvement.
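
The MDP side of such a hybrid can be illustrated with standard value iteration over joint actions; the sketch below is a generic textbook version under made-up toy dynamics, not the paper's CvI-JI model.

```python
import numpy as np

def value_iteration(P, R, gamma=0.95, tol=1e-8):
    """Quantitatively evaluate joint actions in a finite MDP.

    P[a][s][s'] -- transition probability under joint action a
    R[a][s]     -- expected reward of joint action a in state s
    Returns the optimal value per state and the greedy joint action.
    """
    n_actions, n_states = R.shape
    V = np.zeros(n_states)
    while True:
        # Q[a][s]: value of committing to joint action a in state s
        Q = R + gamma * np.einsum("ast,t->as", P, V)
        V_new = Q.max(axis=0)
        if np.max(np.abs(V_new - V)) < tol:
            return V_new, Q.argmax(axis=0)
        V = V_new

# Toy example: 2 joint actions, 3 states, random dynamics.
rng = np.random.default_rng(1)
P = rng.dirichlet(np.ones(3), size=(2, 3))   # row-stochastic transitions
R = rng.uniform(size=(2, 3))
V, policy = value_iteration(P, R)
print(V, policy)
```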

Relevance:

10.00%

Abstract:

This study investigates Portuguese companies’ use of the Internet to communicate social responsibility information, and the factors that affect this use. It examines the characteristics of companies that influence the prominence of social responsibility information on the Internet. Firm-specific factors that explain social responsibility disclosure (SRD) by companies operating in a European country in which capital market fund-raising is not regarded as an important source of financing are analysed. The results are interpreted through the lens of a “political economy” framework which combines stakeholder and legitimacy theory perspectives, according to which companies disclose social responsibility information to present a socially responsible image, so that they can legitimise their behaviour to their stakeholder groups and influence external perceptions of their reputation. Results suggest that a theoretical framework combining stakeholder and legitimacy theories may provide an explanatory basis for SRD by Portuguese companies. However, this study does not provide enough evidence to conclude that the prominence given to CSR activities on Portuguese companies' websites is linked to relationships with their stakeholders.

Relevance:

10.00%

Abstract:

Characteristics of tunable wavelength filters based on a-SiC:H multi-layered stacked cells are studied both theoretically and experimentally. Results show that the light-activated photonic device combines the demultiplexing operation with simultaneous photodetection and self-amplification of an optical signal. The sensor is a bias-wavelength, current-controlled device that makes use of changes in the wavelength of the background to control the power delivered to the load, acting as a photonic active filter. Its gain depends on the background wavelength, which controls the electric field profile across the device.

Relevance:

10.00%

Abstract:

Characteristics of tunable wavelength pi'n/pin filters based on a-SiC:H multilayered stacked cells are studied both experimentally and theoretically. Results show that the device combines the demultiplexing operation with simultaneous photodetection and self-amplification of the signal. An algorithm to decode the multiplexed signal is established. A capacitive active band-pass filter model is presented and supported by an electrical simulation of the state variable filter circuit. Experimental and simulated results show that the device acts as a state variable filter: it combines the properties of active high-pass and low-pass filter sections into a capacitive active band-pass filter, using a changing photo capacitance to control the power delivered to the load.
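
The band-pass behaviour attributed to the state variable filter model can be summarised by the standard second-order transfer function below; this is the textbook form such models assume, not an equation taken from the paper.

```latex
% Second-order band-pass response of a state variable filter,
% with centre frequency \omega_0 and quality factor Q:
H_{BP}(s) = \frac{(\omega_0/Q)\, s}{s^{2} + (\omega_0/Q)\, s + \omega_0^{2}}
```

In the electrical analogue, the capacitors set the centre frequency; here, presumably, that role is played by the changing photo capacitance mentioned above.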

Relevance:

10.00%

Abstract:

Many current e-commerce systems personalize the content they show to users. In this sense, recommender systems make personalized suggestions and provide information about the items available in the system. Nowadays there is a vast number of methods, including data mining techniques, that can be employed for personalization in recommender systems. However, these methods are still quite vulnerable to limitations and shortcomings related to the recommender environment. In order to deal with some of them, in this work we implement a recommendation methodology in a recommender system for tourism, where classification based on association is applied. Classification based on association methods, also named associative classification methods, are an alternative data mining technique which combines concepts from classification and association in order to allow association rules to be employed in a prediction context. The proposed methodology was evaluated in case studies, where we verified that it is able to mitigate limitations found in recommender systems and to enhance recommendation quality.
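
Associative classification can be illustrated with a minimal rule-matching predictor: class association rules (antecedent itemset, predicted class, confidence) are mined beforehand, and prediction picks the highest-confidence rule whose antecedent is contained in the user's profile. The rules and items below are made-up examples, not data from the paper.

```python
from dataclasses import dataclass

@dataclass
class Rule:
    antecedent: frozenset   # items that must appear in the profile
    consequent: str         # recommended class/category
    confidence: float       # rule confidence from the mining step

def predict(profile, rules, default="popular"):
    """Return the consequent of the best matching rule, if any."""
    matching = [r for r in rules if r.antecedent <= profile]
    if not matching:
        return default      # fall back when no rule fires
    return max(matching, key=lambda r: r.confidence).consequent

rules = [
    Rule(frozenset({"beach", "summer"}), "coastal-resort", 0.82),
    Rule(frozenset({"museum"}), "city-break", 0.64),
]
print(predict(frozenset({"beach", "summer", "family"}), rules))
# -> coastal-resort
```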

Relevance:

10.00%

Abstract:

This article analyses Luis Alberto Pereira's film «Hans Staden» (1999), based on Hans Staden's book «Duas Viagens ao Brasil» (1557), and Jean Rouch's documentary «Les Maîtres Fous» (1955), in light of anthropophagic thought. These works focus on the cultural clash between the “civilized” and the “savage”, between cannibal ritual and anthropophagic ritual, between «Us» and the «Others», encounters that allow a more concrete analysis of the conception of alterity. Cinematic representation offers an approach to the anthropophagic concept of appropriating an external culture in order subsequently to reproduce it in an interpretation shaped by the Western conception of what the rituals in question represent. The Movimento Antropófago (Anthropophagic Movement), through its avant-garde character, reconciles the Brazilian founding matrix while at the same time exalting irreverence in analysis, and in this article it serves as the theoretical and practical foundation for the decomposition of the examples. Anthropophagic thought and its applicability to the selected examples also allow a deeper study of the European imaginary as a recreation of dated accounts by travellers or colonizers, since the maintenance of a historically stereotyped repertoire serves as a way of “legitimising” conceptions. The two works focus on the representation of the indigenous and the African, the “savage”, in the construction of the Western, “civilized”, imaginary, a dichotomy that allows us to demystify intercultural relations.

Relevance:

10.00%

Abstract:

The main goal of this work is to solve mathematical programs with complementarity constraints (MPCC) using nonlinear programming (NLP) techniques. A hyperbolic penalty function is used to solve MPCC problems by including the complementarity constraints in the penalty term. This penalty function [1] is twice continuously differentiable and combines features of both exterior and interior penalty methods. A set of AMPL problems from MacMPEC [2] is tested and a comparative study is performed.
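
For a single constraint g(x) <= 0, the hyperbolic penalty is usually written in the form below, with penalty parameters lambda > 0 and tau > 0; this is the standard form from the literature the abstract cites, reproduced from memory rather than from the paper itself.

```latex
% Hyperbolic penalty term for a constraint g(x) <= 0:
P(g(x); \lambda, \tau) = \lambda\, g(x) + \sqrt{\lambda^{2} g(x)^{2} + \tau^{2}}
```

For strongly satisfied constraints (g(x) far below 0) the term vanishes, behaving like an interior penalty, while for violated constraints it grows roughly like 2*lambda*g(x), behaving like an exterior penalty, and it is smooth everywhere, matching the differentiability claimed above.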

Relevance:

10.00%

Abstract:

In real optimization problems, the analytical expression of the objective function is usually not known, nor are its derivatives, or they are too complex to handle. In these cases it becomes essential to use optimization methods in which the calculation of the derivatives, or the verification of their existence, is not necessary: direct search methods or derivative-free methods are one solution. When the problem has constraints, penalty functions are often used. Unfortunately, choosing the penalty parameters is frequently very difficult, because most strategies for choosing them are heuristic. Filter methods appeared as an alternative to penalty functions. A filter algorithm introduces a function that aggregates the constraint violations and constructs a biobjective problem, in which a step is accepted if it reduces either the objective function or the constraint violation. This makes filter methods less parameter-dependent than penalty functions. In this work we present a new direct search method, based on simplex methods, for general constrained optimization that combines the features of simplex and filter methods. The method neither computes nor approximates derivatives, penalty constants or Lagrange multipliers. The basic idea of the simplex filter algorithm is to construct an initial simplex and use it to drive the search. We illustrate the behavior of our algorithm through some examples. The proposed methods were implemented in Java.
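
The filter acceptance rule described in this abstract can be made concrete as a non-domination test over pairs (h, f), where h aggregates the constraint violation and f is the objective value; the code below is a generic sketch of that rule, not the authors' Java implementation.

```python
class Filter:
    """A filter stores (h, f) pairs; a trial point is acceptable if no
    stored pair dominates it, i.e. is at least as good in both h and f."""

    def __init__(self):
        self.entries = []   # list of (violation h, objective f)

    def acceptable(self, h, f):
        return not any(h0 <= h and f0 <= f for h0, f0 in self.entries)

    def add(self, h, f):
        # Drop entries the new pair dominates, then insert it.
        self.entries = [(h0, f0) for h0, f0 in self.entries
                        if not (h <= h0 and f <= f0)]
        self.entries.append((h, f))

flt = Filter()
flt.add(h=2.0, f=5.0)
print(flt.acceptable(1.0, 6.0))   # True: less infeasible, worse objective
print(flt.acceptable(3.0, 6.0))   # False: dominated by (2.0, 5.0)
```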

Relevance:

10.00%

Abstract:

The filter method is a technique for solving nonlinear programming problems. A filter algorithm has two phases in each iteration: the first reduces a measure of infeasibility, while the second reduces the objective function value. In real optimization problems, the objective function is usually not differentiable or its derivatives are unknown. In these cases it becomes essential to use optimization methods in which the calculation of the derivatives, or the verification of their existence, is not necessary: direct search methods or derivative-free methods are examples of such techniques. In this work we present a new direct search method, based on simplex methods, for general constrained optimization that combines the features of simplex and filter methods. The method neither computes nor approximates derivatives, penalty constants or Lagrange multipliers.
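
The simplex side of such a search, reflecting the worst vertex through the centroid of the remaining vertices, can be sketched as below. This is the classic Nelder-Mead style step shown generically on a toy objective; in a filter variant the acceptance test would be the filter's non-domination rule on (h, f) rather than a plain objective comparison.

```python
import numpy as np

def reflect_worst(simplex, values, alpha=1.0):
    """One reflection step: mirror the worst vertex through the centroid
    of the remaining vertices (the move that drives a simplex search)."""
    worst = int(np.argmax(values))
    others = np.delete(simplex, worst, axis=0)
    centroid = others.mean(axis=0)
    trial = centroid + alpha * (centroid - simplex[worst])
    return worst, trial

# Toy usage on f(x) = ||x||^2 with an initial simplex in R^2.
f = lambda x: float(np.dot(x, x))
simplex = np.array([[2.0, 2.0], [1.0, 0.0], [0.0, 1.0]])
values = [f(v) for v in simplex]
worst, trial = reflect_worst(simplex, values)
if f(trial) < values[worst]:          # a filter variant would test
    simplex[worst] = trial            # (h, f) non-domination instead
print(simplex)
```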