845 results for Objective element


Relevance:

20.00%

Publisher:

Abstract:

In the traditional paradigm, large power plants supply the reactive power required at the transmission level, while capacitors and transformer tap changers are used at the distribution level. However, in the near future it will be necessary to schedule both active and reactive power at the distribution level, due to the high number of resources connected there. This paper proposes a new multi-objective methodology for optimal resource scheduling that considers distributed generation, electric vehicles and capacitor banks in a joint active and reactive power schedule. The proposed methodology minimizes the cost of all distributed resources (economic perspective) and the voltage magnitude deviation in all buses (technical perspective). The Pareto front is determined and a fuzzy-based mechanism is applied to select the best compromise solution. The methodology has been tested on the 33-bus distribution network. The case study presents results for three scenarios, covering the economic, technical and multi-objective perspectives, and demonstrates the importance of incorporating reactive power scheduling in the distribution network and of using the multi-objective perspective to obtain the best compromise between the economic and technical objectives.
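The fuzzy-based selection of the best compromise solution is commonly implemented with linear membership functions over the Pareto front; the short sketch below illustrates that step under this assumption (the function name and the example objective pairs are illustrative, not taken from the paper).

```python
# Minimal sketch of fuzzy best-compromise selection over a Pareto front.
# Each solution is a tuple of objective values to be minimized
# (e.g., total cost, sum of voltage-magnitude deviations).

def best_compromise(pareto_front):
    n_obj = len(pareto_front[0])
    f_min = [min(sol[k] for sol in pareto_front) for k in range(n_obj)]
    f_max = [max(sol[k] for sol in pareto_front) for k in range(n_obj)]

    def membership(sol):
        # Linear membership: 1 at the per-objective minimum, 0 at the maximum.
        mu = []
        for k in range(n_obj):
            span = f_max[k] - f_min[k]
            mu.append(1.0 if span == 0 else (f_max[k] - sol[k]) / span)
        return sum(mu) / n_obj  # aggregate membership (same ranking as normalized form)

    return max(pareto_front, key=membership)

# Example: three non-dominated (cost, voltage-deviation) pairs.
front = [(100.0, 0.08), (120.0, 0.05), (150.0, 0.03)]
print(best_compromise(front))  # picks the middle, balanced solution
```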

Relevance:

20.00%

Publisher:

Abstract:

In this paper, we formulate the electricity retailers' short-term decision-making problem in a liberalized retail market as a multi-objective optimization model. Retailers with light physical assets, such as generation and storage units in the distribution network, are considered. Following advances in smart grid technologies, electricity retailers can employ incentive-based demand response (DR) programs, in addition to their physical assets, to effectively manage the risks of market price and load variations. In this model, DR scheduling is performed simultaneously with the dispatch of the generation and storage units. The ultimate goal is to find the optimal values of the hourly financial incentives offered to end-users. The proposed model considers the capacity obligations imposed on retailers by the grid operator: the profit-seeking retailer also aims to minimize peak demand, so as to avoid high capacity charges in the form of grid tariffs or penalties. The multi-objective problem is solved with the non-dominated sorting genetic algorithm II (NSGA-II), a fast and elitist multi-objective evolutionary algorithm. A case study is solved to illustrate the performance of the proposed methodology. Simulation results show the effectiveness of the model for designing incentive-based DR programs and the efficiency of NSGA-II in solving the retailers' multi-objective problem.
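At the core of NSGA-II is the Pareto-dominance ranking of candidate solutions before crowding-distance selection; the sketch below illustrates that step for this kind of problem under assumed toy objectives, negative expected profit and peak demand, both to be minimized (all names and numbers are illustrative, not the paper's model).

```python
# Minimal sketch of the Pareto-dominance ranking at the core of NSGA-II.
# Each candidate incentive schedule is scored on two minimization objectives:
# (negative expected profit, peak demand). Names and numbers are illustrative.

def dominates(a, b):
    """True if objective vector a is no worse than b everywhere and strictly better somewhere."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def first_front(population, evaluate):
    scored = [(cand, evaluate(cand)) for cand in population]
    return [c for c, f in scored
            if not any(dominates(g, f) for _, g in scored if g is not f)]

# Toy example: candidates are flat hourly incentives (EUR/kWh).
def evaluate(incentive):
    profit = 1000 * incentive - 2000 * incentive ** 2   # concave toy profit model
    peak = 500 * (1 - incentive)                        # toy peak-demand model
    return (-profit, peak)

candidates = [0.05, 0.10, 0.20, 0.40]
print(first_front(candidates, evaluate))   # the non-dominated incentives
```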

Relevance:

20.00%

Publisher:

Abstract:

The objective of this work is the development of frameworks for automated software testing. This type of testing is normally associated with the evolutionary model and with agile software development methodologies, whereas manual testing is related to the waterfall model and traditional methodologies. A comparative study of the existing types of methodologies and tests was therefore carried out, in order to decide which best suited the project and to answer the question "Is it really worth performing (automated) tests?". Once the study was completed, two frameworks were developed: the first for the implementation of functional and unit tests without dependencies, to be used by LabOrders' curricular interns, and the second for the implementation of unit tests with external dependencies on databases and services, to be used by the company's employees. Over the last two decades agile software development methodologies have not stopped evolving, yet automation tools have not kept up with this progress. Many areas are not covered by the tests and therefore some have to be performed manually. Accordingly, several innovative features were created to increase test coverage and make the frameworks as intuitive as possible, namely: 1. Automatic file download through Internet Explorer 9 (and later versions). 2. Analysis of the content of .pdf files (from within the tests). 3. Retrieval of web elements and their attributes through jQuery code, using the WebDriver API with PHP bindings. 4. Display of custom error messages when a given element cannot be found. The implemented frameworks are also prepared for the creation of other tests (load, integration, regression) that may become necessary in the future. They were tested in a working context by the employees and clients of the company where the master's project was carried out, and the results showed that adopting a software development methodology with automated tests can increase productivity, reduce failures and help organizations keep their projects within budget and on schedule.
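Features 3 and 4 above can be illustrated with a short sketch; it uses the Python WebDriver bindings instead of the PHP bindings used in the project, and assumes jQuery is already loaded on the page under test, so it is only an analogous example, not the framework's actual code.

```python
# Illustrative sketch of features 3 and 4: run a jQuery selector through the
# WebDriver API and raise a descriptive error when nothing matches.
# Python bindings are used here; the project itself used PHP bindings.

from selenium import webdriver

driver = webdriver.Chrome()
driver.get("https://example.com/login")   # illustrative URL

# Execute jQuery inside the browser and get the matched DOM nodes back
# as WebElement objects.
elements = driver.execute_script("return jQuery('input.required').get();")

if not elements:
    # Custom, descriptive error message instead of a bare "element not found".
    raise AssertionError("No element matched the jQuery selector "
                         "'input.required' on " + driver.current_url)

# Read an attribute from each matched element.
names = [el.get_attribute("name") for el in elements]
print(names)
driver.quit()
```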

Relevance:

20.00%

Publisher:

Abstract:

Objective: The epilepsy associated with hypothalamic hamartomas constitutes a syndrome with peculiar seizures, usually refractory to medical therapy, mild cognitive delay, behavioural problems and multifocal spike activity in the scalp electroencephalogram (EEG). A cortical origin of the spikes has been widely assumed but not specifically demonstrated. Methods: We present the results of a source analysis of interictal spikes from 4 patients (age 2-25 years) with epilepsy and hypothalamic hamartoma, using scalp EEG recordings (32 electrodes) and realistic boundary element models constructed from volumetric magnetic resonance images (MRI). Multifocal spike activity was the most common finding, distributed mainly over the frontal and temporal lobes. A spike classification based on scalp topography was performed, and averaging within each class was used to improve the signal-to-noise ratio. Single moving dipole models were used, as well as the RAP-MUSIC algorithm. Results: All spikes with a good signal-to-noise ratio were best explained by initial deep sources in the neighbourhood of the hamartoma, with late sources located in the cortex. In no patient could the spike activity be explained by a combination of cortical sources alone. Conclusions: Overall, the results demonstrate a consistent origin of spike activity in the subcortical region in the neighbourhood of the hamartoma, with late spread to cortical areas.
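In generic notation (not the paper's own symbols), the single moving dipole fit referred to above amounts to a least-squares problem at each latency: \(\hat{\mathbf r},\hat{\mathbf q} = \arg\min_{\mathbf r,\mathbf q}\, \lVert \mathbf v(t) - \mathbf L(\mathbf r)\,\mathbf q \rVert^{2}\), where \(\mathbf v(t)\) is the vector of the 32 scalp potentials, \(\mathbf L(\mathbf r)\) is the lead field of the boundary element head model for a dipole at position \(\mathbf r\), and \(\mathbf q\) is the dipole moment. For a fixed position the optimal moment is given by the pseudo-inverse, \(\mathbf q = \mathbf L(\mathbf r)^{+}\mathbf v(t)\), so only the position requires a nonlinear search; RAP-MUSIC instead scans candidate source locations against the signal subspace of the averaged spikes.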

Relevance:

20.00%

Publisher:

Abstract:

Dissertation presented to obtain the Ph.D. degree in Biology

Relevance:

20.00%

Publisher:

Abstract:

Adhesive joints have been used in several fields and have countless practical applications. Because they are easy and quick to manufacture, single-lap joints (SLJ) are a very common configuration. Increased strength, weight reduction and corrosion resistance are some of the advantages this type of joint offers over traditional joining processes. However, the stress concentration at the ends of the overlap is one of its main drawbacks. Few accurate design techniques exist for the variety of joints found in real situations, which is an obstacle to the use of adhesive joints in structural applications. The present work compares different analytical and numerical methods for predicting the strength of SLJ with different overlap lengths (LO). The fundamental objective is to assess which method best predicts SLJ strength. Adhesive joints between aluminium substrates were produced using a brittle epoxy adhesive (Araldite® AV138), a moderately ductile epoxy adhesive (Araldite® 2015) and a ductile polyurethane adhesive (SikaForce® 7888). Several analytical methods and two numerical methods were considered, namely Cohesive Damage Models (CDM) and the Extended Finite Element Method (XFEM), allowing a comparative analysis. The study provided a critical insight into the capabilities of each method as a function of the characteristics of the adhesive used. The analytical methods only work reasonably well under very specific conditions. The CDM analysis with a triangular law proved to be a very accurate method, except for adhesives that are highly ductile. On the other hand, the XFEM analysis proved to be a poorly suited technique, especially for mixed-mode damage growth.
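For context, the simplest analytical criterion for a single-lap joint (not necessarily one of those compared in this work) assumes a uniform shear stress over the bond and predicts a failure load \(P_{\max} = \tau_f\, b\, L_O\), where \(\tau_f\) is the adhesive shear strength, \(b\) the joint width and \(L_O\) the overlap length; more elaborate analytical models correct this by accounting for the shear and peel stress peaks at the overlap ends, which is why the uniform-stress estimate tends to over-predict the strength of long overlaps bonded with brittle adhesives.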

Relevance:

20.00%

Publisher:

Abstract:

Clinical analyses are a precious element among diagnostic and therapeutic tests, as they allow an enormous variety of information on the state of health of a user. The aim of the laboratory is to supply reliable, relevant and timely analytical information on biological samples. In health-related matters, and in accordance with the objective of the laboratory, their importance is vital, as is the assurance that all the tools are in place for the fulfillment of its purpose. A good laboratory cycle, which includes the pre-analytical, analytical and post-analytical phases, is crucial in fulfilling the laboratory's mission rapidly and efficiently. The present work, "Error in the pre-analytical phase: non-compliant samples versus procedures", carried out as part of the Master's in Quality and Organization in the Clinical Analyses Laboratory, aims to emphasize the importance of the pre-analytical phase, as the phase containing most errors, which eventually lead to delays in the issue of results or make those results less reliable than desired, and can thus lead to false diagnoses and wrong clinical decisions. This phase, which starts with the medical request and ends with the arrival of the biological samples at the laboratory, entails a variety of procedures, which require the intervention of different players, not to mention a great number of factors which influence the sample and the results. These factors, capable of somehow altering the "truth" of the analytical results, must be identified and taken into consideration so that we may ensure that the results support precise diagnoses and a correct evaluation of the user's condition. Those collections which, due to any type of divergence, do not originate samples capable of fulfilling their purpose, and are therefore not compliant with the objective, constitute an important source of error in this pre-analytical phase. In the present study, we consulted data on non-compliant blood and urine samples detected at the laboratory during the 1st quarter of 2012, to find out the type of faults that happen and their frequency. The clinical analysis technicians working at the laboratory were asked to fill out a questionnaire regarding their daily procedures, forming in this way the population for the second part of the project. Completed and returned anonymously, this questionnaire intended to investigate the procedures used when performing collections and, hypothetically, to confront them with the verified non-compliant samples. In the first semester of 2012, and out of a total of 25319 users, 146 collections had to be repeated due to non-compliance. The "uncollected sample" was the most frequent non-compliance (>50%) versus "incorrect identification", which had only one occurrence. There were also non-compliance issues that went unregistered, such as "inadequate preparation" and "inappropriately packaged sample". The technicians proved to be competent professionals, with knowledge of the tasks they have to perform and eager to carry them out efficiently.
We will certainly not be able to eliminate error, but recognizing its presence, detecting it and evaluating its frequency will help to decrease its occurrence and improve quality in the pre-analytical phase, giving it the relevance it has within the laboratory process.

Relevance:

20.00%

Publisher:

Abstract:

This study aimed to compare the radiographic characteristics of patients with pulmonary tuberculosis (TB) and human immunodeficiency virus (HIV) infection with those of HIV-negative patients. In all, 275 TB patients attending the outpatient clinics at the University Hospital/UFPE were studied from January 1997 to March 1999. Thirty-nine (14.2%) of them were HIV+, with a higher frequency of males in this group (p=0.044). Seventy-five percent of the HIV+ patients and 19% of the HIV- patients had a negative tuberculin test (PPD) (p < 0.001). The proportion of positive sputum smears in the two groups was similar. The radiological finding most strongly associated with co-infection was the absence of cavitation (p < 0.001). It may therefore be concluded that the lack of cavitation in patients with pulmonary TB may be considered a useful indicator of the need to investigate HIV infection. This approach could contribute to increasing the effectiveness of local health services by offering appropriate treatment to co-infected patients.

Relevance:

20.00%

Publisher:

Abstract:

The main objective of this work was the development of polymeric structures, gels and films, from the dissolution of the chitin-glucan complex (CGC) in biocompatible ionic liquids for biomedical applications. Like chitin, CGC is soluble only in a few special solvents, which are toxic and corrosive. For this reason, and given the intended biomedical applications, biocompatible ionic liquids are needed to dissolve the CGC. The biocompatible ionic liquid used for the dissolution of CGC was choline acetate. Two different CGCs, KiOnutrime from KitoZyme and CGC biologically produced at the Faculdade de Ciencias e Tecnologia (FCT) - Universidade Nova de Lisboa, were characterized with a view to developing biocompatible wound-dressing materials. Similar results were obtained for the chitin:glucan ratio, which is 1:1.72 for CGC-FCT and 1:1.69 for the commercial CGC. For the metal element content, the water and inorganic salt content and the protein content, the two polymers showed some discrepancies, with the content in CGC-FCT always higher than in the commercial product. The different characterization results for CGC-FCT and the commercial CGC may be attributed to differences in the purification method and in the original strain, since CGC-FCT is derived from P. pastoris and the commercial CGC from A. niger. This work also investigated the effect of the biopolymer, the dissolution temperature and the non-solvent composition on the characteristics of the polymeric structures generated with the biocompatible ionic liquid. The films were prepared by casting a polymer mixture and immersing it in a non-solvent, followed by drying at ambient temperature. Three different non-solvents were tested in the phase inversion method: water, methanol and glycerol. The results indicate that the composition of the non-solvent in the coagulation bath strongly influences the generated polymeric structure. Water was found to be the best coagulant for producing a CGC polymeric film. The characterizations performed include viscosity and viscoelasticity measurements, as well as the sugar composition of the membrane and the total sugar released during the phase inversion method. The rheology tests showed that both polymer mixtures exhibit non-Newtonian shear-thinning behaviour; the viscosity and viscoelasticity tests reveal that the CGC-FCT mixture behaves as a viscous solution with entangled polymer chains, whereas the CGC-Commercial mixture shows true gel behaviour. The experimental results show that the CGC solution obtained with choline acetate can be used to develop both polymeric film structures and gels. The generated structures are thermally stable at 100 °C and hydrophilic. The produced films have a dense structure and mechanical stability against puncture up to 60 kPa.

Relevance:

20.00%

Publisher:

Abstract:

With the continuous growth of Internet-connected devices, the scalability of the protocols used for communication between them faces a new set of challenges. In robotics, these communication protocols are an essential element and must be able to carry out the desired communication. In the context of a multi-agent platform, the main types of Internet communication protocols used in robotics, mission planning and task allocation problems are reviewed. It is defined how to represent a message and how to handle its transport between devices in a distributed environment, covering all the layers of the messaging process. A review of the ROS platform is also presented, with the intent of integrating the existing communication protocols with ServRobot, a mobile autonomous robot, and the DVA, a distributed autonomous surveillance system. This is done with the objective of assigning missions to ServRobot in a security context.
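A minimal sketch of the kind of message passing reviewed above is given below, using ROS 1 (rospy); the topic name and payload are illustrative and are not the actual ServRobot/DVA interfaces.

```python
# Minimal sketch of publish/subscribe messaging in ROS 1 for dispatching a
# mission to a robot. Topic name and message content are illustrative only.

import rospy
from std_msgs.msg import String

rospy.init_node("mission_dispatcher")
pub = rospy.Publisher("/servrobot/mission", String, queue_size=10)

rate = rospy.Rate(1)  # publish once per second
while not rospy.is_shutdown():
    pub.publish(String(data="patrol:sector_7"))
    rate.sleep()
```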

Relevance:

20.00%

Publisher:

Abstract:

Economics is a social science which, therefore, focuses on people and on the decisions they make, be it in an individual context or in group situations. It studies human choices, in the face of needs to be fulfilled and a limited amount of resources with which to fulfill them. For a long time, there was a convergence between the normative and positive views of human behavior, in that the ideal and predicted decisions of agents in economic models were entangled in one single concept. That is, it was assumed that the best that could be done in each situation was exactly the choice that would prevail. Or, at least, that the facts that economics needed to explain could be understood in the light of models in which individual agents act as if they were able to make ideal decisions. However, in the last decades, the complexity of the environment in which economic decisions are made and the limits on the ability of agents to deal with it have been recognized and incorporated into models of decision making, in what came to be known as the bounded rationality paradigm. This was triggered by the inability of the unbounded rationality paradigm to explain observed phenomena and behavior. This thesis contributes to the literature in three different ways. Chapter 1 is a survey on bounded rationality, which gathers and organizes the contributions to the field since Simon (1955) first recognized the necessity to account for the limits on human rationality. The focus of the survey is on theoretical work rather than on the experimental literature, which presents evidence of actual behavior that differs from what classic rationality predicts. The general framework is as follows. Given a set of exogenous variables, the economic agent needs to choose an element from the choice set that is available to him, in order to optimize the expected value of an objective function (assuming his preferences are representable by such a function). If this problem is too complex for the agent to deal with, one or more of its elements is simplified. Each bounded rationality theory is categorized according to the most relevant element it simplifies. Chapter 2 proposes a novel theory of bounded rationality. Much in the same fashion as Conlisk (1980) and Gabaix (2014), we assume that thinking is costly, in the sense that agents have to pay a cost for performing mental operations. In our model, if they choose not to think, such cost is avoided, but they are left with a single alternative, labeled the default choice. We exemplify the idea with a very simple model of consumer choice and identify the concept of isofin curves, i.e., sets of default choices which generate the same utility net of thinking cost. Then, we apply the idea to a linear symmetric Cournot duopoly, in which the default choice can be interpreted as the most natural quantity to be produced in the market. We find that, as the thinking cost increases, the number of firms thinking in equilibrium decreases. More interestingly, for intermediate levels of thinking cost, there exists an equilibrium in which one of the firms chooses the default quantity and the other best responds to it, generating asymmetric choices in a symmetric model. Our model is able to explain well-known regularities identified in the Cournot experimental literature, such as the adoption of different strategies by players (Huck et al., 1999), the intertemporal rigidity of choices (Bosch-Domènech & Vriend, 2003) and the dispersion of quantities in the context of difficult decision making (Bosch-Domènech & Vriend, 2003). Chapter 3 applies a model of bounded rationality in a game-theoretic setting to the well-known turnout paradox: in large elections, pivotal probabilities vanish very quickly and no one should vote, in sharp contrast with the observed high levels of turnout. Inspired by the concept of rhizomatic thinking, introduced by Bravo-Furtado & Côrte-Real (2009a), we assume that each person is self-delusional in the sense that, when making a decision, she believes that a fraction of the people who support the same party decides alike, even if no communication is established between them. This kind of belief simplifies the decision of the agent, as it reduces the number of players she believes to be playing against; it is thus a bounded rationality approach. Studying a two-party first-past-the-post election with a continuum of self-delusional agents, we show that the turnout rate is positive in all the possible equilibria, and that it can be as high as 100%. The game displays multiple equilibria, at least one of which entails a victory of the bigger party. The smaller party may also win, provided its relative size is not too small; a larger share of self-delusional voters in the minority party decreases this threshold size. Our model is able to explain some empirical facts, such as the possibility that a close election leads to low turnout (Geys, 2006), a lower margin of victory when turnout is higher (Geys, 2006), and high turnout rates favoring the minority (Bernhagen & Marsh, 1997).
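A worked illustration of the Chapter 2 duopoly logic may help; the linear specification below is generic and not necessarily the one used in the thesis. With inverse demand \(p = a - b(q_1 + q_2)\), constant marginal cost \(c\), a default quantity \(q_d\) and a thinking cost \(k\), a firm that thinks best-responds with \(q_i = \frac{a - c - b\,q_j}{2b}\), and if both firms think the usual symmetric outcome \(q^* = \frac{a-c}{3b}\) obtains; a firm that does not think avoids \(k\) but is stuck at \(q_d\). Firm \(i\) therefore keeps the default whenever \(\pi_i\big(BR(q_j), q_j\big) - \pi_i(q_d, q_j) \le k\), where \(\pi_i(q_i, q_j) = \big(a - b(q_i + q_j) - c\big)\,q_i\). For small \(k\) both firms think (the standard Cournot equilibrium), for large \(k\) both stay at \(q_d\), and for intermediate \(k\) one firm thinking while the other keeps the default can be an equilibrium, which is the asymmetric outcome in a symmetric game described above.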

Relevance:

20.00%

Publisher:

Abstract:

A cross-sectional study of 120 subjects was performed with the purpose of evaluating stress hormones and emotional stress (anxiety) in outpatient and hospitalized subjects. The aims were to determine the degree of objective stress and to correlate this finding with subjective findings, estimated using Beck's Anxiety Inventory. METHOD: Three populations were investigated, namely outpatient clinical cases (Group I, n = 30), hospitalized clinical individuals (Group II, n = 30), and hospitalized surgical candidates (Group III, n = 30). Controls (Group IV, n = 30) were healthy volunteers who were health-care professionals and students. To avoid hormone interactions, only men were enrolled in all groups. All hospitalized subjects were tested on admission and before therapeutic interventions. Fasting epinephrine, norepinephrine, and cortisol were measured in the morning, and Beck's Anxiety Inventory was administered by a trained psychologist. RESULTS: The three patient groups displayed higher anxiety levels than the controls. Hormone concentrations did not present remarkable changes and did not correlate with subjective stress (anxiety). CONCLUSIONS: 1) Subjective disorders (as determined with Beck's Anxiety Inventory) were a common finding in both outpatient and hospitalized populations, without differences between the various groups; 2) Objective stress (as determined by elevated hormone levels) was more difficult to confirm, as findings rarely exceeded the reference range; 3) A correlation between the two variables could not be demonstrated; 4) Further studies are necessary to define stress quantification and interpretation in patient populations, especially in relation to nutritional diagnosis and dietetic prescription.

Relevance:

20.00%

Publisher:

Abstract:

Earthworks involve the levelling or shaping of a target area through the moving or processing of the ground surface. Most construction projects require earthworks, which are heavily dependent on mechanical equipment (e.g., excavators, trucks and compactors). Often, earthworks are the most costly and time-consuming component of infrastructure construction (e.g., roads, railways and airports), and current pressure for higher productivity and safety highlights the need to optimize them, which is a nontrivial task. Most previous attempts at tackling this problem focus on single-objective optimization of partial processes or aspects of earthworks, overlooking the advantages of multi-objective, global optimization. This work describes a novel optimization system based on an evolutionary multi-objective approach, capable of globally optimizing several objectives simultaneously and dynamically. The proposed system views an earthwork construction as a production line, where the goal is to optimize resources under two crucial criteria (cost and duration); it focuses the evolutionary search (non-dominated sorting genetic algorithm II) on compaction allocation and uses linear programming to distribute the remaining equipment (e.g., excavators). Several experiments were held using real-world data from a Portuguese construction site, showing that the proposed system is quite competitive when compared with current manual earthwork equipment allocation.
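The split between the evolutionary search (compaction allocation) and linear programming (distribution of the remaining equipment) can be illustrated with a toy sub-problem; the coefficients below are invented and scipy is assumed, so this is only a sketch of the idea, not the system's actual formulation.

```python
# Toy LP for distributing excavator-hours across two work fronts once the
# compaction allocation has been fixed by the evolutionary search.
# All coefficients are invented for illustration.

from scipy.optimize import linprog

# Decision variables: x[0], x[1] = excavator-hours assigned to fronts A and B.
cost_per_hour = [55.0, 60.0]          # objective: minimize total cost

# Each front must receive enough excavated material (m3) to feed the
# compaction rate chosen by the GA: 30*x0 >= 900 and 25*x1 >= 750.
demand_rows = [[-30.0, 0.0], [0.0, -25.0]]
demand_rhs = [-900.0, -750.0]

# The fleet offers at most 70 excavator-hours in total: x0 + x1 <= 70.
fleet_rows = [[1.0, 1.0]]
fleet_rhs = [70.0]

res = linprog(c=cost_per_hour,
              A_ub=demand_rows + fleet_rows,
              b_ub=demand_rhs + fleet_rhs,
              bounds=[(0, None), (0, None)],
              method="highs")
print(res.x, res.fun)   # expected: 30 h on each front, total cost 3450
```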

Relevance:

20.00%

Publisher:

Abstract:

Traffic Engineering (TE) approaches are increasingly important in network management, allowing an optimized configuration and resource allocation. In link-state routing, setting appropriate link weights is both an important and a challenging optimization task. A number of different approaches have been put forward towards this aim, including the successful use of Evolutionary Algorithms (EAs). In this context, this work evaluates three distinct EAs, one single-objective and two multi-objective, in two tasks related to weight-setting optimization towards optimal intra-domain routing, given the network topology and aggregated traffic demands and seeking to minimize network congestion. In both tasks, the optimization considers scenarios where the state of the system changes dynamically: the first considers changes in the traffic demand matrices and the second considers the possibility of link failures. The methods therefore need to optimize for both conditions simultaneously, the normal and the altered one, following a preventive TE approach towards robust configurations. Since this can be formulated as a bi-objective function, the use of multi-objective EAs such as SPEA2 and NSGA-II comes naturally, and these are compared with a single-objective EA. The results show a remarkable behaviour of NSGA-II in all proposed tasks, scaling well for harder instances and thus presenting itself as the most promising option for TE in these scenarios.
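Each candidate weight setting in such EAs is evaluated by routing the demands over the shortest paths induced by the weights and measuring the resulting congestion; a minimal single-path sketch of such an evaluator is given below (networkx assumed; real evaluators typically split traffic over equal-cost multi-paths and use a piecewise-linear congestion cost, so this is only the skeleton).

```python
# Minimal sketch: evaluate one weight setting by routing each demand on a
# single shortest path and reporting the maximum link utilization.

import networkx as nx

def max_utilization(graph, weights, demands):
    g = graph.copy()
    for (u, v), w in weights.items():
        g[u][v]["weight"] = w
    load = {edge: 0.0 for edge in g.edges()}
    for (src, dst), traffic in demands.items():
        path = nx.shortest_path(g, src, dst, weight="weight")
        for u, v in zip(path, path[1:]):
            key = (u, v) if (u, v) in load else (v, u)   # undirected edge key
            load[key] += traffic
    return max(load[e] / g.edges[e]["capacity"] for e in load)

# Toy topology, weight setting and demand matrix (all illustrative).
g = nx.Graph()
g.add_edge("a", "b", capacity=10.0)
g.add_edge("b", "c", capacity=10.0)
g.add_edge("a", "c", capacity=5.0)
weights = {("a", "b"): 1, ("b", "c"): 1, ("a", "c"): 3}
demands = {("a", "c"): 4.0}
print(max_utilization(g, weights, demands))   # routes a-b-c, utilization 0.4
```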

Relevance:

20.00%

Publisher:

Abstract:

This work presents a newly developed test setup for dynamic out-of-plane loading using underwater blast wave generators (WBWG) as the loading source. Underwater blasting operations have, during the last decades, been the subject of research and development in maritime blasting operations (including torpedo studies), aquarium tests for measuring the blasting energy of industrial explosives, and confined underwater blast wave generators. WBWGs allow a wide range of blast impulses and surface area distributions, while avoiding the generation of high-velocity fragments and reducing the atmospheric sound wave. A first objective of this work is to study the behaviour of masonry infill walls subjected to blast loading. Three different masonry walls are to be studied, namely unreinforced masonry infill walls and two different reinforcement solutions; these solutions have previously been studied for seismic action mitigation. Subsequently, the walls will be simulated using an explicit finite element code for validation and parametric studies. Finally, a tool to help designers make informed decisions on the use of infills under blast loading will be presented.