52 results for Module average case analysis


Relevance: 30.00%

Publisher:

Abstract:

This paper studies musical compositions from the point of view of three mathematical tools: entropy, the pseudo phase plane (PPP), and multidimensional scaling (MDS). The experiments analyze ten sets of different musical styles. First, for each musical composition, the PPP is produced using the time-series lags captured by the average mutual information. Second, to unravel hidden relationships between the musical styles, the MDS technique is used. The MDS is calculated based on two alternative metrics obtained from the PPP, namely the average mutual information and the fractal dimension. The results reveal significant differences among the musical styles, demonstrating the feasibility of the proposed strategy and motivating further developments towards a dynamical analysis of musical sounds.
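The last step of the pipeline described above, embedding pairwise distances between styles into a low-dimensional map, can be sketched with classical (Torgerson) MDS. The distance matrix below is a toy placeholder, not the paper's mutual-information or fractal-dimension metrics:

```python
import numpy as np

def classical_mds(D, k=2):
    """Embed an n-by-n distance matrix D into k dimensions (Torgerson MDS)."""
    n = D.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n      # centering matrix
    B = -0.5 * J @ (D ** 2) @ J              # double-centred Gram matrix
    w, V = np.linalg.eigh(B)                 # eigenvalues in ascending order
    idx = np.argsort(w)[::-1][:k]            # keep the k largest eigenvalues
    return V[:, idx] * np.sqrt(np.maximum(w[idx], 0.0))

# Toy distance matrix between three "styles" lying on a line
D = np.array([[0.0, 1.0, 3.0],
              [1.0, 0.0, 2.0],
              [3.0, 2.0, 0.0]])
X = classical_mds(D, k=1)
```

For points that genuinely lie in a low-dimensional space, the embedding reproduces the input distances; with real inter-style metrics the map is only an approximation whose stress has to be checked.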

Relevance: 30.00%

Publisher:

Abstract:

A novel agent-based approach to meta-heuristics self-configuration is proposed in this work. Meta-heuristics are examples of algorithms whose parameters need to be set as efficiently as possible in order to ensure their performance. This paper presents a learning module for the self-parameterization of meta-heuristics (MHs) in a Multi-Agent System (MAS) for the resolution of scheduling problems. The learning is based on Case-based Reasoning (CBR), and two different integration approaches are proposed. A computational study is carried out to compare the two CBR integration perspectives. Finally, conclusions are drawn and future work is outlined.

Relevance: 30.00%

Publisher:

Abstract:

In this paper we present a Self-Optimizing module, inspired by Autonomic Computing, which provides a scheduling system with the ability to automatically select the meta-heuristic to use in the optimization process, as well as its parameterization. Case-based Reasoning was used so that the system is able to learn from acquired experience in the resolution of similar problems. From the obtained results we draw conclusions about the benefits of its use.
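As a minimal illustration of the retrieval step of Case-based Reasoning, a case base can be searched with nearest-neighbour matching over problem features, and the closest case's meta-heuristic and parameters reused as the suggested configuration. The case structure, features and configurations below are hypothetical, not the system described:

```python
# Hypothetical case base: problem features (size, tightness) -> configuration
case_base = [
    {"features": (50, 0.2),  "mh": "tabu_search",         "params": {"tenure": 7}},
    {"features": (200, 0.8), "mh": "genetic_algorithm",   "params": {"pop": 100}},
    {"features": (120, 0.5), "mh": "simulated_annealing", "params": {"t0": 500}},
]

def retrieve(features):
    """Return the stored case closest (squared Euclidean) to the new problem."""
    def dist2(case):
        return sum((a - b) ** 2 for a, b in zip(case["features"], features))
    return min(case_base, key=dist2)

suggestion = retrieve((190, 0.75))   # a new scheduling instance
```

A full CBR cycle would follow retrieval with reuse, revision (running the suggested MH and measuring its result) and retention of the new case, which is where the learning happens.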

Relevance: 30.00%

Publisher:

Abstract:

The central place hospitals occupy in health systems makes them prime targets of healthcare reforms. This study aims to identify current trends in organizational structure change in public hospitals and to explore the role of accounting in attempts to develop control over professionals within public hospitals. The analytical framework we propose crosses the concept of “new professionalism” (Evetts, 2010) with the concept of “accounting logic” for controlling professionals (Broadbent and Laughlin, 1995). Seeking a more holistic overview, we developed a qualitative and exploratory study. The data were collected through semi-structured interviews with doctors of a clinical hospital unit. Content analysis suggests that, although we cannot say that accounting information is completely and generally integrated into clinical decisions, important progress has been made in that area. Despite the extensive literature on this topic, there are, to the authors' knowledge, no empirical studies that allow us to understand how doctors, in their real day-to-day work, integrate these trends of change into their clinical decisions.

Relevance: 30.00%

Publisher:

Abstract:

Purpose: The aim of this paper is to promote qualitative methodology within the scientific community of management. The specific objective is to propose an empirical research process based on the case study method, ensuring rigour in the empirical research process so that future research may follow a procedure similar to the one proposed. Design/methodology/approach: Following a qualitative methodological approach, we propose a research process that develops in four phases, each with several stages. This study analyses the preparatory and fieldwork phases and their stages. Findings: The paper shows the influence that case studies have on the qualitative empirical research process in management. Originality/value: The case study method assumes an important role within qualitative research by allowing the study and analysis of certain types of phenomena that occur inside organisations, and in respect of which quantitative studies cannot provide an answer.

Relevance: 30.00%

Publisher:

Abstract:

Introduction: Stroke (cerebrovascular accident) is one of the leading causes of mortality and morbidity in Portugal. This lesion of the Central Nervous System (CNS) triggers changes in postural control (PC) that interfere with the individuals' functional recovery. Objective: To describe the changes in trunk postural control, through analysis of the alignment of the trunk body segments in the selected group of individuals, following the application of an intervention programme based on the principles of the Bobath Concept. Methodology: A case-series study of six individuals with neuromotor changes resulting from stroke, assessed before and after the intervention plan based on the principles of the Bobath Concept, using observational records, the International Classification of Functioning, Disability and Health (ICF), the Postural Assessment Software (SAPO) and the Emed Pressure Platform (PPE), model AT. The collected data were processed as mean values using Excel. Results: The SAPO analysis in the orthostatic position shows changes in both the posterior and lateral views, indicating greater symmetry between hemitrunks, and changes in the vertical alignments, indicating a closer approximation to 180º. On the pressure platform, the values of plantar area, mean plantar pressure and centre of pressure tend overall towards greater similarity and symmetry. Regarding the ICF, a reduction in participation restriction and activity limitation was also observed. Conclusion: The intervention, based on a clinical reasoning process, appears to provide the stimuli needed for the functional reorganisation of the injured CNS, producing improvements in the alignment of the body segments and thereby improving muscle activity.

Relevance: 30.00%

Publisher:

Abstract:

Introduction: Respiratory involvement is the main cause of morbidity and mortality in Cystic Fibrosis (CF). Paediatric data on physical activity (PA), peripheral oxyhaemoglobin saturation (SpO2) and peak cough flow (PCF) are scarce and not standardised. Objectives: To assess lung function (LF), PA, SpO2 and PCF in children and adolescents with CF, at baseline and during exacerbations, and, in the stable phase, to assess the correlations between the variables. Methods: A prospective observational study was conducted, analysing spirometry, pedometry, nocturnal oximetry and PCF under baseline conditions. During exacerbations, the same parameters were reassessed at 24-48 hours and at 7, 15 and 30 days, except PA at 7 days. Results: Eight patients were assessed, of whom two showed mild and one moderate LF impairment. SpO2 was 96.2% [95.6; 96.6] and the mean number of steps/day (MNS) was 6369 [4431; 10588]. All showed PCF values below the 5th percentile for sex and age (265 L/min [210; 290]). Although not statistically significant, moderate correlations were found between FEV1 and nocturnal SpO2 (rs = 0.61; p = 0.11), between PCF and age (rs = 0.69; p = 0.06), and between PCF and forced vital capacity (FVC) (rs = 0.54; p = 0.17). No correlation was found between FEV1 and age, MNS or PCF, nor between MNS and age. In the single exacerbation case, all variables except respiratory rate decreased at 24-48 h; after one month, most variables approached or matched baseline values. Conclusion: The results suggest a tendency for better FEV1 values to correspond to better nocturnal SpO2, and for PCF to increase with age and FVC. The impact of exacerbations could not be assessed, since only one case occurred.

Relevance: 30.00%

Publisher:

Abstract:

Environmental problems such as acid rain, eutrophication and global warming are discussed daily, yet the phosphorus problem is rarely among them. Phosphorus has long been a real problem and deserves more discussion. In this thesis, a global material flow analysis of phosphorus was carried out, based on data from 2004. Phosphate rock production in that year was 18.9 million tonnes; almost all of this amount was applied to soil as fertilizer, but plants can take up, on average, only 20% of the fertilizer input, the remainder being lost to the soil phosphorus pool. In the soil there is an equilibrium between the phosphorus available for plant uptake and the phosphorus associated with other compounds; this equilibrium depends on the type of soil and is related to soil pH. A reserve inventory was also carried out: the reserve, i.e. the amount that is economically extractable, is 15,000 million tonnes, and the reserve base is estimated at 47,000 million tonnes. The major reserves are found in Morocco and Western Sahara, the United States, China and South Africa. The reserve estimated in 2009 was 15,000 million tonnes of phosphate rock, or 1,963 million tonnes of P. If around 22 Mt of phosphate rock are mined every year (phosphorus production in 2008, USGS 2009), and consumption increases each year with food demand, the phosphate rock reserves will be exhausted in about 90 years, or perhaps even less. For the value/impact assessment, a qualitative analysis was performed: if in the future no phosphate rock remains for producing fertilizers, a drop in crop yields is expected, varying with the type of soil, although the impact on human food and animal production is not expected to be a relevant problem.
Phosphorus can be recovered from different waste streams, such as ploughing crop residues back into the soil, food processing plants and food retailers, human and animal excreta, meat and bone meal, manure fibre, sewage sludge and wastewater. Some of these examples are developed in the paper.
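The ~90-year horizon quoted above follows from the reserve and extraction figures only if demand keeps growing: at a constant 22 Mt/yr, 15,000 Mt would last roughly 680 years. A closed-form depletion estimate under an assumed constant growth rate makes this explicit; the 4%/yr growth figure below is an illustrative assumption, not a number from the thesis:

```python
import math

def depletion_years(reserve_mt, rate_mt_per_yr, growth=0.0):
    """Years until a reserve is exhausted when extraction grows geometrically.

    Solves R = c * ((1+g)^t - 1) / g for t; g = 0 reduces to R / c.
    """
    if growth == 0.0:
        return reserve_mt / rate_mt_per_yr
    return math.log(1.0 + growth * reserve_mt / rate_mt_per_yr) / math.log(1.0 + growth)

flat = depletion_years(15000, 22)         # constant extraction: ~682 years
grow = depletion_years(15000, 22, 0.04)   # assumed 4 %/yr growth: ~85 years
```

Growth rates in the 3.5-4%/yr range bring the horizon down to the order of 90 years, which is consistent with the estimate stated in the abstract.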

Relevance: 30.00%

Publisher:

Abstract:

This work introduces the concepts of fuzzy logic applied to system control, in this case in the field of autonomous robotics, framing the use of fuzzy controllers in that domain. An AGV (Autonomous Guided Vehicle) was developed from scratch in order to implement the fuzzy controller and test its performance. Since future improvements and/or evolutions are intended, a modular system was chosen, in which each module is responsible for a specific task. In this work there are three modules, responsible for speed control, sensor data acquisition and, finally, the system's fuzzy controller. After implementing the fuzzy controller, tests were carried out to validate the system, during which the data coming from the sensors were collected and recorded during the robot's normal operation. These data allowed a better analysis of the robot's performance. Fuzzy logic is shown to yield smoother transitions between decisions, and increasing the number of rules makes the system smoother still. Fuzzy logic thus proves to be a useful and practical tool for control applications. Its disadvantage lies in the amount of data involved in the implementation, such as the universes of discourse, the membership functions and the rules. Increasing the number of control rules also increases the number of membership functions considered for each linguistic variable; this increases the memory required and the implementation complexity, given the amount of data that has to be processed.
The greatest difficulty in designing a fuzzy controller lies in defining the linguistic variables through their universes of discourse and membership functions, since an initial definition may not suit the control context, making it necessary to run tests and, consequently, modify the membership functions to improve system performance. All these aspects are addressed in the development of the AGV, and the respective results are presented and analysed.
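A minimal sketch of the kind of fuzzy decision involved, with a hypothetical universe of discourse and rule base (the AGV's actual membership functions are defined in the work itself): two linguistic terms for obstacle distance and a weighted-average, Sugeno-style defuzzification over singleton consequents.

```python
def steering_angle(dist_cm):
    """Fuzzy steering command from obstacle distance (hypothetical 0-60 cm universe)."""
    # Fuzzification: degree of membership in "near" and "far"
    near = max(0.0, min(1.0, (60.0 - dist_cm) / 60.0))
    far = 1.0 - near
    # Rule base: IF near THEN turn hard (30 deg); IF far THEN go straight (0 deg)
    # Defuzzification: weighted average of the singleton consequents
    return (near * 30.0 + far * 0.0) / (near + far)
```

Adding intermediate terms ("medium") and rules smooths the transition further, at the cost of more membership functions per linguistic variable — exactly the memory and complexity trade-off noted above.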

Relevance: 30.00%

Publisher:

Abstract:

The constant evolution of the Internet and its increasing use, now entwined with private and public activities to the point of strongly affecting their survival, has given rise to an emerging technology. Through cloud computing it is possible to abstract users from the layers below the business, focusing only on what matters most to manage, with the advantage of being able to grow (or shrink) resources as needed. The cloud paradigm arises from the need to optimise IT resources within an emergent and rapidly expanding technology. In this regard, after a study of the most common cloud platforms and of the current deployment of the technologies in use at the Institute of Biomedical Sciences of Abel Salazar and the Faculty of Pharmacy of Oporto University, an evolution is proposed in order to address certain requirements in the context of cloud computing.

Relevance: 30.00%

Publisher:

Abstract:

Aiming at simple and accurate readings of citric acid (CA) in complex samples, citrate (CIT) selective electrodes with a tubular configuration and polymeric membranes containing a quaternary ammonium ion exchanger were constructed. Several selective membranes were prepared for this purpose, having distinct mediator solvents (with quite different polarities) and, in some cases, p-tert-octylphenol (TOP) as additive. The latter was used with a view to a possible increase in selectivity. The general working characteristics of all prepared electrodes were evaluated in a low-dispersion flow injection analysis (FIA) manifold by injecting 500 µl of citrate standard solutions into an ionic strength (IS) adjuster carrier (10−2 mol l−1) flowing at 3 ml min−1. A good potentiometric response, with an average slope of 61.9 mV per decade and a repeatability of ±0.8%, resulted from selective membranes comprising the additive and bis(2-ethylhexyl)sebacate (bEHS) as mediator solvent. The same membranes also led to the best selectivity characteristics, assessed by the separate solutions method for several chemical species, such as chloride, nitrate, ascorbate, glucose, fructose and sucrose. Pharmaceutical preparations, soft drinks and beers were analyzed under conditions that enabled simultaneous pH and ionic strength adjustment (pH = 3.2; ionic strength = 10−2 mol l−1), and the attained results agreed well with those of the reference method (relative error < 4%). The above experimental conditions promoted a significant increase in the sensitivity of the potentiometric response, with a supra-Nernstian slope of 80.2 mV per decade, and allowed the analysis of about 90 samples per hour, with a relative standard deviation < 1.0%.

Relevance: 30.00%

Publisher:

Abstract:

Bread is consumed worldwide, thus contributing to the regular ingestion of certain inorganic species such as chloride, which, in association with sodium intake, affects blood pressure and may increase the incidence of stomach ulcers. Its routine control should thus be established by means of quick and low-cost procedures. This work reports a double-channel flow injection analysis (FIA) system with a new chloride sensor for the analysis of bread. All solutions are prepared in water, and the necessary ionic strength adjustments are made on-line. The body of the indicating electrode is made from a silver needle of 0.8 mm i.d. with an external layer of silver chloride. These devices were constructed with different lengths; electrodes of 1.0 to 3.0 cm presented better analytical performance. The calibration curves under optimum conditions displayed Nernstian behaviour, with average slopes of 56 mV decade-1 and sampling rates of 60 samples h-1. The method was applied to analyze several kinds of bread, namely pão de trigo, pão integral, pão de centeio, pão de mistura, broa de milho, pão sem sal, pão meio sal, pão-de-leite, and pão de água. The accuracy and precision of the potentiometric method were ascertained by comparison with a spectrophotometric method of continuous segmented flow. Both methods were validated against ion-chromatography procedures.
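Slope figures such as the ~56 mV per decade quoted above come from fitting the electrode's EMF against the decimal logarithm of concentration. A least-squares sketch with synthetic calibration data (the numbers are illustrative, not the paper's):

```python
import math

def calibration_slope(concs_mol_l, emfs_mv):
    """Least-squares slope of EMF (mV) versus log10(concentration)."""
    xs = [math.log10(c) for c in concs_mol_l]
    n = len(xs)
    mx, my = sum(xs) / n, sum(emfs_mv) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, emfs_mv))
    sxx = sum((x - mx) ** 2 for x in xs)
    return sxy / sxx

# Synthetic chloride calibration: EMF falls 56 mV per tenfold increase
concs = [1e-4, 1e-3, 1e-2, 1e-1]
emfs = [224.0, 168.0, 112.0, 56.0]
slope = calibration_slope(concs, emfs)
```

For an anion-selective electrode the slope is negative; "Nernstian behaviour" means its magnitude is close to the theoretical 59.2 mV per decade for a monovalent ion at 25 °C.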

Relevance: 30.00%

Publisher:

Abstract:

Electricity markets are complex environments, involving a large number of different entities with specific characteristics and objectives, making their decisions and interacting in a dynamic scene. Game theory has been widely used to support decisions in competitive environments; its application to electricity markets can therefore prove to be a high-potential tool. This paper proposes a new scenario analysis algorithm, which includes the application of game theory, to evaluate and preview different scenarios and provide players with the ability to react strategically, so as to exhibit the behavior that best fits their objectives. The model includes forecasts of competitor players' actions, used to build models of their behavior and thus define the most probable scenarios. Once the scenarios are defined, game theory is applied to support the choice of the action to be performed. Our use of game theory is intended to support one specific agent, not to achieve equilibrium in the market. MASCEM (Multi-Agent System for Competitive Electricity Markets) is a multi-agent electricity market simulator that models market players and simulates their operation in the market. The scenario analysis algorithm has been tested within MASCEM, and our experimental findings with a case study based on real data from the Iberian Electricity Market are presented and discussed.
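Once the most probable scenarios and their likelihoods are defined, choosing the supported player's action reduces to maximising expected payoff over those scenarios. A minimal sketch with a hypothetical bid set, scenario forecast and payoff table (not MASCEM's actual model):

```python
# Hypothetical scenarios: (competitor-behaviour label, forecast probability)
scenarios = [("aggressive", 0.5), ("moderate", 0.3), ("passive", 0.2)]

# Hypothetical payoff table: profit of each bid strategy under each scenario
payoff = {
    ("bid_low", "aggressive"): 4, ("bid_low", "moderate"): 5, ("bid_low", "passive"): 5,
    ("bid_mid", "aggressive"): 2, ("bid_mid", "moderate"): 6, ("bid_mid", "passive"): 8,
    ("bid_high", "aggressive"): 0, ("bid_high", "moderate"): 3, ("bid_high", "passive"): 12,
}

def best_action(actions):
    """Action with the highest probability-weighted payoff across scenarios."""
    def expected(a):
        return sum(p * payoff[(a, s)] for s, p in scenarios)
    return max(actions, key=expected)

choice = best_action(["bid_low", "bid_mid", "bid_high"])
```

Here the cautious bid wins despite a lower best case, because the forecast weights the aggressive competitor scenario heavily; this is the one-sided, forecast-driven use of game theory described above, not an equilibrium computation.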

Relevance: 30.00%

Publisher:

Abstract:

Glass fibre-reinforced plastics (GFRP), nowadays commonly used in the construction, transportation and automobile sectors, have been considered inherently difficult to recycle due to both the cross-linked nature of thermoset resins, which cannot be remoulded, and the complex composition of the composite itself, which includes glass fibres, polymer matrix and different types of inorganic fillers. Hence, to date, most of the thermoset based GFRP waste is being incinerated or landfilled leading to negative environmental impacts and additional costs to producers and suppliers. With an increasing awareness of environmental matters and the subsequent desire to save resources, recycling would convert an expensive waste disposal into a profitable reusable material. In this study, the effect of the incorporation of mechanically recycled GFRP pultrusion wastes on flexural and compressive behaviour of polyester polymer mortars (PM) was assessed. For this purpose, different contents of GFRP recyclates (0%, 4%, 8% and 12%, w/w), with distinct size grades (coarse fibrous mixture and fine powdered mixture), were incorporated into polyester PM as sand aggregates and filler replacements. The effect of the incorporation of a silane coupling agent was also assessed. Experimental results revealed that GFRP waste filled polymer mortars show improved mechanical behaviour over unmodified polyester based mortars, thus indicating the feasibility of GFRP waste reuse as raw material in concrete-polymer composites.

Relevance: 30.00%

Publisher:

Abstract:

Modern real-time systems, with a more flexible and adaptive nature, demand approaches for timeliness evaluation based on probabilistic measures of meeting deadlines. In this context, simulation can emerge as an adequate solution to understand and analyze the timing behaviour of actual systems. However, care must be taken with the obtained outputs, at the risk of producing results that lack credibility. It is particularly important to consider that we are more interested in values from the tail of a probability distribution (near worst-case probabilities) than in deriving confidence on mean values. We approach this subject by considering the random nature of simulation output data. We start by discussing well-known approaches for estimating distributions from simulation output, and the confidence that can be attached to their mean values. This is the basis for a discussion on the applicability of such approaches to derive confidence on the tail of distributions, where the worst case is expected to be.
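A sketch of the distinction raised above: estimating a deadline-miss probability from simulation output and attaching a Wilson score interval to it. The response times are synthetic, and a real study would also have to handle correlated simulation output, which this plain binomial interval ignores:

```python
import math

def miss_probability_ci(samples, deadline, z=1.96):
    """Empirical P(response time > deadline) with a ~95% Wilson score interval."""
    n = len(samples)
    k = sum(1 for s in samples if s > deadline)
    p = k / n
    denom = 1.0 + z * z / n
    centre = (p + z * z / (2 * n)) / denom
    half = z * math.sqrt(p * (1 - p) / n + z * z / (4 * n * n)) / denom
    return p, centre - half, centre + half

# Synthetic response times: 100 runs spread over 50..149 ms
times = [50 + i for i in range(100)]
p, lo, hi = miss_probability_ci(times, 139)   # runs above 139 ms miss the deadline
```

The interval is wide precisely because few samples land in the tail; tightening confidence on near worst-case probabilities requires far more runs than confidence on the mean, which is the credibility problem the discussion above is about.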