991 results for distributed feedback laser diode (DFB LD)
Abstract:
XXXIII Simpósio Brasileiro de Redes de Computadores e Sistemas Distribuídos (SBRC 2015), May 15 to 19, 2015, III Workshop de Comunicação em Sistemas Embarcados Críticos. Vitória, Brazil.
Abstract:
The treatment of vascular lesions of the tongue is a very challenging procedure, since preservation of the lingual tissue is of critical importance. Numerous treatment options have been described in the literature, but the Nd:YAG laser appears to be one of the safest therapeutic options. We describe a successful treatment of vascular lesions of the tongue, with an excellent clinical result after only one treatment session with the Nd:YAG laser and conservation of the lingual tissue and its functionality.
Abstract:
Distributed real-time systems such as automotive applications are becoming larger and more complex, thus requiring more powerful hardware and software architectures. Furthermore, these distributed applications commonly have stringent real-time constraints, which implies that such applications would gain in flexibility if they were parallelized and distributed over the system. In this paper, we consider the problem of allocating fixed-priority fork-join Parallel/Distributed real-time tasks onto distributed multi-core nodes connected through a Flexible Time Triggered Switched Ethernet network. We analyze the system requirements and present a set of formulations based on a constraint programming approach. Constraint programming allows us to express the relations between variables in the form of constraints. Our approach is guaranteed to find a feasible solution, if one exists, in contrast to other approaches based on heuristics. Furthermore, approaches based on constraint programming have been shown to obtain solutions for this type of formulation in reasonable time.
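The completeness property claimed above can be made concrete with a toy exhaustive search over the assignment space: like a constraint solver, it returns a feasible task-to-core mapping whenever one exists and None otherwise. The task utilisations, core count and per-core bound are illustrative assumptions; the paper's real formulation also encodes priorities and network constraints.

```python
from itertools import product

def allocate(tasks, cores, fits):
    """Exhaustive search over task-to-core assignments; like a constraint
    solver it is complete, returning a feasible mapping iff one exists."""
    for assignment in product(range(cores), repeat=len(tasks)):
        loads = [0.0] * cores
        for util, core in zip(tasks, assignment):
            loads[core] += util
        if all(fits(load) for load in loads):
            return assignment
    return None

# Hypothetical task utilisations and a per-core utilisation bound of 1.0
tasks = [0.6, 0.5, 0.4, 0.3]
result = allocate(tasks, cores=2, fits=lambda load: load <= 1.0)
```

A real constraint-programming model would hand these same constraints to a solver rather than enumerating, but the guarantee is the same: a feasible solution is found if one exists.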
Abstract:
Currently, due to the widespread use of computers and the internet, students are trading libraries for the World Wide Web and laboratories for simulation programs. In most courses, simulators are made available to students and can be used to verify theoretical results or to test hardware or a product under development. Although this is an attractive solution (low cost, and an easy and fast way to carry out some coursework), it has major disadvantages. As everything is currently being done with or in a computer, students are losing the "feel" for the real values of physical magnitudes. In engineering studies, for instance, and mainly in the first years, students need to learn electronics, algorithms, mathematics and physics. All of these areas can use numerical analysis software, simulation software or spreadsheets, and in the majority of cases the data used are either simulated or random numbers, but real data could be used instead. For example, if a course uses numerical analysis software and needs a dataset, the students can learn to manipulate arrays. Also, when using spreadsheets to build graphics, instead of using a random table, students could use a real dataset based, for instance, on the room temperature and its variation across the day. In this work we present a framework with a simple interface, allowing it to be used in different courses where computers are central to the teaching/learning process, in order to give students a more realistic feeling by using real data. The framework is based on a set of low-cost sensors for different physical magnitudes (e.g. temperature, light, wind speed), which are either connected to a central server that the students access over an Ethernet protocol, or connected directly to the student's computer/laptop. These sensors use the communication ports available, such as serial ports, parallel ports, Ethernet or Universal Serial Bus (USB).
Since a central server is used, students are encouraged to use the sensor values in their different courses and consequently in different types of software, such as numerical analysis tools, spreadsheets, or simply inside any programming language when a dataset is needed. To this end, small pieces of hardware were developed, each containing at least one sensor and using a different type of computer communication. As long as the sensors are attached to a server connected to the internet, these tools can also be shared between different schools. This allows sensors that are not available at a given school to be used by getting the values from other places that share them. Another remark is that students in the more advanced years, with (theoretically) more know-how, can use the courses that have some affinity with electronic development to build new sensor modules and expand the framework further. The final solution is very attractive: low cost and simple to develop, allowing flexible use of resources by sharing the same materials across several courses and bringing real-world data into the students' computer work.
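As a rough sketch of the central-server idea, the following simulates a sensor service on localhost and a student-side fetch. The one-line text protocol, the port handling and the fixed 21.5-degree reading are illustrative assumptions, not the paper's design; real hardware would sit behind serial or USB ports on the server.

```python
import socket
import threading

def start_sensor_server(read_sensor):
    """Serve one sensor reading per connection; a stand-in for the
    central server described in the text (the line protocol is an
    assumption). Returns the OS-assigned port."""
    srv = socket.socket()
    srv.bind(("127.0.0.1", 0))          # let the OS pick a free port
    srv.listen(1)
    def loop():
        while True:
            conn, _ = srv.accept()
            conn.sendall(f"{read_sensor():.2f}\n".encode())
            conn.close()
    threading.Thread(target=loop, daemon=True).start()
    return srv.getsockname()[1]

def fetch_reading(port):
    """What a student's program would do to pull a real value into its
    own dataset, whatever the course software is."""
    with socket.create_connection(("127.0.0.1", port)) as conn:
        return float(conn.makefile().readline())

# Simulated temperature sensor; real hardware would sit on serial/USB
port = start_sensor_server(lambda: 21.50)
value = fetch_reading(port)
```

Because the interface is a plain socket, the same reading can feed a spreadsheet import, a numerical analysis tool, or any programming language.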
Abstract:
In this paper, we propose the Distributed using Optimal Priority Assignment (DOPA) heuristic, which finds a feasible partitioning and priority assignment for distributed applications based on the linear transactional model. DOPA partitions the tasks and messages in the distributed system and uses the Optimal Priority Assignment (OPA) algorithm, known as Audsley's algorithm, to find the priorities for that partition. The experimental results show how the use of the OPA algorithm increases on average the number of schedulable tasks and messages in a distributed system when compared to the Deadline Monotonic (DM) assignment usually favoured in other works. Afterwards, we extend these results to the assignment of Parallel/Distributed applications and present a second heuristic named Parallel-DOPA (P-DOPA). In that case, we show how the partitioning process can be simplified by using the Distributed Stretch Transformation (DST), a parallel transaction transformation algorithm introduced in [1].
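Audsley's OPA algorithm assigns priorities from the lowest level upwards: at each level it looks for any still-unassigned task that would meet its deadline if all remaining tasks ran at higher priority. A minimal sketch, paired with a standard response-time schedulability test and a hypothetical task set of (C, T, D) tuples:

```python
import math

def rta_schedulable(task, higher):
    """Standard response-time analysis for a fixed-priority task
    (C, T, D) preempted by every task in `higher`."""
    C, T, D = task
    R = C
    while True:
        R_next = C + sum(math.ceil(R / Th) * Ch for (Ch, Th, Dh) in higher)
        if R_next == R:
            return R <= D
        if R_next > D:
            return False
        R = R_next

def audsley_opa(tasks, schedulable):
    """Audsley's OPA: assign priorities lowest-first; at each level pick
    any unassigned task that is schedulable with all the remaining tasks
    at higher priority. Returns tasks ordered lowest-priority-first,
    or None when no fixed-priority assignment is feasible."""
    unassigned = list(tasks)
    order = []
    while unassigned:
        for t in unassigned:
            if schedulable(t, [u for u in unassigned if u != t]):
                order.append(t)       # t gets the current (lowest) level
                unassigned.remove(t)
                break
        else:
            return None               # no task fits this level: infeasible
    return order

taskset = [(1, 4, 4), (1, 5, 5), (2, 10, 10)]  # hypothetical (C, T, D)
priorities = audsley_opa(taskset, rta_schedulable)
```

OPA is optimal in the sense that it finds a feasible priority ordering whenever one exists for the given schedulability test, which is why it can outperform Deadline Monotonic on task sets DM cannot schedule.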
Abstract:
XXXIII Simpósio Brasileiro de Redes de Computadores e Sistemas Distribuídos (SBRC 2015), III Workshop de Comunicação em Sistemas Embarcados Críticos. Vitória, Brazil.
Abstract:
Fractional Calculus (FC) goes back to the beginning of the theory of differential calculus. Nevertheless, the application of FC emerged only in the last two decades, due to progress in the area of chaos, which revealed subtle relationships with FC concepts. In the field of dynamical systems theory, some work has been carried out, but the proposed models and algorithms are still at a preliminary stage of establishment. With these ideas in mind, the paper discusses an FC perspective in the study of the dynamics and control of some distributed parameter systems.
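One concrete entry point to FC is the Grünwald-Letnikov definition, which generalises the backward difference to non-integer order: D^a f(t) = lim_{h→0} h^(-a) Σ_k (-1)^k C(a,k) f(t-kh). A small numerical sketch (the step size and test function are illustrative, not taken from the paper):

```python
def gl_derivative(f, t, alpha, h=1e-3):
    """Grünwald-Letnikov approximation of the order-alpha derivative
    of f at t, using the sample history on [0, t]."""
    n = int(round(t / h))
    w, total = 1.0, 0.0
    for k in range(n + 1):
        total += w * f(t - k * h)
        w *= (k - alpha) / (k + 1)   # recurrence for (-1)^k * C(alpha, k)
    return total / h ** alpha

# alpha = 1 recovers the classical derivative: D^1 t = 1
d1 = gl_derivative(lambda t: t, t=1.0, alpha=1.0)
# the half derivative of t is 2*sqrt(t/pi), about 1.1284 at t = 1
d_half = gl_derivative(lambda t: t, t=1.0, alpha=0.5)
```

The dependence of the sum on the entire history [0, t] is what gives fractional operators their memory property, the feature exploited in FC models of dynamics and control.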
Abstract:
It is imperative to accept that failures can and will occur, even in meticulously designed distributed systems, and to design proper measures to counter those failures. Passive replication minimises resource consumption by only activating redundant replicas in case of failure, as providing and applying state updates is typically less resource-demanding than requesting execution. However, most existing solutions for passive fault tolerance are designed and configured at design time, explicitly and statically identifying the most critical components and their number of replicas, and lack the flexibility needed to handle the runtime dynamics of distributed component-based embedded systems. This paper proposes a cost-effective adaptive fault tolerance solution with significantly lower overhead compared to a strict active redundancy-based approach, achieving high error coverage with a minimum amount of redundancy. The activation of passive replicas is coordinated through a feedback-based coordination model that reduces the complexity of the interactions needed among components until a new collective global service solution is determined, improving the overall maintainability and robustness of the system.
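The passive (primary-backup) mechanism can be sketched minimally: the backup replica only applies cheap state updates and is promoted when heartbeats from the primary stop. The timeout value and the promotion rule below are illustrative assumptions, not the paper's coordination model.

```python
import time

class PassiveReplica:
    """Backup replica in a primary-backup scheme: it only applies state
    updates (cheap) and is promoted to primary when heartbeats stop.
    Timeout and promotion rule are illustrative assumptions."""
    def __init__(self, timeout):
        self.timeout = timeout
        self.last_heartbeat = time.monotonic()
        self.state = None
        self.role = "backup"

    def on_state_update(self, state):
        self.state = state               # apply the update, don't re-execute

    def on_heartbeat(self):
        self.last_heartbeat = time.monotonic()

    def check(self):
        if self.role == "backup" and \
                time.monotonic() - self.last_heartbeat > self.timeout:
            self.role = "primary"        # activate the passive replica
        return self.role

replica = PassiveReplica(timeout=0.05)
replica.on_state_update({"jobs": 3})
role_before = replica.check()            # primary still considered alive
time.sleep(0.06)                         # heartbeats stop: failover
role_after = replica.check()
```

Because the backup already holds the last applied state, failover needs no re-execution, which is the resource saving passive replication trades against a longer activation delay.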
Abstract:
Smart Grids (SGs) have emerged as the new paradigm for power system operation and management, designed to include large amounts of distributed energy resources. This new paradigm requires new Energy Resource Management (ERM) methodologies that consider different operation strategies and the existence of new management players, such as several types of aggregators. This paper proposes a methodology, based on a game theory approach, to facilitate coalitions between distributed generation units, originating Virtual Power Players (VPPs). The proposed approach consists of the analysis of the classifications attributed by each VPP to the distributed generation units, as well as the analysis of the contracts previously established by each player. The proposed classification model is based on fourteen parameters, including technical, economic and behavioural ones. Depending on the VPP's strategies, size and goals, each parameter has a different importance. VPPs can also manage other types of energy resources, such as storage units, electric vehicles, demand response programs, or even parts of the MV and LV distribution networks. A case study with twelve VPPs with different characteristics and one hundred and fifty real distributed generation units is included in the paper.
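The classification step can be sketched as a VPP-specific weighted scoring of the generation units. The parameter names, weights and unit values below are illustrative stand-ins for the paper's fourteen technical, economic and behavioural parameters:

```python
def classify_units(units, weights):
    """Rank DG units by a VPP-specific weighted sum of normalised
    parameters in [0, 1]; higher scores are more attractive partners."""
    scored = [(sum(weights[p] * v for p, v in params.items()), name)
              for name, params in units.items()]
    return sorted(scored, reverse=True)

# Illustrative stand-ins for three of the fourteen parameters
units = {
    "wind_A":  {"availability": 0.9, "price": 0.4, "reliability": 0.8},
    "solar_B": {"availability": 0.6, "price": 0.9, "reliability": 0.7},
}
# A VPP whose strategy favours price weighs that parameter most heavily
weights = {"availability": 0.2, "price": 0.6, "reliability": 0.2}
ranking = classify_units(units, weights)
```

Changing the weight vector models the paper's observation that each parameter's importance depends on the VPP's strategy, size and goals.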
Abstract:
Further improvements in the implementation of demand response programs are needed in order to take full advantage of this resource, namely for participation in energy and reserve market products, which requires adequate aggregation and remuneration of small-size resources. The present paper focuses on SPIDER, a demand response simulator that has been improved to simulate demand response with realistic power system simulation. To illustrate the simulator's capabilities, the paper proposes a methodology focusing on the aggregation of consumers and generators, providing adequate tools for the adoption of demand response programs by the involved players. The proposed methodology centres on a Virtual Power Player (VPP) that manages and aggregates the available demand response and distributed generation resources in order to satisfy the required electrical energy demand and reserve. The aggregation of resources is addressed through clustering algorithms, and the operation costs for the VPP are minimized. The presented case study is based on a set of 32 consumers and 66 distributed generation units, running over 180 distinct operation scenarios.
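The clustering-based aggregation can be sketched with a tiny one-dimensional k-means over consumption values. The consumer data, the seeding rule and the choice of k are illustrative assumptions, not the paper's algorithm:

```python
def kmeans_1d(values, k, iters=20):
    """Tiny 1-D k-means, sketching how consumers with similar consumption
    could be grouped before aggregation and remuneration."""
    n = len(values)
    srt = sorted(values)
    centroids = [srt[int((i + 0.5) * n / k)] for i in range(k)]  # quantile seeds
    groups = [[] for _ in range(k)]
    for _ in range(iters):
        groups = [[] for _ in range(k)]
        for v in values:
            nearest = min(range(k), key=lambda i: abs(v - centroids[i]))
            groups[nearest].append(v)
        centroids = [sum(g) / len(g) if g else c
                     for g, c in zip(groups, centroids)]
    return centroids, groups

# Hypothetical hourly consumption (kWh) for eight consumers
consumption = [1.1, 1.3, 0.9, 5.2, 5.0, 4.8, 9.9, 10.1]
centroids, groups = kmeans_1d(consumption, k=3)
```

Each resulting group can then be aggregated and remunerated as a single resource, which is the role the VPP plays in the proposed methodology.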
Abstract:
The high penetration of distributed energy resources (DER) in distribution networks and the competitive environment of electricity markets impose the use of new approaches in several domains. Network cost allocation, traditionally used in transmission networks, should be adapted to and used in distribution networks, considering the specifications of the connected resources. The main goal is to develop a fairer methodology that distributes the network use costs among all players using the network in each period. In this paper, a model considering different types of costs (fixed, losses, and congestion costs) is proposed, comprising the use of a large set of DER, namely distributed generation (DG), demand response (DR) of the direct load control type, energy storage systems (ESS), and electric vehicles with the capability of discharging energy to the network, known as vehicle-to-grid (V2G). The proposed model includes three distinct phases of operation. The first phase consists of an economic dispatch based on an AC optimal power flow (AC-OPF); in the second phase, Kirschen's and Bialek's tracing algorithms are used and compared to evaluate the impact of each resource on the network. Finally, the MW-mile method is used in the third phase of the proposed model. A distribution network of 33 buses with a large penetration of DER is used to illustrate the application of the proposed model.
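The third phase can be sketched as follows: under the MW-mile method, each line's cost is split among the resources in proportion to the flow each one imposes on it. In the paper those per-resource flows come from the tracing phase; all names and numbers below are illustrative.

```python
def mw_mile(line_costs, flows):
    """MW-mile sketch: split each line's cost among network users in
    proportion to the absolute MW flow each user imposes on that line."""
    alloc = {user: 0.0 for user in flows}
    for line, cost in line_costs.items():
        total = sum(abs(use.get(line, 0.0)) for use in flows.values())
        if total == 0.0:
            continue  # unused line: nothing to allocate here
        for user, use in flows.items():
            alloc[user] += cost * abs(use.get(line, 0.0)) / total
    return alloc

# Illustrative line costs (length * unit rate) and per-resource flows,
# which in the proposed model come from the Kirschen/Bialek tracing phase
line_costs = {"L1": 100.0, "L2": 60.0}
flows = {
    "DG_1": {"L1": 8.0, "L2": 2.0},
    "EV_fleet": {"L1": 2.0},
}
charges = mw_mile(line_costs, flows)
```

By construction, the charges over all users sum to the total cost of the lines they use, which is the fairness property the allocation methodology targets.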
Abstract:
OBJECTIVE: To evaluate the efficacy, recurrence rate and complications of CO2 laser vaporization in the treatment of Bartholin gland cysts. METHODS: Retrospective study of 127 patients with symptomatic Bartholin gland cysts who underwent CO2 laser vaporization at our institution from January 2005 to June 2011. All patients with Bartholin gland abscesses or suspected cancer were excluded. All procedures were performed on an outpatient basis, under local anaesthesia. Data were collected from the clinical records, with analysis of demographic characteristics, anatomical parameters, intra- and postoperative complications, and follow-up data. Data were stored and analysed in Microsoft Excel® 2007, and results are presented as frequency (percentage) or mean±standard deviation. Complication, recurrence and cure rates were calculated. RESULTS: The mean age of the patients was 37.3±9.5 years (range, 18 to 61 years). Seventy percent (n=85) were multiparous. The most frequent complaint was pain, and 47.2% (n=60) of the patients had a history of medical and/or surgical treatment for a Bartholin gland abscess. The mean cyst size was 2.7±0.9 cm. There were three (2.4%) cases of mild intraoperative bleeding and 17 (13.4%) recurrences over a mean follow-up of 14.6 months (range, 1 to 56 months): ten Bartholin gland abscesses and seven recurrent cysts requiring a new surgical intervention. The cure rate after a single laser treatment was 86.6%. Among the five patients with recurrent disease who underwent a second laser procedure, the cure rate was 100%. CONCLUSIONS: At our institution, CO2 laser vaporization appears to be a safe and effective therapeutic option for the treatment of Bartholin gland cysts.
Abstract:
Research on corporate social responsibility has focused mainly on Anglo-Saxon countries and big companies. Most scholars agree that there is a positive relationship between companies' social and economic performance; however, this view is not unanimous. Moreover, during economic downturns, companies struggle for survival and might consider that corporate social responsibility efforts should be postponed. This research investigates whether there is a positive relationship between social performance and key business results, using a large sample of small and medium Portuguese companies over an extended period of time. The results support the existence of valid positive relationships between companies' social performance and key business results, confirming that it does pay to invest in corporate social responsibility, even in less favourable economic scenarios and for small and medium companies across all business sectors.
Abstract:
Dissertation submitted for the degree of Master in Materials Engineering
Abstract:
Introduction: Ageing negatively affects postural control, reducing the ability to recover balance after an external perturbation and consequently increasing the risk of falling in older adults. Objective(s): To assess the influence of voluntary rapid step training on feedforward and feedback strategies in older adults during anterior and posterior stepping, as well as the timing and sequence of muscle activation, anticipatory (APA) and compensatory (APC1, APC2) postural adjustments, step length, latency and mean velocity (MV), and lateral stepping strategies. Methods: In a randomized controlled study, 19 participants were randomly assigned to two groups, an experimental group (n=9) and a control group (n=10). Both groups underwent a physical exercise protocol for 3 months, twice a week. In addition, the experimental group (EG) performed bilateral voluntary rapid step training in several directions. The response to postural perturbations in several directions, and the consequent anterior, posterior or lateral stepping responses, were assessed by surface electromyography and a 3D motion capture system. Results: In the variation between the initial and final assessments (M0-M1), during anterior stepping the EG, compared with the control group (CG), significantly increased and decreased the timing of the ipsilateral RF and GemM, respectively. In the timing variation of the ipsilateral and contralateral BF and the contralateral TA, the EG decreased significantly less than the CG. The EG increased posterior step length significantly more than the CG. Regarding the variation in latency, the EG increased significantly more than the CG in both stepping directions. In the MV variation for anterior stepping, the EG decreased significantly more than the CG. The EG increased the APAs and decreased the APC1 significantly more than the CG in posterior and anterior stepping, respectively.
The most frequent strategy in both assessments and in both groups was the direct lateral stepping strategy. Conclusion: Continued practice of voluntary rapid stepping appears to promote better postural control, making it an important specific exercise for fall prevention.