875 results for Defining Sets
Abstract:
The rapidly increasing computing power, available storage, and communication capabilities of mobile devices make it possible to start processing and storing data locally, rather than offloading it to remote servers, allowing mobile cloud scenarios without dependency on infrastructure. We can now aim at connecting neighboring mobile devices, creating a local mobile cloud that provides storage and computing services on locally generated data. In this paper, we present an early overview of a distributed mobile system that allows accessing and processing data distributed across mobile devices without an external communication infrastructure. Copyright © 2015 ICST.
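To make the idea of serving locally generated data to neighboring devices concrete, the following minimal sketch shows a device answering data requests from neighbors over the local network, with no remote server involved. It is an illustration only; the request protocol, port and in-memory store are assumptions, since the abstract does not detail the system's implementation.

```python
# Minimal sketch of a node in a local mobile cloud: it keeps locally generated
# records in memory and answers GET requests from neighboring devices over the
# local network. The protocol, port and storage layer are illustrative
# assumptions, not the system described in the paper.
import socketserver

LOCAL_DATA = {"sensor/temp": "21.5", "sensor/humidity": "48"}  # example locally generated data

class NeighborRequestHandler(socketserver.StreamRequestHandler):
    def handle(self):
        request = self.rfile.readline().decode().strip()       # e.g. "GET sensor/temp"
        if request.startswith("GET "):
            key = request[4:]
            self.wfile.write((LOCAL_DATA.get(key, "NOT_FOUND") + "\n").encode())
        else:
            self.wfile.write(b"ERROR unsupported request\n")

if __name__ == "__main__":
    # Each device runs its own node; neighbors connect directly, no remote server involved.
    with socketserver.TCPServer(("0.0.0.0", 9099), NeighborRequestHandler) as server:
        server.serve_forever()
```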
Abstract:
Hyperspectral remote sensing exploits the electromagnetic scattering patterns of the different materials at specific wavelengths [2, 3]. Hyperspectral sensors have been developed to sample the scattered portion of the electromagnetic spectrum extending from the visible region through the near-infrared and mid-infrared, in hundreds of narrow contiguous bands [4, 5]. The number and variety of potential civilian and military applications of hyperspectral remote sensing is enormous [6, 7]. Very often, the resolution cell corresponding to a single pixel in an image contains several substances (endmembers) [4]. In this situation, the scattered energy is a mixing of the endmember spectra. A challenging task underlying many hyperspectral imagery applications is then decomposing a mixed pixel into a collection of reflectance spectra, called endmember signatures, and the corresponding abundance fractions [8–10]. Depending on the mixing scales at each pixel, the observed mixture is either linear or nonlinear [11, 12]. The linear mixing model holds approximately when the mixing scale is macroscopic [13] and there is negligible interaction among distinct endmembers [3, 14]. If, however, the mixing scale is microscopic (or intimate mixtures) [15, 16] and the incident solar radiation is scattered by the scene through multiple bounces involving several endmembers [17], the linear model is no longer accurate. Linear spectral unmixing has been intensively researched in recent years [9, 10, 12, 18–21]. It considers that a mixed pixel is a linear combination of endmember signatures weighted by the corresponding abundance fractions. Under this model, and assuming that the number of substances and their reflectance spectra are known, hyperspectral unmixing is a linear problem for which many solutions have been proposed (e.g., maximum likelihood estimation [8], spectral signature matching [22], spectral angle mapper [23], subspace projection methods [24, 25], and constrained least squares [26]). In most cases, the number of substances and their reflectances are not known, and hyperspectral unmixing then falls into the class of blind source separation problems [27]. Independent component analysis (ICA) has recently been proposed as a tool to blindly unmix hyperspectral data [28–31]. ICA is based on the assumption of mutually independent sources (abundance fractions), which is not the case for hyperspectral data, since the sum of the abundance fractions is constant, implying statistical dependence among them. This dependence compromises ICA applicability to hyperspectral images, as shown in Refs. [21, 32]. In fact, ICA finds the endmember signatures by multiplying the spectral vectors with an unmixing matrix which minimizes the mutual information among sources. If sources are independent, ICA provides the correct unmixing, since the minimum of the mutual information is obtained only when sources are independent. This is no longer true for dependent abundance fractions. Nevertheless, some endmembers may be approximately unmixed. These aspects are addressed in Ref. [33]. Under the linear mixing model, the observations from a scene are in a simplex whose vertices correspond to the endmembers. Several approaches [34–36] have exploited this geometric feature of hyperspectral mixtures [35]. The minimum volume transform (MVT) algorithm [36] determines the simplex of minimum volume containing the data. The method presented in Ref.
[37] is also of MVT type but, by introducing the notion of bundles, it takes into account the endmember variability usually present in hyperspectral mixtures. The MVT-type approaches are computationally complex. Usually, these algorithms first find the convex hull defined by the observed data and then fit a minimum volume simplex to it. For example, the gift wrapping algorithm [38] computes the convex hull of n data points in a d-dimensional space with a computational complexity of O(n^(⌊d/2⌋+1)), where ⌊x⌋ is the largest integer less than or equal to x and n is the number of samples. The complexity of the method presented in Ref. [37] is even higher, since the temperature of the simulated annealing algorithm used must follow a log(·) law [39] to assure convergence (in probability) to the desired solution. Aiming at a lower computational complexity, some algorithms such as the pixel purity index (PPI) [35] and N-FINDR [40] still find the minimum volume simplex containing the data cloud, but they assume the presence of at least one pure pixel of each endmember in the data. This is a strong requirement that may not hold in some data sets. In any case, these algorithms find the set of purest pixels in the data. The PPI algorithm uses the minimum noise fraction (MNF) [41] as a preprocessing step to reduce dimensionality and to improve the signal-to-noise ratio (SNR). The algorithm then projects every spectral vector onto skewers (a large number of random vectors) [35, 42, 43]. The points corresponding to extremes, for each skewer direction, are stored. A cumulative account records the number of times each pixel (i.e., a given spectral vector) is found to be an extreme. The pixels with the highest scores are the purest ones. The N-FINDR algorithm [40] is based on the fact that in p spectral dimensions, the p-volume defined by a simplex formed by the purest pixels is larger than any other volume defined by any other combination of pixels. This algorithm finds the set of pixels defining the largest volume by inflating a simplex inside the data. ORASIS [44, 45] is a hyperspectral framework developed by the U.S. Naval Research Laboratory consisting of several algorithms organized in six modules: exemplar selector, adaptive learner, demixer, knowledge base or spectral library, and spatial postprocessor. The first step consists of flat-fielding the spectra. Next, the exemplar selection module is used to select spectral vectors that best represent the smaller convex cone containing the data. The other pixels are rejected when the spectral angle distance (SAD) is less than a given threshold. The procedure finds the basis for a subspace of a lower dimension using a modified Gram–Schmidt orthogonalization. The selected vectors are then projected onto this subspace and a simplex is found by an MVT process. ORASIS is oriented to real-time target detection from uncrewed air vehicles using hyperspectral data [46]. In this chapter we develop a new algorithm to unmix linear mixtures of endmember spectra. First, the algorithm determines the number of endmembers and the signal subspace using a newly developed concept [47, 48]. Second, the algorithm extracts the most pure pixels present in the data. Unlike other methods, this algorithm is completely automatic and unsupervised. To estimate the number of endmembers and the signal subspace in hyperspectral linear mixtures, the proposed scheme begins by estimating the signal and noise correlation matrices.
The latter is based on multiple regression theory. The signal subspace is then identified by selecting the set of signal eigenvalues that best represents the data, in the least-squares sense [48, 49]. We note, however, that VCA works with both projected and unprojected data. The extraction of the endmembers exploits two facts: (1) the endmembers are the vertices of a simplex and (2) the affine transformation of a simplex is also a simplex. Like the PPI and N-FINDR algorithms, VCA also assumes the presence of pure pixels in the data. The algorithm iteratively projects the data onto a direction orthogonal to the subspace spanned by the endmembers already determined. The new endmember signature corresponds to the extreme of the projection. The algorithm iterates until all endmembers are exhausted. VCA performs much better than PPI and better than or comparably to N-FINDR, yet it has a computational complexity between one and two orders of magnitude lower than N-FINDR. The chapter is structured as follows. Section 19.2 describes the fundamentals of the proposed method. Section 19.3 and Section 19.4 evaluate the proposed algorithm using simulated and real data, respectively. Section 19.5 presents some concluding remarks.
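To make the projection step concrete, here is a minimal sketch of extracting endmembers by repeatedly projecting the data onto a direction orthogonal to the endmembers already found and keeping the pixel with the most extreme projection. It is our illustration under simplifying assumptions, not the authors' implementation: subspace estimation, noise handling and the affine transformation step described in the chapter are omitted.

```python
# Simplified illustration of the pure-pixel extraction idea summarized above:
# at each step, project the data onto a direction orthogonal to the subspace
# spanned by the endmembers found so far and take the most extreme pixel as
# the next endmember.
import numpy as np

def extract_endmembers(R, p, rng=np.random.default_rng(0)):
    """R: (num_bands, num_pixels) matrix of spectral vectors; p: number of endmembers."""
    num_bands, _ = R.shape
    E = np.zeros((num_bands, p))            # endmember signatures found so far
    for i in range(p):
        # Draw a random direction and remove its component in span(E[:, :i]),
        # so the projection direction is orthogonal to the current endmembers.
        f = rng.standard_normal(num_bands)
        if i > 0:
            Q, _ = np.linalg.qr(E[:, :i])   # orthonormal basis of the found endmembers
            f = f - Q @ (Q.T @ f)
        f /= np.linalg.norm(f)
        projections = f @ R                 # scalar projection of every pixel onto f
        E[:, i] = R[:, np.argmax(np.abs(projections))]  # most extreme pixel = new endmember
    return E

# Toy usage: 3 endmembers mixed linearly with random abundances (pure pixels included).
true_E = np.array([[1.0, 0.0, 0.2], [0.0, 1.0, 0.4], [0.1, 0.3, 1.0], [0.5, 0.2, 0.1]])
A = np.random.default_rng(1).dirichlet(np.ones(3), size=500).T   # abundances sum to one
A[:, :3] = np.eye(3)                                             # force three pure pixels
estimated_E = extract_endmembers(true_E @ A, p=3)
```

With pure pixels present, the extreme projections fall on the vertices of the data simplex, which is the geometric fact these pure-pixel algorithms exploit.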
Abstract:
The increasing importance of the integration of distributed generation and demand response in power systems operation and planning, namely at the lower voltage levels of distribution networks and in the competitive environment of electricity markets, leads us to the concept of smart grids. In both traditional and smart grid operation, non-technical losses are a significant economic concern that can be addressed. In this context, the ELECON project addresses the use of demand response contributions for the identification of non-technical losses. The present paper proposes a methodology to be used by Virtual Power Players (VPPs), which are entities able to aggregate distributed small-size resources, aiming to define the best electricity tariffs for several clusters of consumers. A case study based on real consumption data demonstrates the application of the proposed methodology.
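As an illustration of how a VPP might map clusters of consumers to tariffs, the sketch below uses k-means on daily load profiles and a simple cost-plus-margin rule. Both are assumptions made only for this example; the abstract does not specify the clustering or tariff-definition algorithm used in the paper.

```python
# Illustrative sketch only: k-means on hourly load profiles and a cost-plus-margin
# rule per cluster are assumed here just to show how a VPP might map consumer
# clusters to tariffs; they are not the paper's methodology.
import numpy as np

def define_cluster_tariffs(load_profiles, n_clusters=3, energy_cost=0.10, margin=0.15, iters=50):
    """load_profiles: (num_consumers, 24) hourly consumption in kWh. Returns labels and tariffs (per kWh)."""
    rng = np.random.default_rng(0)
    centroids = load_profiles[rng.choice(len(load_profiles), n_clusters, replace=False)]
    for _ in range(iters):                                   # plain k-means
        labels = np.argmin(((load_profiles[:, None] - centroids) ** 2).sum(axis=2), axis=1)
        for k in range(n_clusters):
            if np.any(labels == k):
                centroids[k] = load_profiles[labels == k].mean(axis=0)
    # Assumed tariff rule: clusters with peakier profiles pay a higher rate.
    peakiness = centroids.max(axis=1) / np.maximum(centroids.mean(axis=1), 1e-9)
    tariffs = energy_cost * (1 + margin) * (peakiness / peakiness.mean())
    return labels, tariffs

profiles = np.random.default_rng(1).gamma(2.0, 0.5, size=(200, 24))   # synthetic consumption data
labels, tariffs = define_cluster_tariffs(profiles)
print(tariffs)   # one tariff per cluster of consumers
```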
Abstract:
Artificial Intelligence has been applied to dynamic games for many years. The ultimate goal is to create virtual entities whose responses display human-like reasoning in the definition of their behaviors. However, virtual entities that can be mistaken for real persons are still very far from being fully achieved. This paper presents an adaptive learning-based methodology for the definition of players’ profiles, with the purpose of supporting the decisions of virtual entities. The proposed methodology is based on reinforcement learning algorithms, which are responsible for choosing, over time and as experience is gathered, the most appropriate of a set of different learning approaches. These learning approaches have very distinct natures, ranging from mathematical to artificial intelligence and data analysis methodologies, so that the methodology is prepared for very different situations. In this way, it is equipped with a variety of tools, each of which can be useful in a particular situation. The proposed methodology is first tested on two simpler computer-versus-human games: the rock-paper-scissors game and a penalty-shootout simulation. Finally, the methodology is applied to the definition of action profiles of electricity market players, who compete in a dynamic game-like environment in which the main goal is to achieve the highest possible profits in the market.
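A minimal sketch of the selection idea, assuming an epsilon-greedy bandit (not necessarily the reinforcement learning algorithm used in the paper) that learns which of several simple "learning approaches" best predicts a human opponent's next rock-paper-scissors move:

```python
# Hedged illustration: an epsilon-greedy rule chooses, as experience is gathered,
# among a few very different predictors of the human's next move. The predictors
# and the update rule are placeholders for the approaches described in the paper.
import random
from collections import Counter

MOVES = ["rock", "paper", "scissors"]
BEATS = {"rock": "paper", "paper": "scissors", "scissors": "rock"}   # what beats each move

def most_frequent(history):   # approach 1: assume the player repeats their most common move
    return Counter(history).most_common(1)[0][0] if history else random.choice(MOVES)

def repeat_last(history):     # approach 2: assume the player repeats their last move
    return history[-1] if history else random.choice(MOVES)

def uniform(history):         # approach 3: no pattern assumed
    return random.choice(MOVES)

APPROACHES = [most_frequent, repeat_last, uniform]
values, counts = [0.0] * len(APPROACHES), [0] * len(APPROACHES)

def play_round(human_move, history, epsilon=0.1):
    # Epsilon-greedy selection of the approach with the best average reward so far.
    i = random.randrange(len(APPROACHES)) if random.random() < epsilon else values.index(max(values))
    agent_move = BEATS[APPROACHES[i](history)]               # counter the predicted human move
    reward = 1.0 if agent_move == BEATS[human_move] else (0.0 if agent_move == human_move else -1.0)
    counts[i] += 1
    values[i] += (reward - values[i]) / counts[i]            # incremental average of each approach's reward
    history.append(human_move)
    return agent_move

history = []
for move in ["rock", "rock", "paper", "rock"]:
    print(move, "->", play_round(move, history))
```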
Abstract:
The use of appropriate acceptance criteria in the risk assessment process for occupational accidents is an important issue but often overlooked in the literature, particularly when new risk assessment methods are proposed and discussed. In most cases, there is no information on how or by whom they were defined, or even how companies can adapt them to their own circumstances. Bearing this in mind, this study analysed the problem of the definition of risk acceptance criteria for occupational settings, defining the quantitative acceptance criteria for the specific case study of the Portuguese furniture industrial sector. The key steps to be considered in formulating acceptance criteria were analysed in the literature review. By applying the identified steps, the acceptance criteria for the furniture industrial sector were then defined. The Cumulative Distribution Function (CDF) for the injury statistics of the industrial sector was identified as the maximum tolerable risk level. The acceptable threshold was defined by adjusting the CDF to the Occupational Safety & Health (OSH) practitioners’ risk acceptance judgement. Adjustments of acceptance criteria to companies’ safety cultures were exemplified by adjusting the Burr distribution parameters. An example of a risk matrix was also used to demonstrate the integration of the defined acceptance criteria into a risk metric. This work has provided substantial contributions to the issue of acceptance criteria for occupational accidents, which may be useful in overcoming the practical difficulties faced by authorities, companies and experts.
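As a hedged illustration of how such criteria could be expressed operationally, the sketch below evaluates a Burr Type XII CDF and maps a severity value to an acceptability band. The distribution form, parameter values and thresholds are placeholders; the study derives its criteria from the furniture sector's injury statistics and OSH practitioners' judgement.

```python
# Hypothetical example only: the Burr Type XII form and all parameter values
# below are placeholders, not the criteria defined in the study.
def burr_cdf(x, c, k, scale=1.0):
    """CDF of the Burr Type XII distribution, F(x) = 1 - (1 + (x/scale)^c)^(-k), for x > 0."""
    return 1.0 - (1.0 + (x / scale) ** c) ** (-k) if x > 0 else 0.0

def classify_risk(severity, c=2.0, k=1.5, scale=30.0, acceptable=0.50, tolerable=0.90):
    """Map a severity value (e.g. expected lost workdays) to a qualitative band
    using its cumulative probability under the assumed Burr criterion."""
    p = burr_cdf(severity, c, k, scale)
    if p <= acceptable:
        return "acceptable"
    if p <= tolerable:
        return "tolerable (reduce as far as reasonably practicable)"
    return "unacceptable"

for days in (5, 40, 200):                    # hypothetical severity values
    print(days, classify_risk(days))
```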
Abstract:
The authors’ main purpose is to present ideas on defining Health Law by highlighting the particularities of the field of Health Law, as well as of the teaching of this legal branch, hoping to contribute to the maturity and academic recognition of Health Law, not only as a very rich legal field but also as a powerful social instrument in the fulfillment of fundamental human rights. The authors defend that Health Law has several characteristics that distinguish it from traditional branches of law, such as its complexity and multidisciplinary nature. The study of Health Law normally covers issues such as access to care, health systems organization, patients’ rights, health professionals’ rights and duties, strict liability, healthcare contracts between institutions and professionals, medical data protection and confidentiality, informed consent and professional secrecy, crossing different legal fields including administrative, antitrust, constitutional, contract, corporate, criminal, environmental, food and drug, intellectual property, insurance, international and supranational, labor/employment, property, taxation, and tort law. This is one of the reasons why teaching Health Law presents a challenge to the teacher, who will have to find the programs, content and methods appropriate to the profile of the recipients, who are normally non-jurists, and to the needs of a multidisciplinary curriculum. By describing academic definitions of Health Law as analogous to Edgewood, a fictional house with a different architectural style on each of its walls, the authors try to describe which elements should compose a more comprehensive definition. In this article, Biolaw, Bioethics and Human Rights are defined as complements to a definition of Health Law: Biolaw, because it is the legal field that treats the social consequences that arise from technological advances in health and life sciences; Bioethics, whose evolution normally influences the shape of the legal framework of Health; and, finally, Human Rights theory and declarations, which are outlined as having always been historically linked to medicine and health, being the umbrella that must cover all the issues raised in the area of Health Law. To complete this brief incursion into the definition of Health Law, the authors end by noting the complex relations between this field of law and Public Health. Dealing more specifically with laws adopted by governments to provide important health services and to regulate industries and individual conduct that affect the health of populations, this aspect of Health Law requires special attention to avoid an imbalance between public powers and individual freedoms.
The authors conclude that public trust in any health system is essentially sustained by developing health structures that are consistent with essential fundamental rights, such as the universal right of access to health care, and that the study of Health Law can contribute important insights into both health structures and fundamental rights in order to foster a health system that respects the Rule of Law.
Abstract:
Recently, reactivation of Chagas disease (meningoencephalitis and/or myocarditis) was included in the list of AIDS-defining illnesses in Brazil. We report the case of a 52-year-old patient with no history of previous disease who presented with acute meningoencephalitis. Direct examination of blood and cerebrospinal fluid (CSF) showed Trypanosoma cruzi. CSF culture confirmed the diagnosis. Serological assays for T. cruzi and human immunodeficiency virus (HIV) were positive. Despite treatment with benznidazole and supportive measures, the patient died 24 hours after hospital admission. In endemic areas, reactivation of Chagas disease should always be considered in the differential diagnosis of meningoencephalitis among HIV-infected patients, and its presence is indicative of AIDS.
Abstract:
Smooth muscle neoplasms are more frequent in human immunodeficiency virus (HIV)-infected children than in HIV-seropositive adults. Endobronchial leiomyoma is a rare benign tumor in HIV-infected adult patients. Epstein-Barr virus (EBV) has been implicated in the pathogenesis of these tumors. Here we describe an adult patient with HIV infection with atelectasis of the left upper pulmonary lobe as the first clinical expression of an intrabronchial leiomyoma. In this case, we could not demonstrate an association with EBV. Our report suggests that smooth muscle tumors such as leiomyoma should be included in the differential diagnosis of endobronchial masses in AIDS patients.
Abstract:
Anal squamous cell carcinoma is a rare malignancy that represents 1.5% to 2% of all lower digestive tract cancers. However, an increased incidence of invasive anal carcinoma has been observed in the HIV-seropositive population since the widespread use of highly active antiretroviral therapy. Human papillomavirus is strongly associated with the pathogenesis of anal cancer. Anal intercourse and a high number of sexual partners appear to be risk factors for developing anal cancer in both sexes. Anal pain, bleeding and a palpable lesion in the anal canal are the most common clinical features. Endo-anal ultrasound is the best diagnostic method to evaluate tumor size, tumor extension and infiltration of the sphincter muscle complex. Chemoradiotherapy plus antiretroviral therapy is the recommended treatment for all stages of localized squamous cell carcinoma of the anal canal in HIV-seropositive patients because of its high cure rate. Here we present an HIV-infected patient who developed a carcinoma of the anal canal after a long period of HIV infection under highly active antiretroviral therapy with a good virological and immunological response.
Abstract:
Lymphomas of the oral cavity are a rare complication of advanced HIV/AIDS disease. The clinical appearance of these neoplasms includes masses or ulcerative lesions that involve the oral soft tissue and the jaw as the predominant manifestation. We report the case of a patient with AIDS who developed diffuse large B-cell non-Hodgkin's lymphoma of the oral cavity during highly active antiretroviral therapy, with undetectable plasma viral load and immune reconstitution.
Abstract:
We present the case of a 31-year-old man with acute manifestation of progressive multifocal leukoencephalopathy (PML) as an AIDS-defining disease. The patient presented with a three-day history of neurological disease, brain lesions without mass effect or contrast uptake and a slightly increased protein concentration in cerebrospinal fluid. A serological test for HIV was positive and the CD4+ T-cell count was 427/mm³. Histological examination of the brain tissue revealed abnormalities compatible with PML. The disease progressed despite antiretroviral therapy, and the patient died three months later. PML remains an important cause of morbidity and mortality among HIV-infected patients.
Abstract:
This project aims to delineate recovery strategies for a Portuguese Bank, as a way to increase its preparedness for unexpected disruptive events, thus avoiding the escalation of an operational crisis. For this purpose, Business Continuity material was studied, a risk assessment was performed, a business impact analysis was executed, and a new strategic framework for selecting strategies was adopted. In the end, a set of recovery strategies was chosen that best represented the Bank’s risk appetite, and recommendations were given for future improvements.
Abstract:
Introduction: The objective of this study was to analyze the spatial behavior of the occurrence of trachoma cases detected in the City of Bauru, State of São Paulo, Brazil, in 2006, in order to use the information collected to set priority areas for the optimization of health resources. Methods: The trachoma cases identified in 2006 were georeferenced. The data evaluated were: the schools attended by the trachoma cases studied, data from the 2000 Census, census tract, type of housing, water supply conditions, income distribution and education level of household heads. Descriptive spatial analysis and Kernel density estimation were performed using the Google Earth® and TerraView® software. Each area was studied by interpolation of the density surfaces of the events to facilitate the recognition of clusters. Results: Of the 66 cases detected, only one (1.5%) was not a resident of the city's outskirts. A positive association was detected between trachoma cases and the percentage of household heads with income below three minimum wages and less than eight years of schooling. Conclusions: The spatial distribution of trachoma cases coincided with the areas of greatest social inequality in the City of Bauru. The micro-areas identified are those that should be prioritized in the rationalization of health resources. The trachoma cases detected may be used as a performance indicator for priority micro-area health programs.
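A minimal sketch of the kernel density estimation step, assuming a Gaussian kernel with a fixed bandwidth and synthetic case coordinates; TerraView's Kernel estimator and the actual Bauru case data are not reproduced here.

```python
# Illustration of kernel density estimation over georeferenced case points:
# the points are turned into a smooth density surface whose peaks indicate
# clusters. Kernel choice, bandwidth and coordinates are assumptions.
import numpy as np

def kernel_density(points, grid_x, grid_y, bandwidth=500.0):
    """points: (n, 2) projected coordinates in metres. Returns a density grid."""
    gx, gy = np.meshgrid(grid_x, grid_y)
    density = np.zeros_like(gx)
    for x, y in points:
        d2 = (gx - x) ** 2 + (gy - y) ** 2
        density += np.exp(-d2 / (2.0 * bandwidth ** 2))      # Gaussian kernel
    return density / (2.0 * np.pi * bandwidth ** 2 * len(points))

cases = np.random.default_rng(0).normal([5000, 5000], 800, size=(66, 2))  # synthetic case locations
grid = np.linspace(0, 10000, 100)
surface = kernel_density(cases, grid, grid)
hot_cell = np.unravel_index(surface.argmax(), surface.shape)  # grid cell of the strongest cluster
```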