Resumo:
The main goal of the present work is to use mineralogical data from the fine fractions (silt and clay) of Quaternary littoral deposits to define a more detailed vertical zonation and to discriminate the most significant morphoclimatic changes affecting sediment source areas and sediment deposition areas. The analysis of the available mineralogical data reveals a vertical evolution of the mineral composition. The following aspects deserve particular reference: 1) fine fractions (<38 μm) are composed of quartz and phyllosilicates associated with feldspars, prevailing over other minerals; however, in certain sections iron hydroxides and evaporitic minerals occur in significant amounts; 2) clay fractions (<2 μm) show a general prevalence of illite associated with kaolinite, with oscillations, in relative terms, of the kaolinite and illite contents. Qualitative and quantitative lateral and vertical variations of clay and non-clay minerals allow the discrimination of sedimentary sequences and the establishment of the rhythmicity and periodicity of the Quaternary morphoclimatic episodes that occurred at the Cortegaça and Maceda beaches. Each sedimentary sequence corresponds, in a first stage, to a littoral environment that became increasingly continental. The climate would have been mild to cold, sometimes with humidity-aridity oscillations. Warmer and moister episodes alternated with cooler and drier ones.
Resumo:
Eight depositional sequences (DS) delimited by regional disconformities have been recognized in the Miocene of the Lisbon and Setúbal Peninsula areas. In the case of the western coast of the Setúbal Peninsula, outcrops consisting of Lower Burdigalian to Lower Tortonian sediments were studied. The stratigraphic zonation and the environmental considerations are mainly supported by data concerning foraminifera, ostracods, vertebrates and palynomorphs. The first mineralogical and geochemical data determined for the Foz da Fonte, Penedo Sul and Penedo Norte sedimentary sequences are presented. These analytical data mainly correspond to the sediments' fine fractions. Mineralogical data are based on X-ray diffraction (XRD), carried out on both the less than 38 μm and less than 2 μm fractions. Qualitative and semi-quantitative determinations of clay and non-clay minerals were obtained for both fractions. The clay mineral assemblages complement the lithostratigraphic and palaeoenvironmental data obtained by stratigraphic and palaeontological studies. Some palaeomagnetic and isotopic data are discussed and correlated with the mineralogical data. Multivariate analysis (Principal Components Analysis) of the mineralogical data was carried out using both R-mode and Q-mode factor analysis.
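As a rough illustration of the R-mode versus Q-mode distinction mentioned above, the sketch below runs PCA on a sample-by-mineral matrix (R-mode: relationships among variables) and on its transpose (Q-mode: relationships among samples). The mineral abundances are made-up values, not the paper's data.

```python
import numpy as np

# Hypothetical mineralogical data: rows = samples, columns = mineral abundances
# (quartz, phyllosilicates, feldspar, illite, kaolinite). Values are illustrative.
X = np.array([
    [40.0, 30.0, 15.0, 10.0, 5.0],
    [35.0, 32.0, 14.0, 12.0, 7.0],
    [50.0, 25.0, 10.0,  9.0, 6.0],
    [45.0, 28.0, 12.0, 10.0, 5.0],
    [30.0, 35.0, 16.0, 13.0, 6.0],
])

def pca_eigvals(mat):
    """Eigenvalues (descending) of the covariance matrix of `mat` columns."""
    centred = mat - mat.mean(axis=0)
    cov = np.cov(centred, rowvar=False)
    return np.linalg.eigvalsh(cov)[::-1]

# R-mode: PCA over the sample-by-mineral matrix (structure among minerals).
r_mode_eigvals = pca_eigvals(X)

# Q-mode: PCA over the transposed, mineral-by-sample matrix (structure among samples).
q_mode_eigvals = pca_eigvals(X.T)

print(r_mode_eigvals)
```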
Resumo:
Glucose monitoring in vivo is a crucial issue for gaining new understanding of diabetes. In the present study, glucose binding protein (GBP) fused to two fluorescent indicator proteins (FLIP) was used, namely FLIP-glu-3.2 mM. Recombinant Escherichia coli whole cells containing genetically encoded nanosensors, as well as cell-free extracts, were immobilized either on the inner epidermis of onion bulb scale or on 96-well microtiter plates in the presence of glutaraldehyde. Glucose monitoring was carried out by Förster Resonance Energy Transfer (FRET) analysis between the cyan and yellow fluorescent proteins (ECFP and EYFP) immobilized on both supports. The recovery of the immobilized FLIP nanosensors compared with the free whole cells and cell-free extract was in the range of 50–90%. Moreover, the data revealed that these FLIP nanosensors can be immobilized on such solid supports with retention of their biological activity. A glucose assay based on FRET analysis with these nanosensors was applied to real samples, detecting glucose in the linear range of 0–24 mM with a limit of detection of 0.11 mM. Storage and operational stability studies revealed that the nanosensors are very stable and can be re-used several times (at least 20 times) without any significant loss of FRET signal. To the authors' knowledge, this is the first report on the use of such immobilization supports for whole cells and cell-free extracts containing FLIP nanosensors for glucose assay; it also constitutes a novel and cheap high-throughput method for glucose measurement.
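The reported linear range suggests a straight-line calibration between the FRET ratio and glucose concentration. The sketch below illustrates that idea with purely hypothetical ratio values (the paper's actual calibration data are not reproduced here, and the assumed decreasing ratio is an illustrative assumption).

```python
import numpy as np

# Hypothetical calibration: FRET ratio (EYFP/ECFP emission) at known glucose
# concentrations. These numbers are illustrative, not the paper's measurements.
glucose_mM = np.array([0.0, 4.0, 8.0, 12.0, 16.0, 20.0, 24.0])
fret_ratio = np.array([1.80, 1.68, 1.56, 1.44, 1.32, 1.20, 1.08])

# Fit a straight line ratio = a * glucose + b over the linear range.
a, b = np.polyfit(glucose_mM, fret_ratio, 1)

def glucose_from_ratio(r):
    """Invert the calibration line to estimate glucose (mM) from a FRET ratio."""
    return (r - b) / a

print(round(glucose_from_ratio(1.50), 2))  # ratio 1.50 -> 10.0 mM on this synthetic line
```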
Resumo:
A reduced schedule for anti-rabies post-exposure immunization with newborn mice nervous tissue vaccine (Fuenzalida & Palacios) was reevaluated in a group of 30 non-exposed volunteers. The vaccine was administered by intramuscular injections in the deltoid area on days 0, 2, 4, 16 and 27. Antibody levels were determined by a simplified serum neutralization microtest on days 0, 16 and 37. On days 16 and 37 the antibody levels of the whole group were >0.5 IU/ml and >1.0 IU/ml, respectively. Cell-mediated immunity was detected early (on day 4) by the delayed-type hypersensitivity skin test. Our results show that this reduced schedule elicited an early and effective humoral and cellular immune response. However, further studies with larger groups of vaccinees are needed in order to reach definitive conclusions.
Resumo:
Final project submitted for the degree of Master in Electronics and Telecommunications Engineering
Resumo:
Wireless communications have developed greatly in recent years and are nowadays present everywhere, in public and private spaces, being increasingly used for different applications. Their application in the business of sports events, as a means to improve the experience of fans at the games, is becoming essential, for example for sharing messages and multimedia material on social networks. In stadiums, given the high density of people, wireless networks require very large data capacity, so radio coverage employing many small-sized sectors is unavoidable. In this paper, an antenna is designed to operate in the 5 GHz Wi-Fi frequency band, with a directive radiation pattern suitable for this kind of application. Furthermore, despite its large bandwidth and low losses, the antenna has been developed using low-cost, off-the-shelf materials without sacrificing quality or performance, which is essential for mass production. © 2015 EurAAP.
Resumo:
The prevalence of rubella antibodies was evaluated through a random seroepidemiological survey of 1400 blood samples from 2- to 14-year-old children and 329 samples of umbilical cord serum. Rubella IgG antibodies were detected by ELISA; the sera were collected in 1987, five years before the mass vaccination campaign with measles-mumps-rubella vaccine carried out in the city of São Paulo in 1992. A significant increase in the prevalence of rubella infection was observed after 6 years of age, and 77% of individuals aged 15 to 19 years had detectable rubella antibodies. The seroprevalence rose to 90.5% (171/189) in cord serum samples from children whose mothers were 20 to 29 years old, and reached 95.6% in newborns of mothers 30 to 34 years old, indicating that a large number of women are infected during their childbearing years. This study confirms that rubella infection represents an important public health problem in the city of São Paulo. The data on the seroprevalence of rubella antibodies before the mass vaccination campaign reflect the baseline immunological status of this population before any intervention and should be used to design an adequate vaccination strategy and to assess the seroepidemiological impact of this intervention.
Resumo:
The development of high spatial resolution airborne and spaceborne sensors has improved the capability of ground-based data collection in the fields of agriculture, geography, geology, mineral identification, detection [2, 3], and classification [4–8]. The signal read by the sensor from a given spatial element of resolution and at a given spectral band is a mixture of components originating from the constituent substances, termed endmembers, located at that element of resolution. This chapter addresses hyperspectral unmixing, which is the decomposition of the pixel spectra into a collection of constituent spectra, or spectral signatures, and their corresponding fractional abundances indicating the proportion of each endmember present in the pixel [9, 10]. Depending on the mixing scales at each pixel, the observed mixture is either linear or nonlinear [11, 12]. The linear mixing model holds when the mixing scale is macroscopic [13]. The nonlinear model holds when the mixing scale is microscopic (i.e., intimate mixtures) [14, 15]. The linear model assumes negligible interaction among distinct endmembers [16, 17]. The nonlinear model assumes that incident solar radiation is scattered by the scene through multiple bounces involving several endmembers [18]. Under the linear mixing model, and assuming that the number of endmembers and their spectral signatures are known, hyperspectral unmixing is a linear problem which can be addressed, for example, by the maximum likelihood setup [19], the constrained least-squares approach [20], spectral signature matching [21], the spectral angle mapper [22], and subspace projection methods [20, 23, 24]. Orthogonal subspace projection [23] reduces the data dimensionality, suppresses undesired spectral signatures, and detects the presence of a spectral signature of interest. The basic concept is to project each pixel onto a subspace that is orthogonal to the undesired signatures.
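The projection idea just described can be sketched numerically: build the projector onto the orthogonal complement of the undesired signatures and apply it to a mixed pixel. The signatures below are synthetic toy vectors, not real endmember spectra.

```python
import numpy as np

L = 6                                            # number of spectral bands (toy value)
d = np.array([1.0, 0.8, 0.6, 0.4, 0.2, 0.1])     # desired endmember signature
U = np.array([[0.2, 0.1],                        # columns: undesired signatures
              [0.3, 0.2],
              [0.5, 0.4],
              [0.7, 0.6],
              [0.9, 0.8],
              [1.0, 1.0]])

# Projector onto the orthogonal complement of span(U): P = I - U (U^T U)^{-1} U^T.
P = np.eye(L) - U @ np.linalg.inv(U.T @ U) @ U.T

# A noise-free pixel mixing the desired signature with the undesired ones.
x = 0.5 * d + 0.3 * U[:, 0] + 0.2 * U[:, 1]

# After projection the undesired components vanish; correlating with d recovers
# the desired abundance (0.5 in this construction).
score = d @ (P @ x) / (d @ P @ d)
print(round(score, 3))
```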
As shown by Settle [19], the orthogonal subspace projection technique is equivalent to the maximum likelihood estimator. This projection technique was extended by three unconstrained least-squares approaches [24] (signature space orthogonal projection, oblique subspace projection, target signature space orthogonal projection). Other works using the maximum a posteriori probability (MAP) framework [25] and projection pursuit [26, 27] have also been applied to hyperspectral data. In most cases the number of endmembers and their signatures are not known. Independent component analysis (ICA) is an unsupervised source separation process that has been applied with success to blind source separation, feature extraction, and unsupervised recognition [28, 29]. ICA consists of finding a linear decomposition of observed data yielding statistically independent components. Given that hyperspectral data are, in given circumstances, linear mixtures, ICA comes to mind as a possible tool to unmix this class of data. In fact, the application of ICA to hyperspectral data has been proposed in reference 30, where endmember signatures are treated as sources and the mixing matrix is composed of the abundance fractions, and in references 9, 25, and 31–38, where the sources are the abundance fractions of each endmember. In the first approach, we face two problems: (1) the number of samples is limited to the number of channels, and (2) the process of pixel selection, playing the role of mixed sources, is not straightforward. In the second approach, ICA is based on the assumption of mutually independent sources, which is not the case for hyperspectral data, since the sum of the abundance fractions is constant, implying dependence among the abundances. This dependence compromises the applicability of ICA to hyperspectral images. In addition, hyperspectral data are immersed in noise, which degrades ICA performance.
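The dependence induced by the sum-to-one constraint can be checked numerically: knowing all but one abundance fraction fixes the last, so the fractions cannot be mutually independent. The Dirichlet-distributed abundances below are an illustrative choice, not data from the chapter.

```python
import numpy as np

# Draw abundance vectors that satisfy the constraints of the linear mixing
# model: nonnegative entries, each row summing to one.
rng = np.random.default_rng(0)
abundances = rng.dirichlet(alpha=[1.0, 1.0, 1.0], size=100_000)

# The sample covariance between two fractions is negative: increasing one
# pushes the others down on average, contradicting ICA's independence assumption.
cov_12 = np.cov(abundances[:, 0], abundances[:, 1])[0, 1]
print(cov_12 < 0)  # True
```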
IFA [39] was introduced as a method for recovering independent hidden sources from their observed noisy mixtures. IFA implements two steps. First, source densities and noise covariance are estimated from the observed data by maximum likelihood. Second, sources are reconstructed by an optimal nonlinear estimator. Although IFA is a well-suited technique to unmix independent sources under noisy observations, the dependence among abundance fractions in hyperspectral imagery compromises, as in the ICA case, the IFA performance. Under the linear mixing model, hyperspectral observations lie in a simplex whose vertices correspond to the endmembers. Several approaches [40–43] have exploited this geometric feature of hyperspectral mixtures [42]. The minimum volume transform (MVT) algorithm [43] determines the simplex of minimum volume containing the data. The MVT-type approaches are computationally complex: usually, these algorithms first find the convex hull defined by the observed data and then fit a minimum volume simplex to it. Aiming at a lower computational complexity, some algorithms such as vertex component analysis (VCA) [44], the pixel purity index (PPI) [42], and N-FINDR [45] still find the minimum volume simplex containing the data cloud, but they assume the presence in the data of at least one pure pixel of each endmember. This is a strong requirement that may not hold in some data sets; in any case, these algorithms find the set of purest pixels in the data. Hyperspectral sensors collect spatial images over many narrow contiguous bands, yielding large amounts of data. For this reason, very often, the processing of hyperspectral data, including unmixing, is preceded by a dimensionality reduction step to reduce computational complexity and to improve the signal-to-noise ratio (SNR).
Principal component analysis (PCA) [46], maximum noise fraction (MNF) [47], and singular value decomposition (SVD) [48] are three well-known projection techniques widely used in remote sensing in general and in unmixing in particular. A newly introduced method [49] exploits the structure of hyperspectral mixtures, namely the fact that spectral vectors are nonnegative. The computational complexity associated with these techniques is an obstacle to real-time implementations; to overcome this problem, band selection [50] and non-statistical [51] algorithms have been introduced. This chapter addresses hyperspectral data source dependence and its impact on ICA and IFA performance. The study considers simulated and real data and is based on mutual information minimization. Hyperspectral observations are described by a generative model that takes into account the degradation mechanisms normally found in hyperspectral applications, namely signature variability [52–54], abundance constraints, topography modulation, and system noise. The computation of mutual information is based on fitting mixtures of Gaussians (MOG) to the data. The MOG parameters (number of components, means, covariances, and weights) are inferred using a minimum description length (MDL) based algorithm [55]. We study the behavior of the mutual information as a function of the unmixing matrix. The conclusion is that the unmixing matrix minimizing the mutual information might be very far from the true one. Nevertheless, some abundance fractions might be well separated, mainly in the presence of strong signature variability, a large number of endmembers, and high SNR. We end this chapter by sketching a new methodology to blindly unmix hyperspectral data, where abundance fractions are modeled as a mixture of Dirichlet sources. This model enforces the positivity and constant-sum (full additivity) constraints on the sources. The mixing matrix is inferred by an expectation-maximization (EM)-type algorithm.
This approach is in the vein of references 39 and 56, replacing independent sources represented by MOG with a mixture of Dirichlet sources. Compared with the geometric-based approaches, the advantage of this model is that there is no need to have pure pixels in the observations. The chapter is organized as follows. Section 6.2 presents a spectral radiance model and formulates spectral unmixing as a linear problem accounting for abundance constraints, signature variability, topography modulation, and system noise. Section 6.3 presents a brief overview of the ICA and IFA algorithms. Section 6.4 illustrates the performance of IFA and of some well-known ICA algorithms with experimental data. Section 6.5 studies the limitations of ICA and IFA in unmixing hyperspectral data. Section 6.6 presents results of ICA based on real data. Section 6.7 describes the new blind unmixing scheme and some illustrative examples. Section 6.8 concludes with some remarks.
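The dimensionality-reduction step discussed above (PCA, here computed via SVD) can be sketched on a synthetic linear-mixture scene; the numbers of bands, endmembers, pixels, and the noise level are all illustrative assumptions.

```python
import numpy as np

# Synthetic scene: 500 pixels in 50 bands generated from 3 endmembers, so the
# signal lives in a low-dimensional subspace that PCA should recover.
rng = np.random.default_rng(1)
n_bands, n_endmembers, n_pixels = 50, 3, 500
M = rng.uniform(0.0, 1.0, size=(n_bands, n_endmembers))      # endmember signatures
A = rng.dirichlet([1.0] * n_endmembers, size=n_pixels).T     # abundances, columns sum to 1
X = M @ A + 0.001 * rng.standard_normal((n_bands, n_pixels)) # low-noise linear mixtures

# Centre the data and take the SVD; the left singular vectors are the principal axes.
Xc = X - X.mean(axis=1, keepdims=True)
U, s, _ = np.linalg.svd(Xc, full_matrices=False)

# Fraction of energy captured by the leading components: with 3 endmembers a
# handful of components capture nearly everything in this low-noise scene.
energy = np.cumsum(s**2) / np.sum(s**2)
reduced = U[:, :3].T @ Xc        # pixels projected onto 3 principal components
print(energy[2] > 0.99)          # True for this synthetic scene
```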
Resumo:
Demand response has gained increasing importance in the context of competitive electricity markets and smart grid environments. In addition to the importance that has been given to the development of business models for integrating demand response, several methods have been developed to evaluate consumers' performance after participation in a demand response event. The present paper uses those performance evaluation methods, namely customer baseline load calculation methods, to determine the expected consumption in each period of the consumer's historical data. In the cases in which there is a significant difference between the actual consumption and the estimated consumption, the consumer is identified as a potential cause of non-technical losses. A case study demonstrates the application of the proposed method to real consumption data.
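A minimal sketch of the baseline-versus-actual comparison described above, assuming a simple per-period averaging baseline and a fixed fractional deviation threshold; both are illustrative choices, not the paper's exact method.

```python
# Flagging potential non-technical losses from a customer baseline load (CBL).

def baseline(history):
    """Expected consumption per period: average over the historical days."""
    n_periods = len(history[0])
    return [sum(day[p] for day in history) / len(history) for p in range(n_periods)]

def flag_periods(history, actual, threshold=0.30):
    """Return the periods where actual consumption deviates from the baseline
    by more than `threshold` (fractional), a possible sign of losses."""
    expected = baseline(history)
    return [p for p, (e, a) in enumerate(zip(expected, actual))
            if e > 0 and abs(a - e) / e > threshold]

# Hypothetical consumption (kWh) for 4 past days, 6 periods each.
history = [
    [2.0, 1.8, 2.2, 3.0, 3.4, 2.6],
    [2.1, 1.9, 2.1, 2.9, 3.3, 2.7],
    [1.9, 1.7, 2.3, 3.1, 3.5, 2.5],
    [2.0, 1.8, 2.2, 3.0, 3.4, 2.6],
]
actual = [2.0, 1.8, 1.0, 3.0, 3.4, 2.6]   # period 2 far below expectation

print(flag_periods(history, actual))       # [2]
```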
Resumo:
Recent embedded processor architectures containing multiple heterogeneous cores and non-coherent caches have renewed attention to the use of Software Transactional Memory (STM) as a building block for developing parallel applications. STM promises to ease concurrent and parallel software development, but relies on the possibility of aborting conflicting transactions to maintain data consistency, which in turn affects the execution time of tasks carrying transactions. As a result, the timing behaviour of the task set may not be predictable, so it is crucial to limit the execution time overheads resulting from aborts. In this paper we formalise a FIFO-based algorithm to order the sequence of commits of concurrent transactions. We then propose and evaluate two non-preemptive and one SRP-based fully-preemptive scheduling strategies in order to avoid transaction starvation.
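The FIFO commit-ordering rule can be illustrated with a minimal ticket-based sketch: each transaction takes a ticket on start, and a transaction may commit only when its ticket is the oldest still uncommitted. This is only the ordering policy under stated assumptions, not the paper's full algorithm or a real STM runtime.

```python
class FifoCommitOrder:
    """Toy FIFO commit ordering for transactions (single-threaded illustration)."""

    def __init__(self):
        self._next_ticket = 0     # ticket handed to the next starting transaction
        self._commit_turn = 0     # ticket currently allowed to commit

    def start(self):
        """Register a transaction; return its FIFO ticket."""
        t = self._next_ticket
        self._next_ticket += 1
        return t

    def try_commit(self, ticket):
        """Commit succeeds only in FIFO order; otherwise the caller must wait
        (or, in a real STM, possibly abort and retry)."""
        if ticket != self._commit_turn:
            return False
        self._commit_turn += 1
        return True

order = FifoCommitOrder()
t0, t1, t2 = order.start(), order.start(), order.start()

print(order.try_commit(t2))   # False: t0 has not committed yet
print(order.try_commit(t0))   # True
print(order.try_commit(t1))   # True
print(order.try_commit(t2))   # True: now at the head of the FIFO order
```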
Resumo:
The occurrence of seven pharmaceuticals and two metabolites belonging to the non-steroidal anti-inflammatory drug and analgesic therapeutic classes was studied in seawaters. A total of 101 samples covering fourteen beaches and five cities were evaluated in order to assess the spatial distribution of pharmaceuticals along the northern Portuguese coast. Seawaters were selected in order to embrace different bathing water qualities (excellent, good and sufficient). Acetaminophen, ketoprofen and the metabolite hydroxyibuprofen were detected in all the seawater samples at maximum concentrations of 584, 89.7 and 287 ng L−1, respectively. Carboxyibuprofen had the highest seawater concentration (1227 ng L−1). The temporal distribution of the selected pharmaceuticals during the bathing season showed that, in general, higher concentrations were detected in August and September. The environmental risk posed by the pharmaceuticals detected in seawaters towards different trophic levels (fish, daphnids and algae) was also assessed. Only diclofenac showed hazard quotients above one for fish, representing a potential risk for aquatic organisms. These results were observed in seawaters classified as excellent bathing water. Additional data are needed in order to support the identification and prioritization of risks posed by pharmaceuticals in the marine environment.
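The screening behind the reported hazard quotients is the ratio of the measured environmental concentration (MEC) to the predicted no-effect concentration (PNEC), with HQ > 1 flagging a potential risk. A minimal sketch with hypothetical values follows; the study's actual MEC and PNEC figures are not reproduced here.

```python
# Hazard-quotient screening: HQ = MEC / PNEC.

def hazard_quotient(mec_ng_per_L, pnec_ng_per_L):
    """Ratio of measured concentration to predicted no-effect concentration."""
    return mec_ng_per_L / pnec_ng_per_L

# Illustrative screening: a pharmaceutical measured at 120 ng/L against a
# hypothetical fish PNEC of 100 ng/L.
hq = hazard_quotient(120.0, 100.0)
print(hq > 1)   # True: potential risk for this trophic level
```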
Resumo:
In South Brazil, the circulation of two HIV-1 subtypes with different characteristics represents an important scenario for the study of the impact of HIV-1 diversity on the evolution of the HIV-1 epidemic and AIDS disease. HIV-1 B, the predominant variant in industrialized countries, and HIV-1 C, the most prevalent subtype in areas with rapid epidemic growth, are implicated in most infections. We evaluated blood samples from 128 antiretroviral (ARV) naïve patients recruited at entry to the largest HIV outpatient service in Porto Alegre. Based on partial pol region sequencing, HIV-1 C was observed in 29%, HIV-1 B in 22.6%, and the recently identified CRF31_BC in 23.4% of the 128 volunteers. Other variants were HIV-1 F in 10% and other mosaics in 5.5%. In order to evaluate the association of socio-behavioral characteristics and HIV-1 subtypes, interviews and laboratory evaluations were performed at entry. Our data suggest an established epidemic of the three major variants, without any evidence of partitioning in any of the subgroups analyzed. However, anal sex practices were associated with subtype B, which could indicate a greater transmissibility of non-B variants by vaginal intercourse. This study provides baseline information for epidemiologic surveillance of the changes in the molecular characteristics of the HIV-1 epidemic in this region.
Resumo:
The last three decades have seen quite dramatic changes in the way we model time-dependent data. Linear processes have taken center stage in the modeling of time series, and as far as second-order properties are concerned, the theory and methodology are very adequate. However, there is more and more evidence that linear models are not flexible and rich enough for modeling purposes, and that failure to account for non-linearities can be very misleading and have undesired consequences.
Resumo:
Scientific literature has documented the perpetuation of gender-based inequality factors in the labour market, despite the ongoing endeavour of various political bodies and legal norms against the vertical and horizontal segregation of women. National and European statistical data show the continuing relevance of the theories of labour-market segmentation dating back to the 1970s. Hence, the European Community considers a priority in the Europe 2020 strategy the definition of "policies to promote gender equality […] to increase labour force participation thus adding to growth and social cohesion". Considering, on the one hand, that social economy is widely recognised alongside market actors and the State for its economic and social role in tackling the current crisis, and, on the other hand, that the ideals of the sector, systematised in the "Framework Law of Social Economy" (Law no. 30/2013 of 8 May), particularly in article 5, propose "the respect for the values […] of equality and non-discrimination […], justice and equity […]", we aim to reflect on indicators that uncover vertical and horizontal segregation in the labour market. Departing from a mixed methodological approach (extensive and intensive) on the topic of "Social Entrepreneurship in Portugal" in social economy organisations, we detect very high rates of feminisation of employment, with a ratio of 1 man (23%) to every 3 women (77%). Women are mainly assigned to technical and operational activities, arising from the privileged intervention areas, namely education, training, health, the elderly, families and poverty, and are ultimately underrepresented in statutory boards and, as such, far removed from deliberations and strategic resolutions. This is particularly visible in the existing hierarchy of functions and in management practices under the responsibility of male members.
Thus, the sector seems to be travelling away from the ideals of justice and social equity, which can crystallise the "non-place" of women in the definition of the strategic direction of social economy and confine them to the most invisible/private "place" of the organisational setting.
Resumo:
Introduction & Objectives: Several factors may influence the decision to pursue nonsurgical modalities for the treatment of non-melanoma skin cancer. Topical photodynamic therapy (PDT) is a non-invasive alternative treatment reported to have high efficacy when using standardized protocols in Bowen's disease (BD), superficial basal cell carcinoma (BCC) and thin nodular BCC. However, long-term recurrence studies are lacking. The aim of this study was to evaluate the long-term efficacy of PDT with topical methylaminolevulinate (MAL) for the treatment of BD and BCC in a dermato-oncology department. Materials & Methods: All patients with a diagnosis of BD or BCC treated with MAL-PDT from 2004 to 2008 were enrolled. The treatment protocol included two MAL-PDT sessions one week apart, repeated at three months in case of incomplete response, using a red light dose of 37-40 J/cm2 and an exposure time of 8'20''. Clinical records were retrospectively reviewed, and data regarding age, sex, tumour location, size, treatment outcomes and recurrence were registered. Descriptive analysis was performed using chi-square tests, followed by survival analysis with the Kaplan-Meier and Cox regression models. Results: Sixty-eight patients (median age 71.0 years, P25;P75=30;92) with a total of 78 tumours (31 BD, 45 superficial BCC, 2 nodular BCC) and a median tumour size of 5 cm2 were treated. Overall, the median follow-up period was 43.5 months (P25;P75=0;100), and a total recurrence rate of 33.8% was observed (24.4% for BCC vs. 45.2% for BD). Estimated recurrence rates for BCC and BD were 5.0% vs. 7.4% at 6 months, 23.4% vs. 27.9% at 12 months, and 30.0% vs. 72.4% at 60 months. Both age and diagnosis were independent prognostic factors for recurrence, with significantly higher estimated recurrence rates in patients with BD (p=0.0036) or younger than 58 years old (p=0.039).
The risk of recurrence (hazard ratio) was 2.4 times higher in patients with BD compared to superficial BCC (95% CI: 1.1-5.3; p=0.033), and 2.8 times higher in patients younger than 58 years old (95% CI: 1.2-6.5; p=0.02). Conclusions: In the studied population, estimated recurrence rates are higher than those expected from the available literature, possibly due to a longer follow-up period. To the authors' knowledge, there is only one other study with a similar follow-up period, and it concerns BCC only. BD, as an in situ squamous cell carcinoma, has a higher tendency to recur than superficial BCC. Despite better cosmesis, PDT might not be the best treatment option for young patients, considering their higher risk of recurrence.
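The estimated recurrence rates reported above come from the Kaplan-Meier product-limit estimator, S(t) = Π (1 - d_i / n_i) over event times t_i ≤ t, where d_i recurrences occur among n_i tumours still at risk. A minimal sketch on synthetic follow-up data (not the study's records):

```python
def kaplan_meier(times, events):
    """Return [(time, survival)] at each recurrence time.
    events[i] is True for a recurrence, False for censoring at times[i]."""
    at_risk = len(times)
    survival, curve = 1.0, []
    for t, event in sorted(zip(times, events)):
        if event:
            survival *= 1.0 - 1.0 / at_risk
            curve.append((t, round(survival, 4)))
        at_risk -= 1   # both events and censorings leave the risk set
    return curve

# Hypothetical follow-up (months): recurrences at 6 and 12, censoring at 24 and 60.
curve = kaplan_meier([6, 12, 24, 60], [True, True, False, False])
print(curve)   # [(6, 0.75), (12, 0.5)]
```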