970 results for mixing of states


Relevance:

80.00%

Publisher:

Abstract:

This article presents the view that social and institutional elites hold of Brazil's subnational Courts of Accounts (Tribunais de Contas, TCs). Carried out within the process of diagnosis and reform of the TCs, the Program for the Modernization of the External Control System of the States and the Federal District (Promoex), this research reveals the problems that affect the administrative performance and institutional legitimacy of these oversight bodies, as well as their strengths, which can be used as a driver of their reform. Based on the interpretation of the interviewed actors' opinions, the final analysis seeks to reveal which paths can be taken to modernize the subnational Courts of Accounts.

Relevance:

80.00%

Publisher:

Abstract:

This research sought to trace and problematize the crossings perceived in encounters with professionals of the Creas (Centro de Referência Especializada da Assistência Social) who work with open-regime socio-educational measures, or Assisted Freedom (Liberdade Assistida). Through dialogues conducted in individual and group interviews, the aim was to locate and intensify entanglements, the lines present in the different facets and moments this field presents. From this starting point, the challenge was to draw maps that give body to resonances with different times and territories. With the help of authors such as Foucault, Costa and Agamben, it was found that tangles of questions intertwine with several other areas and spheres of action, that is, areas not specific to this field but involving other professionals, programs and proposals of state services. All of this emerged in the technicians' accounts of their daily actions, prioritizing encounters with adolescents serving Assisted Freedom and their families, combined with concepts such as governmentality and biopolitics, which allowed crystallized points of view to be displaced and the practices and discourses in place to be problematized. To build a critical view of this field, a brief historical survey was also made of how concepts such as delinquency and family are used in governmental strategies for intervening in the population. The construction of certain standards of normality in childcare throughout the history of Brazil's social-assistance strategies has emphasized relations of inequality in contemporary services and practices.
All of this also made it possible to question how illegalities participate in state relations in an increasingly consolidated way, both affirming practices that produce imprisonment and enabling the perception of different kinds of relations that produce other ways of living Assisted Freedom under discourses and practices of surveillance and control.

Relevance:

80.00%

Publisher:

Abstract:

This study aimed to evaluate the preservative potential of extractives from teak (Tectona grandis) heartwood and their capacity to change the color of light-colored woods. To this end, residues generated in the mechanical processing of 20-year-old teak heartwood were collected and used for extractions. To assess the influence of teak extractives on the color and natural durability of wood, sapwood from 10-year-old teak was used, along with Pinus sp. wood, chosen for its light color and low natural durability. Extractions were carried out in hot water and in absolute ethanol. To determine the concentration of the treatment solutions, a toxicity assay against the fungus Postia placenta was performed. Once the concentration was defined, the extracted solutions were prepared for impregnation. In addition, a third solution was used, composed of the combination of the hot-water and absolute-ethanol extracts. For each solution tested, treatment was performed by the full-cell (Bethell) method. To test the efficiency of the solutions prepared with teak extractives, colorimetric readings and biological assays with wood-decaying fungi and xylophagous termites were carried out. The combination of the tested extractives darkened the wood and reduced color non-uniformity, bringing the treated woods closer to the color of heartwood than to the untreated samples of the respective species. The extractive solution obtained in absolute ethanol and the combination of the hot-water and absolute-ethanol extracts gave the best results for the durability of the treated wood against fungi and xylophagous termites, significantly changing the durability class of the treated species.

Relevance:

80.00%

Publisher:

Abstract:

Over the last fifty years, Brazil has undergone a rapid process of structural transformation, following the first stage of industrial development in the 1930s. The country now belongs to the small group of countries that evolved from an initially peripheral and subordinate insertion, dating back to the nineteenth century, into the most dynamic segment of the semiperiphery. This intermediate category, between "maturity" and "backwardness" according to Modernization theorists, or between "center" and "periphery" as Dependency theorists argue, has seen considerable progress toward the group of states that dominate the current world system. During the years 2003-2010, foreign policy, along with the formulation of a new regionalism as a strategy of global integration and a new ideal model of State, has been a key factor

Relevance:

80.00%

Publisher:

Abstract:

Nowadays, with the use of technology and the Internet, education is undergoing significant changes, contemplating new ways of teaching and learning. One of the teaching methods widely used to promote knowledge is the use of virtual environments available in various formats, such as online teaching-learning platforms. Internet access and the use of laptops have created the technological conditions for teachers and students to benefit from the diversity of online information, communication, collaboration and sharing with others. The integration of Internet services into teaching practices can provide thematic, social and digital enrichment for the agents involved. In this paper we discuss the advantages of LMS (Learning Management Systems) such as Moodle in supporting face-to-face lectures in higher education. We also analyse their implications for student support and online interaction, leading educational agents to a mixing of different learning environments, where they can combine face-to-face instruction with computer-mediated instruction (blended learning), which increases the options for better quality and quantity of human interaction in a learning environment. We also present some tools traditionally used in online assessment that are part of the functionalities of Moodle. These tools can provide interesting alternatives to promote more meaningful learning and contribute to the development of flexible and customized evaluation models that we want to be more efficient.

Relevance:

80.00%

Publisher:

Abstract:

Our society relies on energy for most of its activities. One application domain that weighs heavily on the energy budget is energy consumption in residential and non-residential buildings. The ever increasing need for energy, resulting from the industrialization of developing countries and from the limited scalability of traditional technologies for energy production, raises both problems and opportunities. The problems are related to the devastating effects of the greenhouse gases produced by the burning of oil and gas for energy production, and to the dependence of whole countries on companies providing gas and oil. The opportunities are mostly technological, since novel markets are opening both for energy production via renewable sources and for innovations that can rationalize energy usage. An enticing research effort is the mixing of these two aspects, leveraging ICT technologies to rationalize energy production, acquisition, and consumption. The ENCOURAGE project aims to develop embedded intelligence and integration technologies that will directly optimize energy use in buildings and enable active participation in the future smart grid environment. The primary application domains targeted by the ENCOURAGE project are non-residential buildings (e.g., campuses) and residential buildings (e.g., neighborhoods). The goal of the project is to achieve 20% energy savings through improved interoperability between various types of energy generation, consumption and storage devices; interbuilding energy exchange; and systematic performance monitoring.

Relevance:

80.00%

Publisher:

Abstract:

Wastewater treatment is a matter of the utmost importance for the municipality of Póvoa de Varzim, not only for reasons of public health and environmental conservation but also because of the municipality's tourism, with six beaches along its coastline that have been awarded blue flags for their quality. The municipality of Póvoa de Varzim comprises twelve parishes and has fifteen wastewater treatment plants (WWTPs), fourteen of them compact units. Their control is ensured by the basic sanitation division of the Póvoa de Varzim city council. The objective of this work was to diagnose the operation of the municipality's WWTPs in order to identify existing problems and resolve or optimize them. To identify the operating principle of each treatment plant and the presence of anomalies, several visits were made to each one over the internship period. Samples for analysis of the various parameters were collected by an employee and sent to a laboratory partnered with the city council. After extensive on-site data collection and an exhaustive study of all the documentation associated with each WWTP, it was concluded that only four of them had relevant problems. The WWTPs of the Laúndos industrial park and the historic center of Rates receive very high inflows due to occasional discharges from tanker trucks, which makes treatment ineffective. As a solution, the construction of an equalization tank at both WWTPs was suggested, equipped with a stirrer and a flow regulator, so as to guarantee, respectively, the mixing and homogenization of the domestic and industrial wastewater and that only the appropriate flow is pumped for treatment. The Incondave and Fontaínhas WWTPs mainly show equipment anomalies, which lead to poor plant performance.
Repair of the damaged equipment was recommended, along with more frequent inspection of the facilities so that, as soon as a fault occurs, it can be repaired as quickly as possible. The internship at the Póvoa de Varzim city council (CMPV) lasted 10 months, between October and July of 2012, and was carried out within the dissertation/internship course of the master's in environmental protection technologies at the Instituto Superior de Engenharia do Porto. This internship was an asset for me, as I was able to consolidate the knowledge acquired throughout my academic career and get to know the reality of the job market.

Relevance:

80.00%

Publisher:

Abstract:

The Peniche section has revealed moderately-to-well preserved calcareous nannofossil assemblages across the Pliensbachian/Toarcian boundary. This good record has allowed a refined biostratigraphic scheme to be proposed. The stage boundary, as defined by ammonites, falls within the NJ5b C. impontus (NW Europe; BOWN & COOPER, 1998) or the NJT5b L. sigillatus (Mediterranean Tethys; MATTIOLI & ERBA, 1999) nannofossil subzones. Since a mixing of N- and S-Tethyan taxa is observed in the Lusitanian Basin, both biozonation schemes can be applied. Some nannofossil events (mainly first occurrences) are observed earlier in Portugal than in other Tethyan settings. It is still unclear whether these events are real first occurrences. A diversification phase occurred across the Pliensbachian/Toarcian boundary. This phase is well recorded at Peniche, where assemblages shift from murolith-dominated in the Pliensbachian to placolith-rich in the Toarcian. A quantification of nannofossils per gram of rock shows that absolute abundances are highest across the Pliensbachian/Toarcian boundary. Indeed, Peniche exhibits very high nannofossil abundances with respect to correlative levels in other Tethyan settings. The pelagic carbonate fraction (produced by nannofossils) is important in the marly hemi-couplets of Peniche. In some levels, nannofossils account for more than 50% of the total carbonate fraction.

Relevance:

80.00%

Publisher:

Abstract:

The development of high spatial resolution airborne and spaceborne sensors has improved the capability of ground-based data collection in the fields of agriculture, geography, geology, mineral identification, detection [2, 3], and classification [4–8]. The signal read by the sensor from a given spatial element of resolution and at a given spectral band is a mixing of components originating from the constituent substances, termed endmembers, located at that element of resolution. This chapter addresses hyperspectral unmixing, which is the decomposition of the pixel spectra into a collection of constituent spectra, or spectral signatures, and their corresponding fractional abundances indicating the proportion of each endmember present in the pixel [9, 10]. Depending on the mixing scales at each pixel, the observed mixture is either linear or nonlinear [11, 12]. The linear mixing model holds when the mixing scale is macroscopic [13]. The nonlinear model holds when the mixing scale is microscopic (i.e., intimate mixtures) [14, 15]. The linear model assumes negligible interaction among distinct endmembers [16, 17]. The nonlinear model assumes that incident solar radiation is scattered by the scene through multiple bounces involving several endmembers [18]. Under the linear mixing model and assuming that the number of endmembers and their spectral signatures are known, hyperspectral unmixing is a linear problem, which can be addressed, for example, under the maximum likelihood setup [19], the constrained least-squares approach [20], the spectral signature matching [21], the spectral angle mapper [22], and the subspace projection methods [20, 23, 24]. Orthogonal subspace projection [23] reduces the data dimensionality, suppresses undesired spectral signatures, and detects the presence of a spectral signature of interest. The basic concept is to project each pixel onto a subspace that is orthogonal to the undesired signatures.
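The linear mixing model and the constrained least-squares idea mentioned above can be sketched numerically. The following is a minimal illustration, not the chapter's actual method: a hypothetical endmember matrix is mixed linearly and the abundances are recovered by nonnegative least squares; all sizes and values are made up.

```python
import numpy as np
from scipy.optimize import nnls

rng = np.random.default_rng(0)

# Hypothetical endmember signatures: 3 endmembers over 50 spectral bands.
M = rng.uniform(0.1, 1.0, size=(50, 3))

# True abundances for one pixel (nonnegative, summing to one).
a_true = np.array([0.6, 0.3, 0.1])

# Observed pixel under the linear mixing model, plus small sensor noise.
y = M @ a_true + 0.001 * rng.standard_normal(50)

# Nonnegative least-squares estimate of the abundance fractions.
a_hat, _ = nnls(M, y)
print(a_hat)
```

The sum-to-one constraint can additionally be enforced, in the usual fully constrained least-squares fashion, by appending a weighted row of ones to `M` and the value 1 to `y`.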
As shown in Settle [19], the orthogonal subspace projection technique is equivalent to the maximum likelihood estimator. This projection technique was extended by three unconstrained least-squares approaches [24] (signature space orthogonal projection, oblique subspace projection, target signature space orthogonal projection). Methods using the maximum a posteriori probability (MAP) framework [25] and projection pursuit [26, 27] have also been applied to hyperspectral data. In most cases the number of endmembers and their signatures are not known. Independent component analysis (ICA) is an unsupervised source separation process that has been applied with success to blind source separation, to feature extraction, and to unsupervised recognition [28, 29]. ICA consists of finding a linear decomposition of observed data yielding statistically independent components. Given that hyperspectral data are, in given circumstances, linear mixtures, ICA comes to mind as a possible tool to unmix this class of data. In fact, the application of ICA to hyperspectral data has been proposed in reference 30, where endmember signatures are treated as sources and the mixing matrix is composed of the abundance fractions, and in references 9, 25, and 31–38, where the sources are the abundance fractions of each endmember. In the first approach, we face two problems: (1) the number of samples is limited to the number of channels and (2) the process of selecting the pixels that play the role of mixed sources is not straightforward. In the second approach, ICA is based on the assumption of mutually independent sources, which is not the case for hyperspectral data, since the sum of the abundance fractions is constant, implying dependence among the abundances. This dependence compromises ICA applicability to hyperspectral images. In addition, hyperspectral data are immersed in noise, which degrades ICA performance.
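The sum-to-one dependence that undermines the ICA assumption can be checked directly. A small sketch with illustrative numbers (not taken from the chapter): abundance fractions drawn from a symmetric Dirichlet distribution are pairwise negatively correlated, so they cannot be mutually independent.

```python
import numpy as np

rng = np.random.default_rng(1)

# Abundance fractions of 3 endmembers over many pixels, drawn from a
# symmetric Dirichlet so each pixel's fractions are nonnegative and sum to one.
A = rng.dirichlet(np.ones(3), size=10000)

# The sum-to-one constraint forces negative correlation between fractions,
# violating the mutual-independence assumption underlying ICA.
corr = np.corrcoef(A, rowvar=False)
print(corr[0, 1])  # about -0.5 for a symmetric Dirichlet with 3 components
```

For a symmetric Dirichlet with K components the pairwise correlation is exactly -1/(K-1), which is why the estimate hovers near -0.5 here.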
IFA [39] was introduced as a method for recovering independent hidden sources from their observed noisy mixtures. IFA implements two steps. First, source densities and noise covariance are estimated from the observed data by maximum likelihood. Second, sources are reconstructed by an optimal nonlinear estimator. Although IFA is a well-suited technique to unmix independent sources under noisy observations, the dependence among abundance fractions in hyperspectral imagery compromises, as in the ICA case, the IFA performance. Considering the linear mixing model, hyperspectral observations lie in a simplex whose vertices correspond to the endmembers. Several approaches [40–43] have exploited this geometric feature of hyperspectral mixtures [42]. The minimum volume transform (MVT) algorithm [43] determines the simplex of minimum volume containing the data. The MVT-type approaches are complex from the computational point of view. Usually, these algorithms first find the convex hull defined by the observed data and then fit a minimum volume simplex to it. Aiming at a lower computational complexity, some algorithms such as vertex component analysis (VCA) [44], the pixel purity index (PPI) [42], and N-FINDR [45] still find the minimum volume simplex containing the data cloud, but they assume the presence in the data of at least one pure pixel of each endmember. This is a strong requisite that may not hold in some data sets. In any case, these algorithms find the set of most pure pixels in the data. Hyperspectral sensors collect spatial images over many narrow contiguous bands, yielding large amounts of data. For this reason, very often, the processing of hyperspectral data, including unmixing, is preceded by a dimensionality reduction step to reduce computational complexity and to improve the signal-to-noise ratio (SNR).
Principal component analysis (PCA) [46], maximum noise fraction (MNF) [47], and singular value decomposition (SVD) [48] are three well-known projection techniques widely used in remote sensing in general and in unmixing in particular. The newly introduced method [49] exploits the structure of hyperspectral mixtures, namely the fact that spectral vectors are nonnegative. The computational complexity associated with these techniques is an obstacle to real-time implementations. To overcome this problem, band selection [50] and non-statistical [51] algorithms have been introduced. This chapter addresses hyperspectral data source dependence and its impact on ICA and IFA performances. The study considers simulated and real data and is based on mutual information minimization. Hyperspectral observations are described by a generative model. This model takes into account the degradation mechanisms normally found in hyperspectral applications, namely signature variability [52–54], abundance constraints, topography modulation, and system noise. The computation of mutual information is based on fitting mixtures of Gaussians (MOG) to the data. The MOG parameters (number of components, means, covariances, and weights) are inferred using a minimum description length (MDL) based algorithm [55]. We study the behavior of the mutual information as a function of the unmixing matrix. The conclusion is that the unmixing matrix minimizing the mutual information might be very far from the true one. Nevertheless, some abundance fractions might be well separated, mainly in the presence of strong signature variability, a large number of endmembers, and high SNR. We end this chapter by sketching a new methodology to blindly unmix hyperspectral data, where abundance fractions are modeled as a mixture of Dirichlet sources. This model enforces positivity and constant-sum (full additivity) constraints on the sources. The mixing matrix is inferred by an expectation-maximization (EM)-type algorithm.
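The role of the dimensionality-reduction step can be illustrated with PCA on synthetic mixtures (all sizes and the noise level are assumptions for the sketch): after mean removal, noisy linear mixtures of p endmembers produce p-1 dominant covariance eigenvalues, with the remaining eigenvalues at the noise floor.

```python
import numpy as np

rng = np.random.default_rng(2)

bands, pixels, p = 50, 2000, 3  # p = assumed number of endmembers

# Hypothetical data: linear mixtures of p endmembers plus white noise.
M = rng.uniform(0.1, 1.0, size=(bands, p))
A = rng.dirichlet(np.ones(p), size=pixels).T          # p x pixels abundances
Y = M @ A + 0.01 * rng.standard_normal((bands, pixels))

# PCA: eigendecomposition of the sample covariance; the signal occupies
# the subspace spanned by the leading eigenvectors.
Yc = Y - Y.mean(axis=1, keepdims=True)
eigvals = np.linalg.eigh(Yc @ Yc.T / pixels)[0][::-1]  # descending order

# After centering, the sum-to-one data lie in a (p-1)-dimensional affine
# subspace, so the first p-1 eigenvalues dominate; the rest are noise-level.
print(eigvals[:5])
```

This gap between signal and noise eigenvalues is what eigenvalue-based subspace identification methods exploit to estimate the number of endmembers.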
This approach is in the vein of references 39 and 56, replacing the independent sources represented by MOG with a mixture of Dirichlet sources. Compared with the geometric-based approaches, the advantage of this model is that there is no need for pure pixels in the observations. The chapter is organized as follows. Section 6.2 presents a spectral radiance model and formulates spectral unmixing as a linear problem accounting for abundance constraints, signature variability, topography modulation, and system noise. Section 6.3 presents a brief overview of the ICA and IFA algorithms. Section 6.4 illustrates the performance of IFA and of some well-known ICA algorithms with experimental data. Section 6.5 studies the ICA and IFA limitations in unmixing hyperspectral data. Section 6.6 presents results of ICA based on real data. Section 6.7 describes the new blind unmixing scheme and some illustrative examples. Section 6.8 concludes with some remarks.

Relevance:

80.00%

Publisher:

Abstract:

Hyperspectral remote sensing exploits the electromagnetic scattering patterns of the different materials at specific wavelengths [2, 3]. Hyperspectral sensors have been developed to sample the scattered portion of the electromagnetic spectrum extending from the visible region through the near-infrared and mid-infrared, in hundreds of narrow contiguous bands [4, 5]. The number and variety of potential civilian and military applications of hyperspectral remote sensing is enormous [6, 7]. Very often, the resolution cell corresponding to a single pixel in an image contains several substances (endmembers) [4]. In this situation, the scattered energy is a mixing of the endmember spectra. A challenging task underlying many hyperspectral imagery applications is then decomposing a mixed pixel into a collection of reflectance spectra, called endmember signatures, and the corresponding abundance fractions [8–10]. Depending on the mixing scales at each pixel, the observed mixture is either linear or nonlinear [11, 12]. The linear mixing model holds approximately when the mixing scale is macroscopic [13] and there is negligible interaction among distinct endmembers [3, 14]. If, however, the mixing scale is microscopic (intimate mixtures) [15, 16] and the incident solar radiation is scattered by the scene through multiple bounces involving several endmembers [17], the linear model is no longer accurate. Linear spectral unmixing has been intensively researched in recent years [9, 10, 12, 18–21]. It considers that a mixed pixel is a linear combination of endmember signatures weighted by the corresponding abundance fractions.
Under this model, and assuming that the number of substances and their reflectance spectra are known, hyperspectral unmixing is a linear problem for which many solutions have been proposed (e.g., maximum likelihood estimation [8], spectral signature matching [22], spectral angle mapper [23], subspace projection methods [24,25], and constrained least squares [26]). In most cases, the number of substances and their reflectances are not known and, then, hyperspectral unmixing falls into the class of blind source separation problems [27]. Independent component analysis (ICA) has recently been proposed as a tool to blindly unmix hyperspectral data [28–31]. ICA is based on the assumption of mutually independent sources (abundance fractions), which is not the case of hyperspectral data, since the sum of abundance fractions is constant, implying statistical dependence among them. This dependence compromises ICA applicability to hyperspectral images as shown in Refs. [21, 32]. In fact, ICA finds the endmember signatures by multiplying the spectral vectors with an unmixing matrix, which minimizes the mutual information among sources. If sources are independent, ICA provides the correct unmixing, since the minimum of the mutual information is obtained only when sources are independent. This is no longer true for dependent abundance fractions. Nevertheless, some endmembers may be approximately unmixed. These aspects are addressed in Ref. [33]. Under the linear mixing model, the observations from a scene are in a simplex whose vertices correspond to the endmembers. Several approaches [34–36] have exploited this geometric feature of hyperspectral mixtures [35]. Minimum volume transform (MVT) algorithm [36] determines the simplex of minimum volume containing the data. The method presented in Ref. [37] is also of MVT type but, by introducing the notion of bundles, it takes into account the endmember variability usually present in hyperspectral mixtures. 
The MVT-type approaches are complex from the computational point of view. Usually, these algorithms first find the convex hull defined by the observed data and then fit a minimum volume simplex to it. For example, the gift wrapping algorithm [38] computes the convex hull of n data points in a d-dimensional space with a computational complexity of O(n^(⌊d/2⌋+1)), where ⌊x⌋ is the largest integer less than or equal to x and n is the number of samples. The complexity of the method presented in Ref. [37] is even higher, since the temperature of the simulated annealing algorithm used must follow a logarithmic law [39] to assure convergence (in probability) to the desired solution. Aiming at a lower computational complexity, some algorithms such as the pixel purity index (PPI) [35] and N-FINDR [40] still find the minimum volume simplex containing the data cloud, but they assume the presence of at least one pure pixel of each endmember in the data. This is a strong requisite that may not hold in some data sets. In any case, these algorithms find the set of most pure pixels in the data. The PPI algorithm uses the minimum noise fraction (MNF) [41] as a preprocessing step to reduce dimensionality and to improve the signal-to-noise ratio (SNR). The algorithm then projects every spectral vector onto skewers (a large number of random vectors) [35, 42, 43]. The points corresponding to extremes, for each skewer direction, are stored. A cumulative account records the number of times each pixel (i.e., a given spectral vector) is found to be an extreme. The pixels with the highest scores are the purest ones. The N-FINDR algorithm [40] is based on the fact that in p spectral dimensions, the p-volume defined by a simplex formed by the purest pixels is larger than any other volume defined by any other combination of pixels. This algorithm finds the set of pixels defining the largest volume by inflating a simplex inside the data.
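The PPI scoring loop just described can be sketched in a few lines. This is a toy illustration with made-up data sizes, with pure pixels deliberately appended so the expected extremes are known; it omits the MNF preprocessing step.

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical data: 2000 mixed pixels in a 10-band space, built from
# 4 endmembers; one pure pixel per endmember is appended at the end.
bands, p = 10, 4
M = rng.uniform(0.1, 1.0, size=(bands, p))
A = rng.dirichlet(np.ones(p), size=2000).T
Y = np.hstack([M @ A, M])          # columns 2000..2003 are the pure pixels

# PPI core loop: project every pixel onto random "skewers" and count how
# often each pixel is an extreme of a projection.
scores = np.zeros(Y.shape[1], dtype=int)
for _ in range(500):
    skewer = rng.standard_normal(bands)
    proj = skewer @ Y
    scores[proj.argmax()] += 1
    scores[proj.argmin()] += 1

# The purest pixels accumulate the highest scores; the mixed pixels,
# being strict convex combinations of the vertices, are never extremes.
top = np.argsort(scores)[::-1][:p]
print(sorted(top))
```

Because every projection of a convex combination lies strictly between the extreme vertex projections, all the score mass concentrates on the appended pure pixels in this synthetic setting.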
ORASIS [44, 45] is a hyperspectral framework developed by the U.S. Naval Research Laboratory consisting of several algorithms organized in modules: exemplar selector, adaptive learner, demixer, knowledge base or spectral library, and spatial postprocessor. The first step consists in flat-fielding the spectra. Next, the exemplar selection module is used to select spectral vectors that best represent the smaller convex cone containing the data. The other pixels are rejected when the spectral angle distance (SAD) is less than a given threshold. The procedure finds the basis for a subspace of a lower dimension using a modified Gram–Schmidt orthogonalization. The selected vectors are then projected onto this subspace and a simplex is found by an MVT process. ORASIS is oriented to real-time target detection from uncrewed air vehicles using hyperspectral data [46]. In this chapter we develop a new algorithm to unmix linear mixtures of endmember spectra. First, the algorithm determines the number of endmembers and the signal subspace using a newly developed concept [47, 48]. Second, the algorithm extracts the most pure pixels present in the data. Unlike other methods, this algorithm is completely automatic and unsupervised. To estimate the number of endmembers and the signal subspace in hyperspectral linear mixtures, the proposed scheme begins by estimating the signal and noise correlation matrices. The latter is based on multiple regression theory. The signal subspace is then identified by selecting the set of signal eigenvalues that best represents the data, in the least-squares sense [48, 49]. We note, however, that VCA works with projected and with unprojected data. The extraction of the endmembers exploits two facts: (1) the endmembers are the vertices of a simplex and (2) the affine transformation of a simplex is also a simplex. Like the PPI and N-FINDR algorithms, VCA also assumes the presence of pure pixels in the data.
The algorithm iteratively projects the data onto a direction orthogonal to the subspace spanned by the endmembers already determined. The new endmember signature corresponds to the extreme of the projection. The algorithm iterates until all endmembers are exhausted. VCA performs much better than PPI and better than or comparable to N-FINDR; yet it has a computational complexity between one and two orders of magnitude lower than N-FINDR. The chapter is structured as follows. Section 19.2 describes the fundamentals of the proposed method. Section 19.3 and Section 19.4 evaluate the proposed algorithm using simulated and real data, respectively. Section 19.5 presents some concluding remarks.

Relevance:

80.00%

Publisher:

Abstract:

Dissertation submitted to obtain the Doctoral Degree in Sustainable Chemistry

Relevance:

80.00%

Publisher:

Abstract:

This essay presents the European Arrest Warrant and its relationship with the principle of double criminality, which was abolished in 2002 with the new Framework Decision (FD). This instrument was essential to implement the principle of mutual recognition and to strengthen police and judicial cooperation in criminal matters in the newly created area of freedom, security and justice. It was urgent to create mechanisms to combat cross-border crime, which States alone have struggled to counter. An analysis of FD No 2002/584/JHA is made. The execution of warrants and the mandatory and optional grounds for refusal are studied in detail, as is the implementation issue. The role of mutual recognition in practice is studied as well. The procedure is to introduce the principle of double criminality, to explain the concept and its abolition, and to warn of the consequences deriving from it, related to the principle of legality and fundamental rights. The analysis of the European Arrest Warrant in practice in Portugal, and in comparison with other Member States, allows the consequences of the abolition of dual criminality and the position of States on this measure to be assessed. With the abolition of double criminality, cooperation in judicial and criminal matters departs from what was intended by the European Council of Tampere. And without cooperation, the fundamental rights of citizens are unprotected, so States have to adopt measures to remedy the "failures" of European Law.

Relevância:

80.00% 80.00%

Publicador:

Resumo:

Along with food and comfort, safety has always been one of humanity's priorities. In pursuit of this objective, man developed self-preservation mechanisms, went to live in society and created rules to govern community life. In the West, in the late eighteenth century, with the creation of states as we know them today, the monopoly of security, among other powers, was preserved untouched until the last quarter of the twentieth century. With the bankruptcy of the welfare state and the rise of the regulatory state, many tasks essential to the community have also come to be carried out by private companies or institutions, including education, health care and security. Although not without difficulty, education and health care have been more open to management by the private sector; the privatization of the security sector, by contrast, has met much more resistance. Still, especially in the West, states have delegated some security competences to private companies. Portugal is no exception to the rule and, after a few years of unregulated activity, the first law regulating private security was published in 1982. After the initial stages of development (evolution and maturation), which lasted until the early 2000s, private security now seems to have reached maturity. Today, under a new legal regime composed of Law no. 34/2013, of 16 May, its regulations and complementary legislation, private security encompasses other activities and competences, becoming an increasingly important complement to public safety. The prerequisites and control mechanisms for private security companies have also been increased, and the rules that limit their scope of activity strengthened.

Relevância:

80.00% 80.00%

Publicador:

Resumo:

A potentially renewable and sustainable source of energy is the chemical energy associated with the solvation of salts. Mixing two aqueous streams with different saline concentrations is spontaneous and releases energy. The theoretically obtainable power from salinity gradient energy due to the discharge of the world's rivers into the oceans has been estimated to lie in the range of 1.4-2.6 TW. Reverse electrodialysis (RED) is one of the emerging, membrane-based technologies for harvesting salinity gradient energy. A common RED stack is composed of alternately arranged cation- and anion-exchange membranes stacked between two electrodes. The compartments between the membranes are alternately fed with concentrated (e.g., sea water) and dilute (e.g., river water) saline solutions. Migration of the respective counter-ions through the membranes generates an ionic current between the electrodes, where an appropriate redox pair converts the chemical salinity gradient energy into electrical energy. Given the importance of new energy sources for power generation, the present study aims at better understanding and solving current challenges associated with RED stack design, fluid dynamics, ionic mass transfer and long-term RED stack performance with natural saline solutions as feedwaters. Chronopotentiometry was used to determine the diffusion boundary layer (DBL) thickness from diffusion relaxation data, and the flow entrance effects on mass transfer were found to enable an increase in power generation in RED stacks. Increasing the linear flow velocity also decreases the DBL thickness, but at the cost of a higher pressure drop. Pressure drop inside RED stacks was successfully simulated by the developed mathematical model, which includes the contribution of several pressure drops that until now had not been considered.
The effect of each pressure drop on the RED stack performance was identified and rationalized, and guidelines for planning and/or optimizing RED stacks were derived. The design of new profiled membranes, with a chevron corrugation structure, was proposed using computational fluid dynamics (CFD) modeling. The performance of the suggested corrugation geometry was compared with the existing ones, as well as with the use of conductive and non-conductive spacers. According to the estimations, the use of chevron structures yields the highest net power density values, offering the best compromise between the mass transfer coefficient and the pressure drop. Finally, long-term experiments with natural waters were performed, during which fouling occurred. For the first time, 2D fluorescence spectroscopy was used to monitor RED stack performance, with a dedicated focus on following fouling on ion-exchange membrane surfaces. To extract relevant information from the fluorescence spectra, parallel factor analysis (PARAFAC) was performed. Moreover, the information obtained was then used to predict net power density, stack electric resistance and pressure drop through multivariate statistical models based on projection to latent structures (PLS) modeling. The use in these models of 2D fluorescence data, containing hidden information about fouling on membrane surfaces that PARAFAC can extract, considerably improved the models' fit to the experimental data.
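The salinity-gradient energy that RED harvests can be illustrated with the ideal Gibbs energy of mixing. The sketch below is an assumption-laden back-of-the-envelope calculation, not part of the study above: it treats both streams as ideal, fully dissociated NaCl solutions (the constants and the `gibbs_mixing_energy` helper are illustrative choices):

```python
import math

R = 8.314           # J/(mol*K), gas constant
C_WATER = 55_500.0  # mol/m^3, approximate molar concentration of liquid water

def _g_ideal(c_salt, volume):
    """Sum of n_i * ln(x_i) over species (water + Na+ + Cl-) for an
    ideal solution of c_salt mol/m^3 NaCl in `volume` m^3."""
    n_ions = 2.0 * c_salt * volume        # Na+ and Cl- together
    n_water = C_WATER * volume
    n_total = n_ions + n_water
    x_ion = (c_salt * volume) / n_total   # mole fraction of each ion species
    x_water = n_water / n_total
    return 2.0 * c_salt * volume * math.log(x_ion) + n_water * math.log(x_water)

def gibbs_mixing_energy(c_conc, c_dil, v_conc, v_dil, T=298.15):
    """Ideal Gibbs energy (J) released on mixing a concentrated and a
    dilute NaCl stream: G(separated) - G(mixed) > 0 is available work."""
    v_mix = v_conc + v_dil
    c_mix = (c_conc * v_conc + c_dil * v_dil) / v_mix
    return R * T * (_g_ideal(c_conc, v_conc) + _g_ideal(c_dil, v_dil)
                    - _g_ideal(c_mix, v_mix))

# Mixing 1 m^3 of sea water (~0.5 M NaCl) with 1 m^3 of river water
# (~0.01 M NaCl) releases ideal work on the order of megajoules.
energy = gibbs_mixing_energy(500.0, 10.0, 1.0, 1.0)
```

Scaling an estimate of this kind by the global river discharge is what yields TW-scale figures such as the 1.4-2.6 TW range quoted above.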