Abstract:
This study reflects on the Portuguese Framework Law on Social Economy, highlighting, from a critical point of view, its contribution to the explicit institutional and legal recognition of the social economy sector. It does so by defining the concept of social economy and listing the entities engaged in this sector, by setting out its guiding principles and the mechanisms for its promotion and encouragement, and also by describing the creation of a tax and competition regime that takes its specificities into account. The establishment of these foundations of the social economy was based on the constitutional principle of protection of the social and cooperative sector, which substantiates the adoption of differentiating solutions aimed at the positive discrimination of this sector.
Abstract:
Economic development has always been connected to commercial exchanges between people, driven by the need to satisfy their wants. With the growth of international business and increasingly competitive and demanding markets, exporting has become an important first step towards internationalisation. Unlike in the past, companies must be aware that entering the current global market is risky and requires elaborate technical procedures. Internationalisation should not be treated as an isolated event of business management. The first part of this paper aims to understand the export process and place it within the current stage of international trade, keeping in mind the framework of exports under customs law. We then examine how Portuguese companies should approach this process in their internationalisation strategy, and what skills organisations must acquire to export competitively in the current scenario of globalisation. The investigation was based on interviews with companies that, through a process of internationalisation by exporting, have established themselves strongly in foreign markets. This allowed us to analyse the companies’ motivations for internationalising, as well as their selection criteria for export destinations. It was also possible to identify the main obstacles to the internationalisation of Portuguese companies. We concluded that companies that choose exporting as a route to internationalisation acquire specific skills that enable them to compete in international trade. However, the study was unable to answer the second initial question, on whether the measures implemented by Customs boost exports.
Abstract:
Catastrophic events, such as wars and terrorist attacks, tornadoes and hurricanes, earthquakes, tsunamis, floods and landslides, are always accompanied by a large number of casualties. The size distributions of these casualties have separately been shown to follow approximate power law (PL) distributions. In this paper, we analyze the statistical distributions of the number of victims of catastrophic phenomena, in particular terrorism, and find double PL behavior. This means that the data sets are better approximated by two PLs instead of a single one. We plot the PL parameters corresponding to several events and observe an interesting pattern in the charts, where the lines that connect each pair of points defining the double PLs are almost parallel to each other. A complementary data analysis is performed by computing the entropy. The results reveal relationships hidden in the data that may trigger a future comprehensive explanation of these phenomena.
Abstract:
Adhesively bonded joints are extensively used in several fields of engineering. Cohesive Zone Models (CZM) have been used for the strength prediction of adhesive joints, as an add-in to Finite Element (FE) analyses that allows the simulation of damage growth by consideration of energetic principles. A useful feature of CZM is that different shapes can be developed for the cohesive laws, depending on the nature of the material or interface to be simulated, allowing an accurate strength prediction. This work studies the influence of the CZM shape (triangular, exponential or trapezoidal) used to model a thin adhesive layer in single-lap adhesive joints, to estimate its influence on the strength prediction under different material conditions. From this study, guidelines are provided on the possibility of using a CZM shape that may not be the best suited for a particular adhesive, but that may be more straightforward to use/implement and have fewer convergence problems (e.g. the triangular CZM), thus attaining the solution faster. The overall results showed that joints bonded with ductile adhesives are highly influenced by the CZM shape, and that the trapezoidal shape fits the experimental data best. Moreover, the smaller the overlap length (LO), the greater the influence of the CZM shape. On the other hand, the influence of the CZM shape can be neglected when using brittle adhesives, without compromising too much the accuracy of the strength predictions.
Abstract:
This paper studies the information content of the chromosomes of twenty-three species. Several statistics considering different numbers of bases for alphabet character encoding are derived. Based on the resulting histograms, word delimiters and character relative frequencies are identified. This knowledge allows moving along each chromosome while evaluating the flow of characters and words. The resulting flux of information is captured by means of the Shannon entropy. The results are explored from the perspective of power law relationships, allowing a quantitative evaluation of the DNA of the species.
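The entropy evaluation mentioned above can be sketched minimally; this is just the standard Shannon formula over symbol frequencies, not the paper's full sliding-window analysis of chromosomes:

```python
from collections import Counter
import math

def shannon_entropy(seq):
    """Shannon entropy, in bits per symbol, of a symbol sequence."""
    counts = Counter(seq)
    n = len(seq)
    return -sum(c / n * math.log2(c / n) for c in counts.values())

print(shannon_entropy("ACGT" * 100))  # uniform 4-letter alphabet → 2.0
print(shannon_entropy("A" * 400))     # constant sequence: zero entropy
```

For a uniform alphabet of k symbols the entropy attains its maximum, log2(k) bits; deviations below that reflect the statistical structure the paper exploits.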
Abstract:
Power law (PL) and fractional calculus are two faces of phenomena with long-memory behavior. This paper applies the PL description to analyze different periods of the business cycle. For this purpose, the evolution over time of ten important stock market indices (DAX, Dow Jones, NASDAQ, Nikkei, NYSE, S&P 500, SSEC, HSI, TWII, and BSE) is studied. An evolutionary algorithm is used for the fitting of the PL parameters. It is observed that PL curve fitting constitutes a good tool for revealing the main characteristics of the signals leading to the emergence of the global financial dynamic evolution.
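For reference, PL parameters can also be recovered from data by a plain least-squares fit in log-log coordinates; this is a minimal stand-in for illustration, not the evolutionary algorithm the paper employs:

```python
import numpy as np

def fit_power_law(x, y):
    """Fit y ≈ c * x**alpha by linear least squares on log-log data.
    Assumes strictly positive x and y."""
    slope, intercept = np.polyfit(np.log(x), np.log(y), 1)
    return np.exp(intercept), slope  # (c, alpha)

# Sanity check on exact synthetic data y = 2 * x**(-1.5)
x = np.linspace(1.0, 100.0, 200)
c, alpha = fit_power_law(x, 2.0 * x ** -1.5)
print(round(c, 6), round(alpha, 6))  # → 2.0 -1.5
```

An evolutionary (or maximum-likelihood) fit is preferable on noisy, heavy-tailed data, where log-log least squares is known to be biased; the sketch only shows what the fitted parameters mean.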
Abstract:
The study analyzes the trend in the frequency of adults who drive under the influence of alcohol in major Brazilian cities after the passing of laws prohibiting drunk driving. Data from the Surveillance System for Risk and Protective Factors for Chronic Diseases by Telephone Survey (VIGITEL) between 2007 and 2013 were analyzed. The frequency of adults who drove after abusive alcohol consumption fell by 45.0% during this period (from 2.0% in 2007 to 1.1% in 2013). Significant reductions were observed between 2007 and 2008 (−0.5%) and between 2012 and 2013 (−0.5%), the years immediately after the publication of the laws prohibiting drunk driving. These improvements in the control of drunk driving show a change in the Brazilian population’s lifestyle.
Abstract:
Adhesive joints are widely employed nowadays as a fast and effective joining process. The respective strength-prediction techniques have also improved over the years. Cohesive Zone Models (CZMs) coupled with Finite Element Method (FEM) analyses surpass the limitations of stress and fracture criteria and allow damage to be modelled. CZMs require the energy release rates in tension (Gn) and shear (Gs) and the respective fracture energies in tension (Gnc) and shear (Gsc). Additionally, the cohesive strengths (tn0 for tension and ts0 for shear) must also be defined. In this work, the influence of the parameters of a triangular CZM used to model a thin adhesive layer is studied, to estimate their effect on the predictions. Some conclusions were drawn on the accuracy of the simulation results under variations of each one of these parameters.
Abstract:
The study of transient dynamical phenomena near bifurcation thresholds has attracted the interest of many researchers due to the relevance of bifurcations in different physical or biological systems. In the context of saddle-node bifurcations, where two or more fixed points collide and annihilate each other, it is known that the dynamics can suffer the so-called delayed transition. This phenomenon emerges when the system spends a long time before reaching the remaining stable equilibrium, found after the bifurcation, because of the presence of a saddle remnant in phase space. Some works have analytically tackled this phenomenon, especially in time-continuous dynamical systems, showing that the time delay τ scales according to an inverse square-root power law, τ ~ (μ − μ_c)^(−1/2), as the bifurcation parameter μ is driven further away from its critical value μ_c. In this work, we first characterize this scaling law analytically, using complex variable techniques, for a family of one-dimensional maps called the normal form of the saddle-node bifurcation. We then apply our general analytic results to a single-species ecological model with harvesting given by a unimodal map, characterizing the delayed transition and the scaling law arising due to the harvesting constant. For both analyzed systems, we show that the numerical results are in perfect agreement with the analytical solutions we provide. The procedure presented in this work can be used to characterize the scaling laws of one-dimensional discrete dynamical systems with saddle-node bifurcations.
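The inverse square-root law can be checked numerically on the saddle-node normal-form map x → x + x² + μ; the sketch below (the start and exit points are arbitrary illustrative choices, not the paper's setup) counts the iterations spent crossing the bottleneck near x = 0:

```python
def passage_time(mu, x0=-0.5, x_exit=1.0, max_iter=10**8):
    """Iterations the saddle-node normal-form map x -> x + x**2 + mu
    (mu > 0, just past the bifurcation) needs to escape the bottleneck
    near x = 0, starting from x0 and stopping once x exceeds x_exit."""
    x, n = x0, 0
    while x < x_exit and n < max_iter:
        x = x + x * x + mu
        n += 1
    return n

# tau ~ mu**(-1/2): shrinking mu by a factor of 100 should multiply
# the passage time by roughly 10.
t1, t2 = passage_time(1e-4), passage_time(1e-6)
print(t2 / t1)  # close to 10
```

The iteration count approximates the integral of dx/(x² + μ), which evaluates to approximately π/√μ for small μ, reproducing the (μ − μ_c)^(−1/2) scaling with μ_c = 0.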
Abstract:
XX Symposium of Brazilian Medicinal Plants & X International Congress of Ethnopharmacology. São Paulo, Brazil.
Abstract:
Hyperspectral remote sensing exploits the electromagnetic scattering patterns of the different materials at specific wavelengths [2, 3]. Hyperspectral sensors have been developed to sample the scattered portion of the electromagnetic spectrum extending from the visible region through the near-infrared and mid-infrared, in hundreds of narrow contiguous bands [4, 5]. The number and variety of potential civilian and military applications of hyperspectral remote sensing is enormous [6, 7]. Very often, the resolution cell corresponding to a single pixel in an image contains several substances (endmembers) [4]. In this situation, the scattered energy is a mixture of the endmember spectra. A challenging task underlying many hyperspectral imagery applications is then decomposing a mixed pixel into a collection of reflectance spectra, called endmember signatures, and the corresponding abundance fractions [8–10]. Depending on the mixing scales at each pixel, the observed mixture is either linear or nonlinear [11, 12]. The linear mixing model holds approximately when the mixing scale is macroscopic [13] and there is negligible interaction among distinct endmembers [3, 14]. If, however, the mixing scale is microscopic (intimate mixtures) [15, 16] and the incident solar radiation is scattered by the scene through multiple bounces involving several endmembers [17], the linear model is no longer accurate. Linear spectral unmixing has been intensively researched in recent years [9, 10, 12, 18–21]. It considers that a mixed pixel is a linear combination of endmember signatures weighted by the corresponding abundance fractions.
Under this model, and assuming that the number of substances and their reflectance spectra are known, hyperspectral unmixing is a linear problem for which many solutions have been proposed (e.g., maximum likelihood estimation [8], spectral signature matching [22], spectral angle mapper [23], subspace projection methods [24, 25], and constrained least squares [26]). In most cases, the number of substances and their reflectances are not known, and hyperspectral unmixing then falls into the class of blind source separation problems [27]. Independent component analysis (ICA) has recently been proposed as a tool to blindly unmix hyperspectral data [28–31]. ICA is based on the assumption of mutually independent sources (abundance fractions), which is not the case for hyperspectral data, since the sum of the abundance fractions is constant, implying statistical dependence among them. This dependence compromises the applicability of ICA to hyperspectral images, as shown in Refs. [21, 32]. In fact, ICA finds the endmember signatures by multiplying the spectral vectors with an unmixing matrix that minimizes the mutual information among sources. If the sources are independent, ICA provides the correct unmixing, since the minimum of the mutual information is obtained only when sources are independent. This is no longer true for dependent abundance fractions. Nevertheless, some endmembers may be approximately unmixed. These aspects are addressed in Ref. [33]. Under the linear mixing model, the observations from a scene lie in a simplex whose vertices correspond to the endmembers. Several approaches [34–36] have exploited this geometric feature of hyperspectral mixtures [35]. The minimum volume transform (MVT) algorithm [36] determines the simplex of minimum volume containing the data. The method presented in Ref. [37] is also of MVT type but, by introducing the notion of bundles, it takes into account the endmember variability usually present in hyperspectral mixtures.
The MVT-type approaches are complex from the computational point of view. Usually, these algorithms first find the convex hull defined by the observed data and then fit a minimum volume simplex to it. For example, the gift wrapping algorithm [38] computes the convex hull of n data points in a d-dimensional space with a computational complexity of O(n^(⌊d/2⌋+1)), where ⌊x⌋ is the largest integer less than or equal to x and n is the number of samples. The complexity of the method presented in Ref. [37] is even higher, since the temperature of the simulated annealing algorithm used must follow a logarithmic law [39] to assure convergence (in probability) to the desired solution. Aiming at a lower computational complexity, some algorithms such as the pixel purity index (PPI) [35] and N-FINDR [40] still find the minimum volume simplex containing the data cloud, but they assume the presence of at least one pure pixel of each endmember in the data. This is a strong requirement that may not hold in some data sets. In any case, these algorithms find the set of most pure pixels in the data. The PPI algorithm uses the minimum noise fraction (MNF) [41] as a preprocessing step to reduce dimensionality and to improve the signal-to-noise ratio (SNR). The algorithm then projects every spectral vector onto skewers (a large number of random vectors) [35, 42, 43]. The points corresponding to extremes, for each skewer direction, are stored. A cumulative account records the number of times each pixel (i.e., a given spectral vector) is found to be an extreme. The pixels with the highest scores are the purest ones. The N-FINDR algorithm [40] is based on the fact that in p spectral dimensions, the p-volume defined by a simplex formed by the purest pixels is larger than any other volume defined by any other combination of pixels. This algorithm finds the set of pixels defining the largest volume by inflating a simplex inside the data.
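The skewer-projection step of PPI described above can be sketched in a few lines; this toy version omits the MNF preprocessing and uses tiny hypothetical data, so it illustrates the counting idea only:

```python
import numpy as np

def ppi_scores(data, n_skewers=1000, seed=0):
    """Toy Pixel Purity Index: project every spectral vector onto random
    'skewers' and count how often each pixel is an extreme of a projection.
    data: (n_pixels, n_bands). (The MNF preprocessing step is omitted.)"""
    rng = np.random.default_rng(seed)
    skewers = rng.normal(size=(n_skewers, data.shape[1]))
    proj = data @ skewers.T                  # (n_pixels, n_skewers)
    extremes = np.concatenate([proj.argmax(axis=0), proj.argmin(axis=0)])
    scores = np.zeros(data.shape[0], dtype=int)
    np.add.at(scores, extremes, 1)           # cumulative extremity counts
    return scores

# Two pure endmembers (rows 0 and 1) and mixed pixels strictly between
# them: only the pure pixels can be extremes, so they take all the counts.
e1, e2 = np.array([1.0, 0.0, 0.0]), np.array([0.0, 1.0, 1.0])
mixes = np.array([a * e1 + (1 - a) * e2 for a in np.linspace(0.1, 0.9, 20)])
data = np.vstack([e1, e2, mixes])
scores = ppi_scores(data)
print(scores[:2].sum(), scores[2:].sum())  # → 2000 0
```

Because the maximum (and minimum) of a linear projection over a convex set is attained at a vertex, mixed pixels never win a skewer, which is exactly why the highest-scoring pixels are the purest ones.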
ORASIS [44, 45] is a hyperspectral framework developed by the U.S. Naval Research Laboratory, consisting of several algorithms organized in six modules: exemplar selector, adaptive learner, demixer, knowledge base or spectral library, and spatial postprocessor. The first step consists in flat-fielding the spectra. Next, the exemplar selection module is used to select spectral vectors that best represent the smaller convex cone containing the data. The other pixels are rejected when the spectral angle distance (SAD) is less than a given threshold. The procedure finds the basis for a subspace of lower dimension using a modified Gram–Schmidt orthogonalization. The selected vectors are then projected onto this subspace and a simplex is found by an MVT process. ORASIS is oriented to real-time target detection from uncrewed air vehicles using hyperspectral data [46]. In this chapter we develop a new algorithm to unmix linear mixtures of endmember spectra. First, the algorithm determines the number of endmembers and the signal subspace using a newly developed concept [47, 48]. Second, the algorithm extracts the most pure pixels present in the data. Unlike other methods, this algorithm is completely automatic and unsupervised. To estimate the number of endmembers and the signal subspace in hyperspectral linear mixtures, the proposed scheme begins by estimating the signal and noise correlation matrices. The latter is based on multiple regression theory. The signal subspace is then identified by selecting the set of signal eigenvalues that best represents the data in the least-squares sense [48, 49]. We note, however, that VCA works both with projected and with unprojected data. The extraction of the endmembers exploits two facts: (1) the endmembers are the vertices of a simplex and (2) the affine transformation of a simplex is also a simplex. Like the PPI and N-FINDR algorithms, VCA also assumes the presence of pure pixels in the data.
The algorithm iteratively projects the data onto a direction orthogonal to the subspace spanned by the endmembers already determined. The new endmember signature corresponds to the extreme of the projection. The algorithm iterates until all endmembers are exhausted. VCA performs much better than PPI and better than or comparably to N-FINDR; yet it has a computational complexity between one and two orders of magnitude lower than N-FINDR. The chapter is structured as follows. Section 19.2 describes the fundamentals of the proposed method. Section 19.3 and Section 19.4 evaluate the proposed algorithm using simulated and real data, respectively. Section 19.5 presents some concluding remarks.
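The project-onto-orthogonal-direction iteration described above can be sketched as follows; this is a much-simplified illustration under the pure-pixel assumption (random direction choice, no noise handling, no affine projection step), not the authors' full VCA algorithm:

```python
import numpy as np

def vca_like(data, p, seed=0):
    """Simplified VCA-style endmember extraction under the pure-pixel
    assumption: repeatedly project the data onto a direction orthogonal
    to the endmembers found so far and keep the pixel at the extreme.
    data: (n_pixels, n_bands); returns the indices of p candidate pixels."""
    rng = np.random.default_rng(seed)
    indices = []
    found = np.zeros((data.shape[1], 0))
    for _ in range(p):
        w = rng.normal(size=data.shape[1])
        if found.shape[1] > 0:
            q, _ = np.linalg.qr(found)      # orthonormal basis of span(found)
            w -= q @ (q.T @ w)              # remove component in that span
        proj = data @ w
        idx = int(np.argmax(np.abs(proj)))  # extreme of the projection
        indices.append(idx)
        found = np.column_stack([found, data[idx]])
    return indices

# Demo: three pure endmembers (rows 0-2) plus random mixtures of them.
rng = np.random.default_rng(1)
vertices = np.eye(3)
abund = rng.dirichlet([1.0, 1.0, 1.0], size=30)  # fractions summing to 1
data = np.vstack([vertices, abund @ vertices])
print(sorted(vca_like(data, 3)))  # recovers the pure pixels: [0, 1, 2]
```

Each projection extreme lands on a simplex vertex, and already-found vertices project to zero along the new direction, so each iteration picks a fresh endmember.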
Abstract:
Master's in Mechanical Engineering
Abstract:
The main purpose of this dissertation is to assess the energy performance and indoor air quality of the main building of the Parque Biológico de Vila Nova de Gaia (PBG). To this end, the study draws on the terms defined in the national legislation in force to date in this field, in particular the SCE, RSECE, RCCTE and RSECE-QAI. To assess the energy performance, an on-site audit was first carried out, followed by a detailed dynamic simulation, with the building modelled in the DesignBuilder software. After validation of the simulated model, by checking the deviation between the energy consumption recorded in the utility bills and that calculated in the simulation (equal to 5.97%), it was possible to break down consumption, in percentage terms, by type of use. It was also possible to determine the actual and nominal energy efficiency indices (IEE), corresponding to 29.9 and 41.3 kgoe/m².year respectively, which show that the building would be exempt from implementing an energy rationalisation plan (PRE) and that its energy class is C. Nevertheless, some energy-saving measures were put forward to improve the building's energy efficiency and reduce the associated bill. Two proposals stand out. The first proposes changing the building's indoor and outdoor lighting systems, leading to a reduction in electricity consumption of 47.5 MWh/year, with a payback period of 3.5 years. The second concerns changing the hot-water production system for the central heating, by adding a wood-fired boiler to the current system, which is expected to reduce natural gas consumption by 50 MWh, with a payback period of about 4 years.
In the indoor air quality (IAQ) analysis, the parameters quantified were those legally required, except for the microbiological ones. For the physical parameters, temperature and relative humidity, the mean results were 19.7 °C and 66.9% respectively, slightly below what the legislation prescribes (20.0 °C for the period in which the measurements were taken, winter). As for the chemical parameters, the mean values recorded for the concentrations of carbon dioxide (CO2), carbon monoxide (CO), ozone (O3), formaldehyde (HCHO), suspended particulate matter (PM10) and radon were 580 ppm, 0.2 ppm, 0.06 ppm, 0.01 ppm, 0.07 mg/m³ and 196 Bq/m³ respectively, all below the maximum reference values set in the regulation (984 ppm, 10.7 ppm, 0.10 ppm, 0.08 ppm, 0.15 mg/m³ and 400 Bq/m³). However, the parameter for volatile organic compounds (VOC) had a mean value of 0.84 ppm, well above the maximum reference value (0.26 ppm). In this case, a new series of measurements using chromatographic methods will have to be carried out to determine which pollutant agent(s) are involved, so that the emission sources can be eliminated or mitigated.
Abstract:
The authors' main purpose is to present ideas on defining Health Law, highlighting the particularities of the field as well as of the teaching of this legal branch, hoping to contribute to the maturity and academic recognition of Health Law, not only as a very rich legal field but also as a powerful social instrument in the fulfillment of fundamental human rights. The authors contend that Health Law has several characteristics that distinguish it from traditional branches of law, such as its complexity and multidisciplinary nature. The study of Health Law normally covers issues such as access to care, health systems organization, patients' rights, health professionals' rights and duties, strict liability, healthcare contracts between institutions and professionals, medical data protection and confidentiality, informed consent and professional secrecy, crossing different legal fields including administrative, antitrust, constitutional, contract, corporate, criminal, environmental, food and drug, intellectual property, insurance, international and supranational, labor/employment, property, taxation, and tort law. This is one of the reasons why teaching Health Law presents a challenge to the teacher, who will have to find the programs, content and methods appropriate to the profile of the recipients, who are normally non-jurists, and to the needs of a multidisciplinary curriculum. By describing academic definitions of Health Law as analogous to Edgewood, a fictional house that has a different architectural style on each of its walls, the authors try to identify the elements that should compose a more comprehensive definition.
In this article, Biolaw, Bioethics and Human Rights are presented as complements to a definition of Health Law: Biolaw, because it is the legal field that deals with the social consequences arising from technological advances in health and the life sciences; Bioethics, whose developments normally influence the shape of the legal framework of Health; and, finally, Human Rights theory and declarations, which have always been historically linked to medicine and health and form the umbrella that must cover all the issues raised in the area of Health Law. To complete this brief incursion into the definition of Health Law, the authors end by noting the complex relations between this field of Law and Public Health. Dealing more specifically with laws adopted by governments to provide important health services and to regulate industries and individual conduct that affect the health of populations, this aspect of Health Law requires special attention to avoid an imbalance between public powers and individual freedoms. The authors conclude that public trust in any health system is essentially sustained by developing health structures that are consistent with essential fundamental rights, such as the universal right of access to health care, and that the study of Health Law can contribute important insights into both health structures and fundamental rights, in order to foster a health system that respects the Rule of Law.
Abstract:
Power laws, also known as Pareto-like laws or Zipf-like laws, are commonly used to explain a variety of distinct real-world phenomena, often described merely by the signals they produce. In this paper, we study twelve cases, namely worldwide technological accidents, the annual revenue of America's largest private companies, the number of inhabitants in America's largest cities, the magnitude of earthquakes with minimum moment magnitude equal to 4, the total burned area in forest fires in Portugal, the net worth of the richest people in America, the frequency of occurrence of words in the novel Ulysses, by James Joyce, the total number of deaths in worldwide terrorist attacks, the number of linking root domains of the top internet domains, the number of linking root domains of the top internet pages, the total number of human victims of tornadoes in the U.S., and the number of inhabitants in the 60 most populated countries. The results demonstrate the emergence of statistical characteristics very close to a power law behavior. Furthermore, the parametric characterization reveals complex relationships present at a higher level of description.