987 results for evolved transforms
Abstract:
"What was silent will speak, what is closed will open and will take on a voice" (Paul Virilio). The fundamental problem in dealing with the digital is that we are forced to contend with a fundamental deconstruction of form, a deconstruction that renders our content and practice into a single state that can be openly and easily manipulated, reimagined and mashed together in rapid time to create completely unique artefacts and potentially unwranglable jumbles of data. Once our work is essentially broken down into this series of number sequences (or bytes), our sound, images, movies and documents, our memory files, we are left with nothing but choice... and this is the key concern. This absence of form transforms our work into new collections and poses unique challenges for the artist seeking opportunities to exploit the potential of digital deconstruction. It is through this struggle with the absent form that we are able to thoroughly explore the latent potential of content, exploit modern abstractions of time and devise approaches within our practice that actively deal with the digital as an essential matter of course.
Abstract:
Proteases with important roles for bacterial pathogens that reside specifically within intracellular vacuoles are frequently homologous to those with important virulence functions in other bacteria. Research has shown that some of these conserved proteases have evolved specialised functions in vacuole-residing intracellular bacteria. Unique proteases with pathogenic functions have also been described in Chlamydia, Mycobacteria, and Legionella. These findings suggest that further novel functions for proteases from these bacteria remain to be described. This review summarises recent findings of novel protease functions in intracellular human pathogenic bacteria that reside exclusively in vacuoles.
Abstract:
This paper studies the evolution of tax morale in Spain in the post-Franco era. In contrast to the previous tax compliance literature, the current paper investigates tax morale as the dependent variable and attempts to answer what actually shapes tax morale. The analysis uses survey data from two sources, the World Values Survey and the European Values Survey, allowing us to observe tax morale in Spain for the years 1981, 1990, 1995 and 1999/2000. The study of the evolution of tax morale in Spain over nearly a 20-year span is particularly interesting because the political and fiscal system evolved very rapidly during this period.
Abstract:
In a much anticipated judgment, the Federal Circuit has sought to clarify the standards applicable in determining whether a claimed method constitutes patent-eligible subject matter. In Bilski, the Federal Circuit identified a test to determine whether a patentee has made claims that pre-empt the use of a fundamental principle or an abstract idea or whether those claims cover only a particular application of a fundamental principle or abstract idea. It held that the sole test for determining subject matter eligibility for a claimed process under § 101 is that: (1) it is tied to a particular machine or apparatus, or (2) it transforms a particular article into a different state or thing. The court termed this the "machine-or-transformation test." In so doing, it overruled its earlier State Street decision to the extent that it deemed its "useful, concrete and tangible result" test inadequate to determine whether an alleged invention recites patent-eligible subject matter.
Abstract:
Our understanding of how the environment can impact human health has evolved and expanded over the centuries, with concern and interest dating back to ancient times. For example, over 4000 years ago, a civilisation in northern India tried to protect the health of its citizens by constructing and positioning buildings according to strict building laws, by having bathrooms and drains, and by having paved streets with a sewerage system (Rosen 1993). In more recent times, the 'industrial revolution' played a dominant role in shaping the modern world, and with it the modern public health system. This era was signified by rapid progress in technology, the growth of transportation and the expansion of the market economy, which led to the organisation of industry into a factory system. This meant that labour had to be brought to the factories, and by the 1820s poverty and social distress (including overcrowding and infrequent sewage and garbage disposal) were more widespread than ever. These circumstances, therefore, led to the rise of the 'sanitary revolution' and the birth of modern public health (Rosen 1993). The sanitary revolution has also been described as constituting the beginning of the first wave of environmental concern, which continued until after World War II, when major advances in engineering and chemistry substantially changed the face of industry, particularly the chemical sector. The second wave of environmental concern came in the mid to late 20th century and was dominated by the environmental or ecology movement. A landmark in this era was the 1962 publication of the book Silent Spring by Rachel Carson. This identified for the first time the dramatic effects on the ecosystem of the widespread use of the organochlorine pesticide DDT. The third wave of environmental concern commenced in the 1980s and continues today.
The accelerated rate of economic development, the substantial increase in the world population and the globalisation of trade have dramatically changed the production methods and demand for goods in both developed and developing countries. This has led to the rise of 'sustainable development' as a key driver in environmental planning and economic development (Yassi et al 2001). The protection of health has, therefore, been a hallmark of human history and is the cornerstone of public health practice. This chapter introduces environmental health and how it is managed in Australia, including a discussion of the key generic management tools. A number of significant environmental health issues and how they are specifically managed are then discussed, and the chapter concludes by discussing sustainable development and its links with environmental health.
Abstract:
Fracture behavior of Cu-Ni laminate composites has been investigated by tensile testing. It was found that as the individual layer thickness decreases from 100 to 20 nm, the resultant fracture angle of the Cu-Ni laminate changes from 72 degrees to 50 degrees. Cross-sectional observations reveal that the fracture of the Ni layers transforms from opening to shear mode as the layer thickness decreases, while that of the Cu layers remains in shear mode. Competing mechanisms are proposed to explain the variation in fracture mode of the metallic laminate composites associated with length scale.
Abstract:
Thermogravimetric analysis-mass spectrometry, X-ray diffraction and scanning electron microscopy (SEM) were used to characterize eight kaolinite samples from China. The results show that the thermal decomposition occurs in three main steps: (a) desorption of water below 100 °C, (b) dehydration at about 225 °C, and (c) well-defined dehydroxylation at around 450 °C. It is also found that decarbonization takes place at 710 °C due to the decomposition of calcite impurity in the kaolin. The temperature of dehydroxylation of kaolinite is found to be influenced by the degree of disorder of the kaolinite structure, and the gases evolved during decomposition vary with the amounts and kinds of impurities. It is evident from the mass spectra that interlayer carbonate from the calcite impurity and organic carbon is released as CO2 at around 225, 350 and 710 °C in the kaolinite samples.
Abstract:
This paper presents a simple and intuitive approach to determining the kinematic parameters of a serial-link robot in Denavit–Hartenberg (DH) notation. Once a manipulator's kinematics is parameterized in this form, a large body of standard algorithms and code implementations for kinematics, dynamics, motion planning, and simulation is available. The proposed method has two parts. The first is the "walk through," a simple procedure that creates a string of elementary translations and rotations, from the user-defined base coordinate to the end-effector. The second step is an algebraic procedure to manipulate this string into a form that can be factorized as link transforms, which can be represented in standard or modified DH notation. The method allows for an arbitrary base and end-effector coordinate system as well as an arbitrary zero joint angle pose. The algebraic procedure is amenable to computer algebra manipulation and a Java program is available as supplementary downloadable material.
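The link transforms the string is factorized into take a standard form in DH notation: Rot_z(θ)·Trans_z(d)·Trans_x(a)·Rot_x(α). A minimal NumPy sketch of that factorized transform, chained for a hypothetical planar two-link arm (the arm and its parameter values are illustrative, not taken from the paper):

```python
import numpy as np

def dh_transform(theta, d, a, alpha):
    """Standard DH link transform: Rotz(theta)·Transz(d)·Transx(a)·Rotx(alpha)."""
    ct, st = np.cos(theta), np.sin(theta)
    ca, sa = np.cos(alpha), np.sin(alpha)
    return np.array([
        [ct, -st * ca,  st * sa, a * ct],
        [st,  ct * ca, -ct * sa, a * st],
        [0.0,      sa,       ca,      d],
        [0.0,     0.0,      0.0,    1.0],
    ])

def forward_kinematics(dh_rows, joint_angles):
    """Chain the link transforms base-to-tip for an all-revolute serial arm."""
    T = np.eye(4)
    for (d, a, alpha), theta in zip(dh_rows, joint_angles):
        T = T @ dh_transform(theta, d, a, alpha)
    return T

# Hypothetical planar 2R arm, unit link lengths (d = 0, alpha = 0 for both links).
two_r = [(0.0, 1.0, 0.0), (0.0, 1.0, 0.0)]
T = forward_kinematics(two_r, [np.pi / 2, -np.pi / 2])  # elbow bent 90 degrees
```

With these joint angles the first link points along world y and the second folds back toward world x, so the tip sits at (1, 1, 0).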
Abstract:
Agriculture accounts for a significant portion of the GDP in most developed countries. However, managing farms, particularly large-scale extensive farming systems, is hindered by lack of data and an increasing shortage of labour. We have deployed a large heterogeneous sensor network on a working farm to explore sensor network applications that can address some of the issues identified above. Our network is solar powered and has been running for over 6 months. The current deployment consists of over 40 moisture sensors that provide soil moisture profiles at varying depths, weight sensors to compute the amount of food and water consumed by animals, electronic tag readers, up to 40 sensors that can be used to track animal movement (consisting of GPS, compass and accelerometers), and 20 sensor/actuators that can be used to apply different stimuli (audio, vibration and mild electric shock) to the animal. The static part of the network is designed for 24/7 operation and is linked to the Internet via a dedicated high-gain radio link, also solar powered. The initial goals of the deployment are to provide a testbed for sensor network research in programmability and data handling while also being a vital tool for scientists to study animal behavior. Our longer term aim is to create a management system that completely transforms the way farms are managed.
Abstract:
Today's evolving networks are experiencing a large number of different attacks, ranging from system break-ins and infection by automatic attack tools such as worms, viruses and trojan horses to denial-of-service (DoS) attacks. One important aspect of such attacks is that they are often indiscriminate and target Internet addresses without regard to whether they are bona fide allocated or not. Due to the absence of any advertised host services, the traffic observed on unused IP addresses is by definition unsolicited and likely to be either opportunistic or malicious. The analysis of large repositories of such traffic can be used to extract useful information about both ongoing and new attack patterns and unearth unusual attack behaviors. However, such an analysis is difficult due to the size and nature of the traffic collected on unused address spaces. In this dissertation, we present a network traffic analysis technique which uses traffic collected from unused address spaces and relies on the statistical properties of the collected traffic in order to accurately and quickly detect new and ongoing network anomalies. Detection of network anomalies is based on the concept that an anomalous activity usually transforms the network parameters in such a way that their statistical properties no longer remain constant, resulting in abrupt changes. In this dissertation, we use sequential analysis techniques to identify changes in the behavior of network traffic targeting unused address spaces to unveil both ongoing and new attack patterns. Specifically, we have developed a dynamic sliding-window-based non-parametric cumulative sum (CUSUM) change detection technique for identifying changes in network traffic. Furthermore, we have introduced dynamic thresholds to detect changes in network traffic behavior and also to detect when a particular change has ended.
Experimental results are presented that demonstrate the operational effectiveness and efficiency of the proposed approach, using both synthetically generated datasets and real network traces collected from a dedicated block of unused IP addresses.
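The core idea, a one-sided CUSUM recursion whose baseline is re-estimated from a sliding window of recent traffic, can be sketched as follows. This is a simplified stand-in for the dissertation's technique; the window size, slack `k` and threshold `h` are illustrative choices, not the author's.

```python
import numpy as np

def cusum_detect(series, window=50, k=0.5, h=5.0):
    """One-sided non-parametric CUSUM over a sliding baseline window.

    The baseline mean/std are re-estimated from the most recent `window`
    samples (a simple stand-in for the dissertation's dynamic window);
    `k` is the slack and `h` the alarm threshold, both in std-dev units.
    Returns the indices at which the cumulative statistic exceeds h.
    """
    s = 0.0
    alarms = []
    for i, x in enumerate(series):
        if i < window:
            continue  # need a full window to estimate the baseline
        baseline = series[i - window:i]
        mu, sigma = np.mean(baseline), np.std(baseline) + 1e-9
        z = (x - mu) / sigma          # standardize against recent history
        s = max(0.0, s + z - k)       # CUSUM recursion with slack k
        if s > h:
            alarms.append(i)
            s = 0.0                   # reset after signalling a change
    return alarms

# Synthetic trace: steady background rate, then an abrupt jump (an "attack").
rng = np.random.default_rng(0)
traffic = np.concatenate([rng.normal(10, 1, 200), rng.normal(20, 1, 100)])
alarms = cusum_detect(traffic, window=50, k=0.5, h=5.0)
```

On this synthetic trace the statistic crosses the threshold within a few samples of the change point at index 200, which is the abrupt-change behavior the abstract describes.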
Abstract:
The state-owned media system in China has evolved considerably since 1994, when the first independent TV production company was officially registered. Today, there are thousands of independent TV production companies looking for market opportunities in China. Independent production companies have facilitated the circulation of program trade and investment, and in the process have encouraged innovation and professionalization. This paper focuses on the evolution of independents and the changing face of the television market. It discusses the ecology of independent television companies in China and how government regulations are impacting on the TV production market. It argues that independent TV is providing a new strength for China's TV market, one often suspected of being imitative, propagandistic and lacking colour.
Abstract:
What happens when the traditional framing mechanisms of our performance environments are removed and we are forced as directors to work with actors in digital environments that capture performance in 360 degrees? As directors contend with the challenges of interactive performance, the emergence of the online audience and the powerful influence of the games industry, how can we approach the challenges of directing work that is performance captured and presented in real time using motion capture and associated 3D imaging software? The 360 degree real time capture of performance, while allowing for an unlimited amount of framing potential, demands a unique and uncompromisingly disciplined style of direction and performance that has thus far remained unstudied and unquantified. By a close analysis of the groundbreaking work of artists like Robert Zemeckis and the Weta Digital studio it is possible to begin to quantify what the technical requirements and challenges of 360 degree direction might be, but little has been discovered about the challenges of communicating the unlimited potential of framing and focus to the actors who work with these directors within these systems. It cannot be denied that the potential of theatrical space has evolved beyond the physical and moved into a more accessible virtual and digitised form, so how then can we direct for this unlimited potential and where do we place the focus of our directed (and captured) performance?
Abstract:
If one clear argument emerged from my doctoral thesis in political science, it is that there is no agreement as to what democracy is. There are over 40 different varieties of democracy, ranging from those in the mainstream with subtle or minute differences to those playing by themselves in the corner. And many of these various types of democracy are very well argued, empirically supported, and highly relevant to certain polities. The irony is that the thing all of these varieties share, the 'basic democracy' from which all other forms of democracy stem, is elusive. There is no international agreement in the literature or in political practice as to what 'basic democracy' is, and that is problematic, as many of us use the word 'democracy' every day and it is a concept of tremendous importance internationally. I am still uncertain as to why this problem has not been resolved before by far greater minds than my own, and it may have something to do with the recent growth in democratic theory this past decade and the innovative areas of thought my thesis required, but I think I've got the answer. By listing each type of democracy and filling the column next to this list with the literature associated with these various styles of democracy, I amassed a large and comprehensive body of textual data. My research intended to find out what these various styles of democracy had in common and to create a taxonomy (like the 'tree of life' in biology) of democracy to attempt to show how various styles of democracy have 'evolved' over the past 5000 years. I then ran a word frequency analysis program, a piece of software that counts the 100 most commonly used words in the texts. This is where my logic came in, as I had to make sense of these words. How did they answer what the most fundamental commonalities are between 40 different styles of democracy?
I used a grounded theory analysis, which required that I argue my way through these words to form a 'theory', or plausible explanation, as to why these particular words and not others are the important ones for answering the question. It came down to the argument that all 40 styles of democracy analysed have the following in common: 1) A concept of a citizenry. 2) A concept of sovereignty. 3) A concept of equality. 4) A concept of law. 5) A concept of communication. 6) And a concept of selecting officials. Thus, democracy is a defined citizenry with its own concept of sovereignty, which it exercises through the institutions which support the citizenry's understandings of equality, law, communication, and the selection of officials. Once any of these 6 concepts is defined in a particular way, it creates a style of democracy. From this, we can also see that there can be more than one style of democracy active in a particular government, as a citizenry is composed of many different aggregates with their own understandings of the six concepts.
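The word-frequency step described above can be sketched in a few lines of Python. This is an illustrative stand-in for the author's counting software, run here on a tiny made-up corpus; the tokenizer and stopword list are assumptions, not the author's choices.

```python
import re
from collections import Counter

def top_words(text, n=100, stopwords=None):
    """Return the n most frequent non-stopword words in a body of text."""
    # Illustrative stopword list; a real analysis would use a fuller one.
    stopwords = stopwords or {"the", "a", "an", "of", "and", "to", "in", "is", "on", "through"}
    tokens = re.findall(r"[a-z']+", text.lower())   # crude word tokenizer
    counts = Counter(t for t in tokens if t not in stopwords)
    return counts.most_common(n)

# Tiny made-up corpus standing in for the democracy literature.
corpus = (
    "Democracy rests on a citizenry. The citizenry exercises sovereignty "
    "through law, equality, communication, and the selection of officials. "
    "Sovereignty and law bind the citizenry."
)
top = top_words(corpus, n=5)
```

On this toy corpus the most frequent remaining word is "citizenry", which is exactly the kind of recurring term the grounded theory step then has to interpret.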
Abstract:
The main goal of this research is to design an efficient compression algorithm for fingerprint images. The wavelet transform technique is the principal tool used to reduce interpixel redundancies and to obtain a parsimonious representation for these images. A specific fixed decomposition structure is designed to be used by the wavelet packet in order to save on the computation, transmission, and storage costs. This decomposition structure is based on analysis of information packing performance of several decompositions, two-dimensional power spectral density, effect of each frequency band on the reconstructed image, and the human visual sensitivities. This fixed structure is found to provide the "most" suitable representation for fingerprints, according to the chosen criteria. Different compression techniques are used for different subbands, based on their observed statistics. The decision is based on the effect of each subband on the reconstructed image according to the mean square criteria as well as the sensitivities in human vision. To design an efficient quantization algorithm, a precise model for distribution of the wavelet coefficients is developed. The model is based on the generalized Gaussian distribution. A least squares algorithm on a nonlinear function of the distribution model shape parameter is formulated to estimate the model parameters. A noise shaping bit allocation procedure is then used to assign the bit rate among subbands. To obtain high compression ratios, vector quantization is used. In this work, the lattice vector quantization (LVQ) is chosen because of its superior performance over other types of vector quantizers. The structure of a lattice quantizer is determined by its parameters known as truncation level and scaling factor. In lattice-based compression algorithms reported in the literature the lattice structure is commonly predetermined, leading to a nonoptimized quantization approach.
In this research, a new technique for determining the lattice parameters is proposed. In the lattice structure design, no assumption about the lattice parameters is made and no training and multi-quantizing is required. The design is based on minimizing the quantization distortion by adapting to the statistical characteristics of the source in each subimage. Since LVQ is a multidimensional generalization of uniform quantizers, it produces minimum distortion for inputs with uniform distributions. In order to take advantage of the properties of LVQ and its fast implementation, while considering the i.i.d. nonuniform distribution of wavelet coefficients, the piecewise-uniform pyramid LVQ algorithm is proposed. The proposed algorithm quantizes almost all of the source vectors without the need to project these on the lattice outermost shell, while it properly maintains a small codebook size. It also resolves the wedge region problem commonly encountered with sharply distributed random sources. These represent some of the drawbacks of the algorithm proposed by Barlaud [26]. The proposed algorithm handles all types of lattices, not only the cubic lattices, as opposed to the algorithms developed by Fischer [29] and Jeong [42]. Furthermore, no training and multiquantizing (to determine lattice parameters) is required, as opposed to Powell's algorithm [78]. For coefficients with high-frequency content, the positive-negative mean algorithm is proposed to improve the resolution of reconstructed images. For coefficients with low-frequency content, a lossless predictive compression scheme is used to preserve the quality of reconstructed images. A method to reduce bit requirements of necessary side information is also introduced. Lossless entropy coding techniques are subsequently used to remove coding redundancy. The algorithms result in high quality reconstructed images with better compression ratios than other available algorithms.
To evaluate the proposed algorithms, their objective and subjective performance comparisons with other available techniques are presented. The quality of the reconstructed images is important for reliable identification. Enhancement and feature extraction on the reconstructed images are also investigated in this research. A structural-based feature extraction algorithm is proposed in which the unique properties of fingerprint textures are used to enhance the images and improve the fidelity of their characteristic features. The ridges are extracted from enhanced grey-level foreground areas based on the local ridge dominant directions. The proposed ridge extraction algorithm properly preserves the natural shape of grey-level ridges as well as precise locations of the features, as opposed to the ridge extraction algorithm in [81]. Furthermore, it is fast and operates only on foreground regions, as opposed to the adaptive floating average thresholding process in [68]. Spurious features are subsequently eliminated using the proposed post-processing scheme.
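The subband-decomposition-then-quantize pipeline at the heart of this design can be illustrated with a one-level 2D Haar transform and uniform scalar quantization. This is a deliberately simplified stand-in: the thesis uses a fixed wavelet-packet structure and lattice vector quantization, neither of which is reproduced here, and the image data is random.

```python
import numpy as np

def haar2d(img):
    """One level of a 2D Haar wavelet transform.

    Splits an even-sized image into the LL (approximation) and LH/HL/HH
    (detail) subbands; a toy stand-in for the wavelet-packet decomposition
    described in the thesis, not its actual fixed structure.
    """
    a = (img[0::2, :] + img[1::2, :]) / 2.0   # row pairs: average
    d = (img[0::2, :] - img[1::2, :]) / 2.0   # row pairs: difference
    ll = (a[:, 0::2] + a[:, 1::2]) / 2.0
    lh = (a[:, 0::2] - a[:, 1::2]) / 2.0
    hl = (d[:, 0::2] + d[:, 1::2]) / 2.0
    hh = (d[:, 0::2] - d[:, 1::2]) / 2.0
    return ll, lh, hl, hh

def quantize(band, step):
    """Uniform scalar quantization (a coarse stand-in for the lattice VQ)."""
    return np.round(band / step) * step

rng = np.random.default_rng(1)
img = rng.normal(128, 20, (8, 8))             # random stand-in for a fingerprint tile
ll, lh, hl, hh = haar2d(img)
# Spend more precision on the low-frequency band, less on the details,
# mirroring the per-subband bit allocation idea in the abstract.
ll_q, hh_q = quantize(ll, 1.0), quantize(hh, 4.0)
```

The per-subband `step` plays the role of the bit-allocation decision: a small step keeps the perceptually important LL band nearly intact, while a large step discards most of the HH detail.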