964 results for Inputs sensoriels
Abstract:
Ria de Aveiro, a Portuguese coastal lagoon that exchanges water with the Atlantic Ocean, received the effluent from a chlor-alkali industry for over 50 years; consequently, several tons of mercury had been buried in the sediments of an inner basin. To assess the importance (and seasonal variation) of the lagoon waters as carriers of mercury to the nearby coastal area, we measured total mercury levels in several compartments: in surface sediments, in surface and deep waters (including dissolved and particulate matter), and in biota. Dissolved (reactive and total) mercury concentrations in both surface and deep waters were low (<1 to 15 ng L⁻¹). Mean mercury values in suspended particulate matter varied between 0.2 and 0.6 μg g⁻¹ and in sediments between 1 and 9 ng g⁻¹. Aquatic organisms displayed levels below regulatory limits but exhibited some bioaccumulation of mercury, with concentrations ranging from 0.05 to 0.8 μg g⁻¹ [dry weight (dw)]. No seasonal pattern was found in this study for mercury-related determinations. Levels found at the estuary mouth during ebb tide provide evidence for the transport of mercury to the coastal zone. No significant changes in the partitioning of mercury between dissolved and particulate phases were found in the coastal waters in comparison with the values found at the estuary mouth. In spite of the high levels of mercury found inside some areas of the lagoon, the wide web of islands and channels allows some spreading of contaminants before they reach the coastal waters. Moreover, the low efficiency of local marine sediments in trapping mercury contributes to a dilution of the mercury transported in suspended particulate matter over a broader area, reducing the impact on the nearby marine coastal zone.
Abstract:
The chemical factors (inorganic nitrogen, phosphate, silicic acid) that potentially or actually control primary production were determined for the Bay of Brest, France, a macrotidal ecosystem subjected to freshwater inputs with high nitrate loads (winter nitrate freshwater concentrations >700 μM; Si:N molar ratio as low as 0.2, i.e. among the lowest ever published). Intensive data collection and observations were carried out from February 1993 to March 1994 to determine the variations of physical [salinity, temperature, photosynthetically active radiation (PAR), freshwater discharges] and chemical (oxygen and nutrients) parameters and their impacts on the phytoplankton cycle (fluorescence, pigments, primary production). With insufficient PAR, the winter stocks of nutrients remained almost unutilized and the nitrate excess was exported to the adjacent ocean, owing to rapid tidal exchange. By early April, a diatom-dominated spring bloom developed (chlorophyll a maximum = 7.7 μg l⁻¹; primary production maximum = 2.34 g C m⁻² d⁻¹) under high initial nutrient concentrations. Silicic acid was rapidly exhausted over the whole water column; it is inferred to be the primary limiting factor responsible for the collapse of the spring bloom by mid-May. Successive phytoplankton developments characterized the period of secondary blooms during summer and fall (successive surface chlorophyll a maxima = 3.5, 1.6, 1.8 and 1.0 μg l⁻¹; primary production = 1.24, 1.18 and 0.35 g C m⁻² d⁻¹). These secondary blooms developed under lower nutrient concentrations, mostly originating from nutrient recycling. Until August, Si and P most likely limited primary production, whereas the last stage of the productive period in September seemed to be N limited instead, this being a period of total nitrate depletion in almost the whole water column. Si limitation of spring blooms has become a common feature in coastal ecosystems that receive freshwater inputs with Si:N molar ratios <1. The peculiarity of Si limitation in the Bay of Brest is its extension through the summer period.
Abstract:
Time series of physico-chemical data and concentrations (cells L⁻¹) of the toxic dinoflagellate Alexandrium minutum collected in the Rance macrotidal estuary (Brittany, France) were analyzed to understand the physico-chemical processes of the estuary and their relation to changes in bloom development from 1996 to 2009. The construction of the tidal power plant in the north and the presence of a lock in the south have greatly altered hydrodynamics, blocking the zone of maximum turbidity upstream, in the narrowest part of the estuary. Alexandrium minutum occurs in the middle part of the estuary. Most physical and chemical parameters of the Rance estuary are similar to those observed elsewhere in Brittany, with water temperatures between 15 and 18 °C, slightly lowered salinities (31.8–33.1 PSU), low river flow rates upstream and significant solar radiation (8 h day⁻¹). A notable exception is phosphate input from the drainage basin, which seems to limit bloom development: in recent years, bloom decline has been significantly correlated with the decrease in phosphate input. On the other hand, the chemical processes occurring at the freshwater–saltwater interface do not seem to influence these occurrences. Other hypotheses for bloom decline, including the prevalence of parasitism, are discussed but remain to be verified in further studies.
Abstract:
Today’s snowmobile industry faces great challenges in the field of noise & vibration. The area of main concern is the pass-by noise restriction defined by the Society of Automotive Engineers (SAE) test standard J192, with a maximum sound pressure level of 78 dB(A) being required by many states and national parks. To continue to meet or beat this requirement without affecting machine performance, a deeper understanding of the sound transfer paths is required. This thesis examines the transfer paths created by the tunnel, rear suspension, drive shaft, and rubber composite track, with the primary source being suspension input through the ground. Using a combination of field experiments and analytical modeling, perspective was gained on which suspension and drive elements create the primary transfer paths. With further understanding of these paths, industry can tailor and fine-tune the approaches taken to control overall noise output.
Abstract:
The high degree of variability and inconsistency in the use of cash flow studies by property professionals demands improvement in knowledge and processes. Until recently, limited research had been undertaken on the use of cash flow studies in property valuations, but the growing acceptance of this approach for major investment valuations has resulted in renewed interest in the topic. Studies on valuation variation identify data accuracy, model consistency and bias as major concerns. In cash flow studies there are practical problems with the input data and the consistency of the models. This study refers to the recent literature and identifies the major factors in model inconsistency and data selection. A detailed case study is used to examine the effects of changes in structure and inputs. The key variable inputs are identified, and proposals are developed to improve the selection process for these key variables. The variables are selected with the aid of sensitivity studies, and alternative ways of quantifying the key variables are explained. The paper recommends, with reservations, the use of probability profiles of the variables and the incorporation of these data in simulation exercises. The use of Monte Carlo simulation is demonstrated, and the factors influencing the structure of the probability distributions of the key variables are outlined. This study relates to ongoing research into the functional performance of commercial property within an Australian Cooperative Research Centre.
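As a hedged illustration of the kind of simulation exercise described above (this is not the paper's model; the variable names, distributions and figures are hypothetical), the Python sketch below samples key cash flow inputs from assumed probability profiles and builds a distribution of present values via Monte Carlo simulation.

```python
import numpy as np

rng = np.random.default_rng(42)
n_sims = 10_000          # number of Monte Carlo trials
years = 10               # assumed holding period in years
initial_rent = 1_000_000 # assumed annual net rent (hypothetical figure)

# Key variable inputs sampled from assumed (hypothetical) probability profiles.
rent_growth   = rng.triangular(0.01, 0.025, 0.05, size=(n_sims, years))  # annual growth
discount_rate = rng.normal(0.08, 0.01, size=n_sims)                      # required return
exit_yield    = rng.triangular(0.06, 0.07, 0.085, size=n_sims)           # terminal cap rate

present_values = np.empty(n_sims)
for i in range(n_sims):
    rents = initial_rent * np.cumprod(1 + rent_growth[i])          # projected cash flows
    discounts = (1 + discount_rate[i]) ** np.arange(1, years + 1)  # discount factors
    terminal = rents[-1] / exit_yield[i] / discounts[-1]           # discounted sale value
    present_values[i] = np.sum(rents / discounts) + terminal

print(f"mean present value: {present_values.mean():,.0f}")
print(f"5th-95th percentile: {np.percentile(present_values, [5, 95])}")
```

The resulting percentile band, rather than a single point estimate, is what a probability-profile approach adds to a conventional discounted cash flow study.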
Abstract:
Over the past several years, there has been resurgent interest in regional planning in North America, Europe and Australasia. Spurred by issues such as metropolitan growth, transportation infrastructure, environmental management and economic development, many states and metropolitan regions are undertaking new planning initiatives. These regional efforts have also raised significant questions about governance structures, accountability and measures of effectiveness. In this paper, the authors conducted an international review of ten case studies from the United States, Canada, England, Belgium, New Zealand and Australia to explore several critical questions. Using a qualitative data template, the research team reviewed plans, documents, web sites and published literature to address three questions. First, what are the governance arrangements for delivering regional planning? Second, what are the mechanisms linking regional plans with state plans (when relevant) and local plans? Third, what means and mechanisms do these regional plans use to evaluate and measure effectiveness? The case study analysis revealed several common themes. First, there is an increasing focus on governance at the regional level, driven by a range of trends, including regional spatial development initiatives in Europe, regional transportation issues in the US, and the growth of metropolitan regions generally. However, there is considerable variation in how regional governance arrangements are being played out. Similarly, the processes used at the regional level to guide planning range from broad-ranging (thick) to narrow and limited (thin) approaches. Finally, evaluation and monitoring of regional planning efforts involve compiling data on inputs, processes, outputs and outcomes. Although increased attention is being paid to indicators and monitoring, most of it falls into outcome evaluations such as Agenda 21 or sustainability reporting. Based on our review, we suggest there is a need for increased attention to input, process and output indicators and for clearer linkages of these indicators within monitoring and evaluation frameworks. The focus on outcome indicators, such as sustainability indicators, creates feedback systems that are too long-term and remote for effective monitoring and feedback. Although we found some examples where these kinds of monitoring frameworks are linked into a system of governance, there is a need for clearer conceptual development in both theory and practice.
Abstract:
With the advent of Service Oriented Architecture, Web Services have gained tremendous popularity. Due to the availability of a large number of Web services, finding an appropriate Web service according to the requirement of the user is a challenge. This warrants the need to establish an effective and reliable process of Web service discovery. A considerable body of research has emerged to develop methods to improve the accuracy of Web service discovery to match the best service. The process of Web service discovery results in suggesting many individual services that partially fulfil the user’s interest. Considering the semantic relationships of the words used in describing the services, as well as the use of input and output parameters, can lead to accurate Web service discovery. Appropriate linking of individual matched services should fully satisfy the requirements the user is looking for. This research proposes to integrate a semantic model and a data mining technique to enhance the accuracy of Web service discovery. A novel three-phase Web service discovery methodology is proposed. The first phase performs match-making to find semantically similar Web services for a user query. In order to perform semantic analysis on the content of the Web service description language document, a support-based latent semantic kernel is constructed using an innovative concept of binning and merging on a large quantity of text documents covering diverse domains of knowledge. The use of a generic latent semantic kernel constructed with a large number of terms helps to find the hidden meaning of query terms that otherwise could not be found. Sometimes a single Web service is unable to fully satisfy the requirement of the user. In such cases, a composition of multiple inter-related Web services is presented to the user. The task of checking the possibility of linking multiple Web services is done in the second phase. Once the feasibility of linking Web services is checked, the objective is to provide the user with the best composition of Web services. In the link analysis phase, the Web services are modelled as nodes of a graph and an all-pairs shortest-path algorithm is applied to find the optimum path at the minimum cost of traversal. The third phase, system integration, integrates the results from the preceding two phases using an original fusion algorithm in the fusion engine. Finally, the recommendation engine, which is an integral part of the system integration phase, makes the final recommendations, including individual and composite Web services, to the user. In order to evaluate the performance of the proposed method, extensive experimentation has been performed. Results of the proposed support-based semantic kernel method of Web service discovery are compared with the results of a standard keyword-based information-retrieval method and a clustering-based machine-learning method of Web service discovery. The proposed method outperforms both the information-retrieval and machine-learning based methods. Experimental results and statistical analysis also show that the best Web service compositions are obtained by considering 10 to 15 Web services found in phase I for linking. Empirical results also ascertain that the fusion engine boosts the accuracy of Web service discovery by combining the inputs from both the semantic analysis (phase I) and the link analysis (phase II) in a systematic fashion.
Overall, the accuracy of Web service discovery with the proposed method shows a significant improvement over traditional discovery methods.
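As a hedged illustration of the link analysis phase described above (not the thesis code; the service names and linking costs are hypothetical), the sketch below models Web services as graph nodes and runs the classic Floyd-Warshall all-pairs shortest-path algorithm to find the cheapest composition path between any two services.

```python
from math import inf

# Hypothetical services and linking costs (edge weights): an edge A -> B means
# the outputs of service A can feed the inputs of service B at the given cost.
services = ["Geocode", "Weather", "Forecast", "Alert"]
edges = {("Geocode", "Weather"): 1.0, ("Weather", "Forecast"): 2.0,
         ("Geocode", "Forecast"): 4.5, ("Forecast", "Alert"): 1.5}

n = len(services)
idx = {s: i for i, s in enumerate(services)}

# Initialise the distance matrix: 0 on the diagonal, edge cost where a link
# exists, infinity otherwise.
dist = [[0.0 if i == j else inf for j in range(n)] for i in range(n)]
for (a, b), cost in edges.items():
    dist[idx[a]][idx[b]] = cost

# Floyd-Warshall: relax every pair of services through every intermediate node.
for k in range(n):
    for i in range(n):
        for j in range(n):
            if dist[i][k] + dist[k][j] < dist[i][j]:
                dist[i][j] = dist[i][k] + dist[k][j]

print("Cheapest Geocode -> Alert composition cost:",
      dist[idx["Geocode"]][idx["Alert"]])  # 1.0 + 2.0 + 1.5 = 4.5
```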
Abstract:
Amphibian is a 10’00’’ musical work which explores new musical interfaces and approaches to hybridising performance practices from the popular music, electronic dance music and computer music traditions. The work is designed to be presented in a range of contexts associated with the electro-acoustic, popular and classical music traditions. The work is for two performers using two synchronised laptops, an electric guitar and a custom-designed gestural interface for vocal performers, the e-Mic (Extended Mic-stand Interface Controller). This interface was developed by one of the co-authors, Donna Hewitt. The e-Mic allows a vocal performer to manipulate the voice in real time through the capture of physical gestures via an array of sensors (pressure, distance, tilt) along with ribbon controllers and an X-Y joystick microphone mount. Performance data are then sent to a computer running audio-processing software, which is used to transform the audio signal from the microphone. In this work, data are also exchanged between performers via a local wireless network, allowing the performers to work with shared data streams. The duo employs the gestural conventions of guitarist and singer (i.e. 'a band' in a popular music context), but transforms these sounds and gestures into new digital music. The gestural language of popular music is deliberately subverted and taken into a new context. The piece thus explores the nexus between the sonic and performative practices of electro-acoustic music and intelligent electronic dance music (‘idm’). This work was situated in the research fields of new musical interfacing, interaction design, and experimental music composition and performance. The contexts in which the research was conducted were live musical performance and studio music production. The work investigated new methods for musical interfacing, performance data mapping, and hybrid performance and compositional practices in electronic music. The research methodology was practice-led. New insights were gained from the iterative experimental workshopping of gestural inputs, musical data mapping, inter-performer data exchange, software patch design, and data and audio processing chains. In respect of interfacing, there were innovations in the design and implementation of a novel sensor-based gestural interface for singers, the e-Mic, one of the only existing gestural controllers for singers. The work explored the compositional potential of sharing real-time performance data between performers and deployed novel methods for inter-performer data exchange and mapping. As regards stylistic and performance innovation, the work explored and demonstrated an approach to hybridising the gestural and sonic language of popular music with recent ‘post-digital’ approaches to laptop-based experimental music. The development of the work was supported by an Australia Council grant. Research findings have been disseminated via a range of international conference publications, recordings, radio interviews (ABC Classic FM), broadcasts, and performances at international events and festivals. The work was curated into the major Australian international festival Liquid Architecture and was selected by an international music jury (through blind peer review) for presentation at the International Computer Music Conference in Belfast, Northern Ireland.
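As a hedged sketch of the kind of gesture-to-sound mapping and inter-performer data sharing described above (this is not the e-Mic software; the sensor names, scaling and OSC addresses are hypothetical), the Python fragment below maps a normalised sensor reading to an audio-processing parameter and forwards shared data over a local network using OSC.

```python
from pythonosc.udp_client import SimpleUDPClient

# Hypothetical endpoints: the local audio-processing software and a second
# performer's laptop on the local wireless network.
audio_host = SimpleUDPClient("127.0.0.1", 9000)
partner    = SimpleUDPClient("192.168.0.42", 9001)

def map_pressure_to_feedback(pressure: float) -> float:
    """Map a 0-1 pressure-sensor reading to a delay feedback amount (0-0.9)."""
    pressure = min(max(pressure, 0.0), 1.0)
    return 0.9 * pressure

def on_sensor_frame(pressure: float, tilt: float) -> None:
    """Handle one frame of (hypothetical) sensor data from the gestural interface."""
    audio_host.send_message("/voice/delay/feedback", map_pressure_to_feedback(pressure))
    audio_host.send_message("/voice/filter/tilt", tilt)   # local vocal processing
    partner.send_message("/shared/pressure", pressure)    # inter-performer data stream

on_sensor_frame(pressure=0.6, tilt=0.25)
```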
Abstract:
This paper deals with the problem of using data mining models in a real-world situation where the user cannot provide all the inputs with which the predictive model is built. A learning system framework, the Query Based Learning System (QBLS), is developed for improving the performance of predictive models in practice, where not all inputs are available for querying the system. An automatic feature selection algorithm called Query Based Feature Selection (QBFS) is developed for selecting features to obtain a balance between a relatively minimal subset of features and a relatively maximal classification accuracy. The performance of the QBLS system and the QBFS algorithm is successfully demonstrated with a real-world application.
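As a hedged illustration of the trade-off QBFS targets (this is not the QBFS algorithm itself; it is a plain greedy forward-selection sketch on synthetic scikit-learn data), the code below keeps adding features only while each one buys a worthwhile gain in cross-validated accuracy, yielding a small subset of queryable inputs.

```python
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

# Synthetic stand-in data: in practice only the inputs a user can actually
# supply at query time would be offered as candidates.
X, y = make_classification(n_samples=500, n_features=12, n_informative=4, random_state=0)

selected, remaining = [], list(range(X.shape[1]))
best_score, min_gain = 0.0, 0.005   # stop when an extra feature adds < 0.5% accuracy

while remaining:
    # Score every candidate feature added to the current subset.
    scores = {f: cross_val_score(LogisticRegression(max_iter=1000),
                                 X[:, selected + [f]], y, cv=5).mean()
              for f in remaining}
    f_best = max(scores, key=scores.get)
    if scores[f_best] - best_score < min_gain:
        break                        # gain too small to justify another required input
    selected.append(f_best)
    remaining.remove(f_best)
    best_score = scores[f_best]

print(f"selected features: {selected}, cross-validated accuracy: {best_score:.3f}")
```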
Abstract:
Buildings consume resources and energy, contribute to pollution of our air, water and soil, impact the health and well-being of populations and constitute an important part of the built environment in which we live. The ability to assess their design automatically from their 3D CAD representations, with a view to reducing that impact, enables building design professionals to make informed decisions on the environmental impact of building structures. Contemporary 3D object-oriented CAD files contain a wealth of building information. LCADesign has been designed as a fully integrated approach for automated eco-efficiency assessment of commercial buildings directly from 3D CAD. LCADesign accesses the 3D CAD detail through Industry Foundation Classes (IFCs), the international standard file format for defining architectural and constructional CAD graphic data as 3D real-world objects, to permit construction professionals to interrogate these intelligent drawing objects for analysis of the performance of a design. The automated take-off provides quantities of all building components, whose specific production processes, logistics and raw material inputs are identified where necessary to calculate a complete list of quantities for all products such as concrete, steel, timber and plastic. This information is combined with a life cycle inventory database to estimate key internationally recognised environmental indicators such as CML, EPS and Eco-indicator 99. This paper outlines the key modules of LCADesign and their role in delivering an automated eco-efficiency assessment for commercial buildings.
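As a hedged sketch of the final aggregation step described above (this is not LCADesign code; the material quantities and per-unit impact factors are hypothetical placeholders), the snippet below combines an automated quantity take-off with life-cycle inventory factors to produce a single indicator total.

```python
# Quantities by material, as would be extracted from the 3D model take-off
# (hypothetical figures), and illustrative per-unit impact factors.
takeoff = {
    "concrete_m3": 420.0,
    "steel_t": 35.0,
    "timber_m3": 60.0,
}
impact_factors = {     # kg CO2-equivalent per unit of material (illustrative only)
    "concrete_m3": 300.0,
    "steel_t": 1900.0,
    "timber_m3": 120.0,
}

def assess(quantities: dict, factors: dict) -> float:
    """Sum quantity * per-unit factor to get one environmental indicator total."""
    return sum(qty * factors[item] for item, qty in quantities.items())

print(f"Embodied CO2-e (illustrative): {assess(takeoff, impact_factors):,.0f} kg")
```

In the real tool, each indicator (CML, EPS, Eco-indicator 99) would draw on its own characterisation factors from the life cycle inventory database rather than a single illustrative figure per material.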
Abstract:
This paper provides new results about efficient arithmetic on Jacobi quartic form elliptic curves, y² = dx⁴ + 2ax² + 1. With recent bandwidth-efficient proposals, the arithmetic on Jacobi quartic curves has become solidly faster than that of Weierstrass curves. These proposals use up to 7 coordinates to represent a single point. However, fast scalar multiplication algorithms based on windowing techniques precompute and store several points, which requires more space than it takes with 3 coordinates. Also note that some of these proposals require d = 1 for full speed. Unfortunately, elliptic curves having 2-times-a-prime number of points cannot be written in Jacobi quartic form if d = 1. Even worse, the contemporary formulae may fail to output correct coordinates for some inputs. This paper provides improved speeds using fewer coordinates without causing the above-mentioned problems. For instance, our proposed point doubling algorithm takes only 2 multiplications, 5 squarings, and no multiplication with curve constants when d is arbitrary and a = ±1/2.
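As a hedged aside on the windowing trade-off mentioned above (this is generic textbook code, not the paper's Jacobi quartic formulae; the group operations are passed in abstractly and a toy integer group is used for the check), the sketch below shows a fixed-window scalar multiplication that precomputes and stores small multiples of the base point, which is exactly where the per-point storage cost arises.

```python
def scalar_mul_window(k: int, P, add, double, identity, w: int = 4):
    """Left-to-right fixed-window (2^w-ary) scalar multiplication.

    Precomputes the points 0*P, P, 2P, ..., (2^w - 1)P; a larger window means
    fewer additions in the main loop but more precomputed points to store.
    """
    # Precompute the table of small multiples of P.
    table = [identity]
    for _ in range(2 ** w - 1):
        table.append(add(table[-1], P))

    # Split the scalar into base-2^w digits, most significant first.
    digits = []
    while k > 0:
        digits.append(k & (2 ** w - 1))
        k >>= w

    result = identity
    for digit in reversed(digits):
        for _ in range(w):
            result = double(result)      # w doublings per window
        if digit:
            result = add(result, table[digit])  # one table lookup + addition
    return result

# Sanity check with integers as a stand-in group (point "P" = 1, so k*P = k).
assert scalar_mul_window(123456789, 1, add=lambda a, b: a + b,
                         double=lambda a: 2 * a, identity=0) == 123456789
```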
Abstract:
This paper describes the approach taken to the XML Mining track at INEX 2008 by a group at the Queensland University of Technology. We introduce the K-tree clustering algorithm in an Information Retrieval context by adapting it for document clustering. Document clustering involves many large-scale problems. K-tree scales well with large inputs due to its low complexity, and it offers promising results both in terms of efficiency and quality. Document classification was completed using Support Vector Machines.
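As a hedged sketch of the general pipeline described above (not the K-tree algorithm itself: the clustering step below substitutes ordinary k-means, and the tiny corpus is a placeholder for the INEX collection), documents are vectorised, clustered and classified with scikit-learn.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.cluster import KMeans
from sklearn.svm import LinearSVC

# Placeholder corpus and labels standing in for the INEX 2008 XML documents.
docs = ["wiki article about particle physics", "physics of elementary particles",
        "history of medieval art", "art museums and modern painting"]
labels = [0, 0, 1, 1]

vectorizer = TfidfVectorizer()
X = vectorizer.fit_transform(docs)                 # sparse TF-IDF document vectors

# Clustering step (plain k-means here as a stand-in for K-tree).
clusters = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)

# Classification step using a Support Vector Machine, as in the track submission.
classifier = LinearSVC().fit(X, labels)

print("cluster assignments:", clusters)
print("predicted label:", classifier.predict(vectorizer.transform(["particle physics"]))[0])
```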
Abstract:
Both creative industries and innovation are slippery fish to handle conceptually, to say nothing of their relationship. This paper faces, first, the problems of definitions and data that can bedevil clear analysis of the creative industries. It then presents a method of data generation and analysis developed to address these problems while providing an evidence pathway that supports the movement in policy thinking from creative output (through industry sectors) to creative input to the broader economy (through a focus on occupations/activity). Facing the test of policy relevance, this work has assisted in moving the ongoing debates about the creative industries toward innovation thinking by developing the concept of creative occupations as input value. Creative inputs as 'enablers' arguably have parallels with the way ICTs have been shown to be broad enablers of economic growth. We conclude with two short instantiations of the policy relevance of this concept: design as a creative input; and creative human capital and education.
Abstract:
A method of improving the security of biometric templates is presented which satisfies desirable properties such as (a) irreversibility of the template, (b) revocability and assignment of a new template to the same biometric input, and (c) matching in the secure transformed domain. It makes use of an iterative procedure based on the bispectrum that serves as an irreversible transformation for biometric features because signal phase is discarded at each iteration. Unlike the usual hash function, this transformation preserves closeness in the transformed domain for similar biometric inputs. A number of such templates can be generated from the same input. These properties are illustrated using synthetic data and applied to images from the FRGC 3D database with Gabor features. Verification can be successfully performed using these secure templates with an EER of 5.85%.
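As a hedged, much-simplified illustration of the general idea (this is not the paper's bispectrum procedure: it substitutes a key-seeded random projection plus Fourier-magnitude step), the sketch below shows how discarding phase at each iteration gives an irreversible transform that is revocable via the key and still keeps similar inputs close.

```python
import numpy as np

def protected_template(features: np.ndarray, key: int, iterations: int = 3) -> np.ndarray:
    """Simplified irreversible, revocable feature transform (illustration only).

    Each iteration mixes the features with a key-derived random projection and
    keeps only the Fourier magnitude, discarding phase; changing the key
    assigns a new template to the same biometric input.
    """
    rng = np.random.default_rng(key)            # key makes the template revocable
    x = np.asarray(features, dtype=float)
    for _ in range(iterations):
        projection = rng.normal(size=(x.size, x.size))
        x = np.abs(np.fft.fft(projection @ x))  # phase discarded each iteration
    return x / np.linalg.norm(x)

# Similar inputs stay close under the same key; a new key yields a new template.
base = np.random.default_rng(1).normal(size=32)
t1 = protected_template(base, key=7)
t2 = protected_template(base + 0.01 * np.random.default_rng(2).normal(size=32), key=7)
t3 = protected_template(base, key=8)
print("same key, similar input :", np.linalg.norm(t1 - t2))   # small distance
print("different key (revoked) :", np.linalg.norm(t1 - t3))   # much larger distance
```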
Abstract:
Expert elicitation is the process of retrieving and quantifying expert knowledge in a particular domain. Such information is of particular value when the empirical data is expensive, limited, or unreliable. This paper describes a new software tool, called Elicitator, which assists in quantifying expert knowledge in a form suitable for use as a prior model in Bayesian regression. Potential environmental domains for applying this elicitation tool include habitat modeling, assessing detectability or eradication, ecological condition assessments, risk analysis, and quantifying inputs to complex models of ecological processes. The tool has been developed to be user-friendly, extensible, and facilitate consistent and repeatable elicitation of expert knowledge across these various domains. We demonstrate its application to elicitation for logistic regression in a geographically based ecological context. The underlying statistical methodology is also novel, utilizing an indirect elicitation approach to target expert knowledge on a case-by-case basis. For several elicitation sites (or cases), experts are asked simply to quantify their estimated ecological response (e.g. probability of presence), and its range of plausible values, after inspecting (habitat) covariates via GIS.
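As a hedged sketch of how indirectly elicited, site-by-site responses might be turned into a prior for Bayesian logistic regression (this is not the Elicitator implementation; the sites, covariates, elicited probabilities and interval interpretation are all hypothetical), the code below converts elicited probabilities and plausible ranges to the logit scale and fits a weighted regression to obtain a prior mean and covariance for the coefficients.

```python
import numpy as np

# Hypothetical elicitation: for each site the expert gives a best estimate of
# the probability of presence and a plausible range, after inspecting covariates.
covariates = np.array([[1.0, 0.2], [1.0, 0.8], [1.0, 1.5], [1.0, 2.3]])  # intercept + habitat index
p_best = np.array([0.10, 0.30, 0.60, 0.85])
p_low  = np.array([0.05, 0.20, 0.45, 0.70])
p_high = np.array([0.20, 0.45, 0.75, 0.95])

logit = lambda p: np.log(p / (1 - p))

# Treat each elicited range as a rough 95% interval on the logit scale, giving
# a per-site variance; then a weighted least-squares fit yields a prior mean
# and covariance for the logistic-regression coefficients.
y = logit(p_best)
site_var = ((logit(p_high) - logit(p_low)) / (2 * 1.96)) ** 2
W = np.diag(1.0 / site_var)

XtWX = covariates.T @ W @ covariates
prior_mean = np.linalg.solve(XtWX, covariates.T @ W @ y)
prior_cov = np.linalg.inv(XtWX)

print("prior mean for (intercept, slope):", prior_mean)
print("prior covariance:\n", prior_cov)
```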