883 results for Performance evolution due time
Abstract:
Magnetic behavior of soils can seriously hamper the performance of geophysical sensors. Currently, we have little understanding of the types of minerals responsible for the magnetic behavior, as well as their distribution in space and evolution through time. This study investigated the magnetic characteristics and mineralogy of Fe-rich soils developed on basaltic substrate in Hawaii. We measured the spatial distribution of magnetic susceptibility (χlf) and frequency dependence (χfd%) across three test areas in a well-developed eroded soil on Kaho'olawe and in two young soils on the Big Island of Hawaii. X-ray diffraction spectroscopy, X-ray fluorescence (XRF) spectroscopy, chemical dissolution, thermal analysis, and temperature-dependent magnetic studies were used to characterize soil development and mineralogy for samples from soil pits on Kaho'olawe, surface samples from all three test areas, and unweathered basalt from the Big Island of Hawaii. The measurements show a general increase in magnetic properties with increasing soil development. The XRF Fe data ranged from 13% for fresh basalt and young soils on the Big Island to 58% for material from the B horizon of Kaho'olawe soils. Dithionite-extractable and oxalate-extractable Fe percentages increase with soil development and correlate with χlf and χfd%, respectively. Results from the temperature-dependent susceptibility measurements show that the high soil magnetic properties observed in geophysical surveys on Kaho'olawe are entirely due to neoformed minerals. The results of our studies have implications for the existing soil survey of Kaho'olawe and help identify methods to characterize magnetic minerals in tropical soils.
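For reference, the frequency-dependence parameter used above is conventionally computed from susceptibility measured at the instrument's low and high operating frequencies; a standard formulation (assumed here, as the abstract does not spell it out) is:

    \chi_{fd}\% = 100 \times \frac{\chi_{lf} - \chi_{hf}}{\chi_{lf}}

where χlf and χhf are the mass-specific susceptibilities measured at the low and high frequencies, respectively.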
Abstract:
This study uses the reverse salient methodology to contrast subsystems in video game consoles in order to discover, characterize, and forecast the most significant technology gap. We build on the current methodologies for measuring the magnitude of Reverse Salience (Performance Gap and Time Gap) by showing the effectiveness of the Performance Gap Ratio (PGR). The three subject subsystems in this analysis are CPU Score, GPU core frequency, and video memory bandwidth. CPU Score is a metric developed for this project, defined as the product of core frequency, number of parallel cores, and instruction size. We measure the Performance Gap of each subsystem against concurrently available PC hardware on the market. Using PGR, we normalize the evolution of these technologies for comparative analysis. The results indicate that while CPU performance has historically been the Reverse Salient, video memory bandwidth has taken over as the fastest-growing technology gap in the current generation. Finally, we create a technology forecasting model that shows how much the video RAM bandwidth gap will grow through 2019 should the current trend continue. This analysis can assist console developers in allocating resources to the next generation of platforms, which will ultimately result in longer hardware life cycles.
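As an illustration of the metrics described above, the sketch below computes the CPU Score as the product of core frequency, core count, and instruction size, and a Performance Gap Ratio as the console value normalized by the concurrently available PC value; the exact normalization used in the study is not given in the abstract, so that part is an assumption, and the example values are hypothetical.

    # Illustrative sketch of the CPU Score and Performance Gap Ratio (PGR) metrics.
    # The PGR normalization (console relative to contemporary PC hardware) is assumed.

    def cpu_score(core_frequency_ghz: float, num_cores: int, instruction_size_bits: int) -> float:
        """CPU Score = core frequency x number of parallel cores x instruction size."""
        return core_frequency_ghz * num_cores * instruction_size_bits

    def performance_gap_ratio(console_value: float, pc_value: float) -> float:
        """Normalized gap between a console subsystem and concurrent PC hardware."""
        return console_value / pc_value

    # Hypothetical example values (not taken from the paper):
    console = cpu_score(core_frequency_ghz=1.6, num_cores=8, instruction_size_bits=64)
    pc = cpu_score(core_frequency_ghz=3.5, num_cores=8, instruction_size_bits=64)
    print(f"CPU Score PGR: {performance_gap_ratio(console, pc):.2f}")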
Abstract:
Existing process mining techniques provide summary views of the overall process performance over a period of time, allowing analysts to identify bottlenecks and associated performance issues. However, these tools are not designed to help analysts understand how bottlenecks form and dissolve over time, nor how the formation and dissolution of bottlenecks, and the associated fluctuations in demand and capacity, affect the overall process performance. This paper presents an approach to analyze the evolution of process performance via a notion of Staged Process Flow (SPF). An SPF abstracts a business process as a series of queues corresponding to stages. The paper defines a number of stage characteristics and visualizations that collectively allow process performance evolution to be analyzed from multiple perspectives. The approach has been implemented in the ProM process mining framework. The paper demonstrates the advantages of the SPF approach over state-of-the-art process performance mining tools using two publicly available real-life event logs.
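A minimal sketch of the staged-queue abstraction described above, assuming an event log with case id, stage, and entry/exit timestamps; the field names and log structure are illustrative assumptions, and the actual SPF stage characteristics and ProM implementation are not reproduced here.

    # Minimal sketch of a Staged Process Flow style view: each stage is treated as a
    # queue, and the number of cases "in" a stage at a given time is counted from an
    # event log. Field names and log structure are assumed for illustration.
    from datetime import datetime

    events = [
        # (case_id, stage, entered_at, exited_at) -- hypothetical log entries
        ("c1", "triage", datetime(2015, 1, 1, 9), datetime(2015, 1, 1, 10)),
        ("c2", "triage", datetime(2015, 1, 1, 9, 30), datetime(2015, 1, 1, 11)),
        ("c1", "treatment", datetime(2015, 1, 1, 10), datetime(2015, 1, 1, 12)),
    ]

    def cases_in_progress(stage: str, at: datetime) -> int:
        """Queue length of a stage at time `at` (cases that have entered but not yet exited)."""
        return sum(1 for _, s, start, end in events if s == stage and start <= at < end)

    print(cases_in_progress("triage", datetime(2015, 1, 1, 9, 45)))  # -> 2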
Abstract:
We demonstrate the distinct glassy transport phenomena associated with the phase-separated and spin-glass-like phases of La0.85Sr0.15CoO3, prepared under different heat-treatment conditions. The low-temperature-annealed (phase-separated) sample exhibits a smaller change in resistance with time than the high-temperature-annealed (spin-glass) one. However, the resistance change as a function of time is in both cases well described by a stretched-exponential fit, signifying slow dynamics. Moreover, the ultraviolet spectroscopy study evidences a relatively higher density of states in the vicinity of E_F (the Fermi level) for the low-temperature-annealed sample, consistent with its less semiconducting behavior.
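The stretched-exponential form referred to above is the standard Kohlrausch-Williams-Watts expression for slow relaxation; written generically (the fitted parameter values and the exact parameterization used in the paper are not given in the abstract):

    \frac{R(t) - R(\infty)}{R(0) - R(\infty)} = \exp\left[-\left(\frac{t}{\tau}\right)^{\beta}\right], \qquad 0 < \beta \le 1

where τ is the characteristic relaxation time and β is the stretching exponent.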
Abstract:
One of the current challenges in evolutionary ecology is understanding the long-term persistence of contemporary-evolving predator–prey interactions across space and time. To address this, we developed an extension of a multi-locus, multi-trait eco-evolutionary individual-based model that incorporates several interacting species in explicit landscapes. We simulated eco-evolutionary dynamics of multiple species food webs with different degrees of connectance across soil-moisture islands. A broad set of parameter combinations led to the local extinction of species, but some species persisted, and this was associated with (1) high connectance and omnivory and (2) ongoing evolution, due to multi-trait genetic variability of the embedded species. Furthermore, persistence was highest at intermediate island distances, likely because of a balance between predation-induced extinction (strongest at short island distances) and the coupling of island diversity by top predators, which by travelling among islands exert global top-down control of biodiversity. In the simulations with high genetic variation, we also found widespread trait evolutionary changes indicative of eco-evolutionary dynamics. We discuss how the ever-increasing computing power and high-resolution data availability will soon allow researchers to start bridging the in vivo–in silico gap.
Abstract:
A new way of managing the operation and maintenance (O&M) of public road networks, based on public-private partnership (PPP) contracts, is currently becoming established. Several factors are driving this trend. On the one hand, within the European Union there are serious budgetary constraints due to high public-sector debt, which is prompting a search for the best way to reduce public borrowing while continuing to provide services to society such as the O&M of road networks. In this respect, conventional road maintenance contracts are being converted into PPP schemes in which the availability risk of the road is transferred to the private sector through the use of quality-of-service and performance indicators. With this risk transfer, together with the transfer of demand/construction risk, the debt of the special purpose vehicle (SPV) set up to manage the PPP contract does not consolidate within the public accounts, so the public deficit does not increase and the service demanded by society can continue to be provided. On the other hand, the second motivation for developing this type of contract, less economic and more focused on management, is to use management contracts based on quality-of-service indicators to improve the performance of the road network for which an Administration is responsible. With these indicators, the manager has a very useful tool to monitor the activity of the private sector and to ensure that a good service is provided. In this thesis, the research focuses on the quality indicators related to the efficient management of roads maintained under private management contracts that use such tools for control, monitoring and management.
In its first part, the thesis examines the state of the road network, mainly in Spain, comparing it with other road networks in Europe and identifying its main shortcomings, especially with regard to pavement management and maintenance. In a second block, the thesis analyses the state of the art of the new O&M management procedures based on quality-of-service indicators around the world, noting that this is a relatively recent topic of great interest for the road management and financing sector. Because the approach is so new and previous experience is lacking, Administrations both in Spain and abroad have set very demanding thresholds for the quality-of-service indicators used to control road maintenance contracts. Building on this analysis, the thesis carries out a more detailed investigation of the quality-of-service indicators for bituminous pavements, because these indicators are the most delicate and decisive for proper long-term road management.
Among the bituminous pavement indicators, a specific model of the evolution of surface roughness over time has been developed; roughness is a basic parameter used by many Administrations and research bodies to track the evolution of a pavement over time. This methodology has been named the JRB Model for evaluating the economic rationality of quality indicators associated with pavement parameters. The proposed model essentially evaluates the economically optimal value of the technical parameter that defines a pavement property, applied to the definition of quality-of-service indicators. This view of the indicator threshold sets aside equity or other considerations and is based on an economic perspective. The JRB Model methodology can be applied to any pavement-related quality indicator, since its output is the economically optimal threshold value for that indicator. The JRB Model consists of several phases. In the early stages, the model calculates total transport costs using the HDM-IV software developed by the World Bank. In later stages, the model performs sensitivity analyses for different pavement sections, traffic volumes (AADT) and restrictions on the technical parameter that defines the quality-of-service indicator. As a practical exercise to test the JRB Model methodology, a case study was carried out: a theoretical road section with characteristics similar to the Spanish road network and a vehicle fleet similar to the Spanish one was taken, and surface roughness (the International Roughness Index, IRI) was chosen as the quality indicator. Using the sensitivity analyses performed with the JRB Model, the range of values that an IRI-based quality indicator should take in order to be optimal from an economic perspective was determined.
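As an illustration of the economic-optimum idea behind the JRB Model, the sketch below picks the IRI threshold that minimizes total transport cost, taken as the sum of agency (maintenance) and road-user costs as functions of the threshold. The cost functions are invented placeholders; in the thesis the real costs come from HDM-IV sensitivity runs.

    # Toy illustration of choosing an economically optimal IRI threshold: the real
    # costs in the thesis come from HDM-IV runs; these cost functions are placeholders.

    def agency_cost(iri_threshold: float) -> float:
        # A tighter (lower) IRI threshold means more frequent maintenance -> higher cost.
        return 100.0 / iri_threshold

    def user_cost(iri_threshold: float) -> float:
        # A looser (higher) IRI threshold means rougher roads -> higher vehicle operating cost.
        return 15.0 * iri_threshold

    candidates = [x / 10 for x in range(15, 51)]  # IRI thresholds from 1.5 to 5.0 m/km
    optimal = min(candidates, key=lambda iri: agency_cost(iri) + user_cost(iri))
    print(f"Economically optimal IRI threshold (toy data): {optimal:.1f} m/km")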
Abstract:
Research into the dynamicity of job performance criteria has found evidence suggesting the presence of rank-order changes to job performance scores across time as well as intraindividual trajectories in job performance scores across time. These findings have influenced a large body of research into (a) the dynamicity of validities of individual differences predictors of job performance and (b) the relationship between individual differences predictors of job performance and intraindividual trajectories of job performance. In the present dissertation, I addressed these issues within the context of the Five Factor Model of personality. The Five Factor Model is arranged hierarchically, with five broad higher-order factors subsuming a number of more narrowly tailored personality facets. Research has debated the relative merits of broad versus narrow traits for predicting job performance, but the entire body of research has addressed the issue from a static perspective -- by examining the relative magnitude of validities of global factors versus their facets. While research along these lines has been enlightening, theoretical perspectives suggest that the validities of global factors versus their facets may differ in their stability across time. Thus, research is needed to not only compare the relative magnitude of validities of global factors versus their facets at a single point in time, but also to compare the relative stability of validities of global factors versus their facets across time. Also necessary to advance cumulative knowledge concerning intraindividual performance trajectories is research into broad vs. narrow traits for predicting such trajectories. In the present dissertation, I addressed these issues using a four-year longitudinal design. The results indicated that the validities of global conscientiousness were stable across time, while the validities of conscientiousness facets were more likely to fluctuate. However, the validities of emotional stability and extraversion facets were no more likely to fluctuate across time than those of the factors. Finally, while some personality factors and facets predicted performance intercepts (i.e., performance at the first measurement occasion), my results failed to indicate a significant effect of any personality variable on performance growth. Implications for research and practice are discussed.
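For concreteness, the performance intercepts and growth referred to above correspond to a standard latent growth model; a generic formulation (not the dissertation's exact specification) is:

    y_{it} = \pi_{0i} + \pi_{1i}\, t_{it} + \varepsilon_{it}, \qquad
    \pi_{0i} = \gamma_{00} + \gamma_{01}\, x_{i} + u_{0i}, \qquad
    \pi_{1i} = \gamma_{10} + \gamma_{11}\, x_{i} + u_{1i}

where y_{it} is the job performance of person i at occasion t, π_{0i} and π_{1i} are the person-specific intercept and growth slope, and x_i is a personality factor or facet score whose effects on the intercept (γ_{01}) and on growth (γ_{11}) are the quantities of interest.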
Abstract:
3D motion capture is a medium that plots motion, typically human motion, converting it into a form that can be represented digitally. It is a fast-evolving field, and recent inertial technology may provide new artistic possibilities for its use in live performance. Although not often used in this context, motion capture has a combination of attributes that can provide unique forms of collaboration with the performance arts. The inertial motion capture suit used for this study has orientation sensors placed at strategic points on the body to map body motion. Its portability, real-time performance, ease of use, and its immunity from the line-of-sight problems inherent in optical systems suggest it would work well as a live performance technology. Many animation techniques can be used in real time. This research examines a broad cross-section of these techniques using four practice-led cases to assess the suitability of inertial motion capture for live performance. Although each case explores different visual possibilities, all make use of the performativity of the medium, using either an improvisational format or interactivity among stage, audience and screen that would be difficult to emulate any other way. A real-time environment cannot reproduce the depth and sophistication of the pre-rendered animation people have come to expect through media, which takes many hours to render. In time, the combination of what can be produced in real time and the tools available in a 3D environment will no doubt create its own tree of aesthetic directions in live performance. The case studies look at the potential for interactivity that this technology offers.
Abstract:
When communicating emotion in music, composers and performers encode their expressive intentions through the control of basic musical features such as pitch, loudness, timbre, mode, and articulation. The extent to which emotion can be controlled through the systematic manipulation of these features has not been fully examined. In this paper we present CMERS, a Computational Music Emotion Rule System for the control of perceived musical emotion that modifies features at the levels of score and performance in real time. CMERS's performance was evaluated in two rounds of perceptual testing. In Experiment I, 20 participants continuously rated the perceived emotion of 15 music samples generated by CMERS. Three musical works were used, each with five emotional variations (normal, happy, sad, angry, and tender). The emotion intended by CMERS was correctly identified 78% of the time, with significant shifts in valence and arousal also recorded, regardless of each work's original emotion.
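As a hedged illustration of rule-based feature manipulation of the kind described above, the sketch below maps a target emotion to adjustments of a few score- and performance-level features; the rule values and feature set are invented for illustration and are not the published CMERS rules.

    # Illustrative rule table mapping a target emotion to feature adjustments.
    # The adjustment values are invented; they are not the published CMERS rules.
    RULES = {
        # emotion: (mode, tempo scale, loudness change in dB, articulation)
        "happy":  ("major", 1.15, +3.0, "staccato"),
        "sad":    ("minor", 0.80, -4.0, "legato"),
        "angry":  ("minor", 1.20, +6.0, "staccato"),
        "tender": ("major", 0.85, -3.0, "legato"),
    }

    def apply_rules(emotion: str, base_tempo_bpm: float, base_loudness_db: float) -> dict:
        mode, tempo_scale, loudness_delta, articulation = RULES[emotion]
        return {
            "mode": mode,
            "tempo_bpm": base_tempo_bpm * tempo_scale,
            "loudness_db": base_loudness_db + loudness_delta,
            "articulation": articulation,
        }

    print(apply_rules("sad", base_tempo_bpm=120, base_loudness_db=70))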
Abstract:
Differential axial deformation between column elements and shear wall elements of cores increases with building height and geometric complexity. Adverse effects due to this differential axial deformation reduce building performance and lifetime serviceability. Quantifying axial deformations from ambient measurements taken with vibrating-wire, external mechanical and electronic strain gauges, in order to provide adequate provisions to mitigate the adverse effects, is a well-established method. However, these gauges must be installed in or on elements to acquire continuous measurements, which makes their use uneconomical and inconvenient. This motivates the development of an alternative method to quantify axial deformations. This paper proposes an innovative method based on modal parameters to quantify the axial deformations of shear wall elements in the cores of buildings. The capabilities of the method are presented through an illustrative example.
Abstract:
Facial expression is an important channel for human communication and can be applied in many real applications. One critical step for facial expression recognition (FER) is to accurately extract emotional features. Current approaches to FER in static images have not fully considered and utilized the features of facial element and muscle movements, which represent the static and dynamic, as well as geometric and appearance, characteristics of facial expressions. This paper proposes an approach to address this limitation using 'salient' distance features, which are obtained by extracting patch-based 3D Gabor features, selecting the 'salient' patches, and performing patch matching operations. The experimental results demonstrate a high correct recognition rate (CRR), significant performance improvements due to the consideration of facial element and muscle movements, promising results under face registration errors, and fast processing time. The comparison with the state-of-the-art performance confirms that the proposed approach achieves the highest CRR on the JAFFE database and is among the top performers on the Cohn-Kanade (CK) database.
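A minimal sketch of the patch-based Gabor feature idea referred to above, assuming registered grayscale face images and a small 2D Gabor filter bank; the paper's 3D Gabor features, salient-patch selection and patch matching steps are not reproduced here, and the filter-bank and patch-size parameters are illustrative assumptions.

    # Sketch: extract Gabor filter responses for small patches of a face image.
    # The filter-bank parameters and patch size are illustrative assumptions.
    import numpy as np
    from skimage.filters import gabor

    def patch_gabor_features(image: np.ndarray, patch_size: int = 16) -> np.ndarray:
        """Mean Gabor magnitude per patch, over a small bank of frequencies and orientations."""
        features = []
        for frequency in (0.1, 0.2, 0.3):
            for theta in np.linspace(0, np.pi, 4, endpoint=False):
                real, imag = gabor(image, frequency=frequency, theta=theta)
                magnitude = np.hypot(real, imag)
                # Average the response over non-overlapping patches.
                h, w = magnitude.shape
                patches = [
                    magnitude[r:r + patch_size, c:c + patch_size].mean()
                    for r in range(0, h - patch_size + 1, patch_size)
                    for c in range(0, w - patch_size + 1, patch_size)
                ]
                features.append(patches)
        return np.asarray(features).T  # one row of filter responses per patch

    face = np.random.rand(64, 64)  # stand-in for a registered grayscale face image
    print(patch_gabor_features(face).shape)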
Abstract:
Theoretical foundations of higher-order spectral analysis are revisited to examine the use of time-varying bicoherence on non-stationary signals using a classical short-time Fourier approach. A methodology is developed to apply this to evoked EEG responses where a stimulus-locked time reference is available. Short-time windowed ensembles of the response at the same offset from the reference are considered as ergodic cyclostationary processes within a non-stationary random process. Bicoherence can be estimated reliably, with known levels at which it is significantly different from zero, and can be tracked as a function of offset from the stimulus. When this methodology is applied to multi-channel EEG, it is possible to obtain information about phase synchronization at different regions of the brain as the neural response develops. The methodology is applied to analyze the evoked EEG response to flash visual stimuli presented to the left and right eye separately. The EEG electrode array is segmented based on bicoherence evolution with time, using the mean absolute difference as a measure of dissimilarity. Segment maps confirm the importance of the occipital region in visual processing and demonstrate a link between the frontal and occipital regions during the response. Maps are constructed using bicoherence at bifrequencies that include the alpha-band frequency of 8 Hz as well as 4 and 20 Hz. Differences are observed between responses from the left eye and the right eye, and also between subjects. The methodology shows potential as a neurological functional imaging technique that can be further developed for diagnosis and monitoring using scalp EEG, which is less invasive and less expensive than magnetic resonance imaging.
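For reference, a common normalized bicoherence estimator over an ensemble of stimulus-locked windows is sketched below; this is one plausible (Kim-and-Powers style) formulation, and the paper's exact estimator, windowing and significance thresholds are not given in the abstract.

    # Sketch of an ensemble bicoherence estimate at a single bifrequency (f1, f2),
    # computed from stimulus-locked windowed epochs at the same offset from the stimulus.
    # Normalization follows the common Kim-and-Powers style estimator.
    import numpy as np

    def bicoherence(epochs: np.ndarray, fs: float, f1: float, f2: float) -> float:
        """epochs: array of shape (n_epochs, n_samples), all at the same stimulus offset."""
        n = epochs.shape[1]
        spectra = np.fft.rfft(epochs * np.hanning(n), axis=1)
        freqs = np.fft.rfftfreq(n, d=1.0 / fs)
        i1 = np.argmin(np.abs(freqs - f1))
        i2 = np.argmin(np.abs(freqs - f2))
        i3 = np.argmin(np.abs(freqs - (f1 + f2)))
        triple = spectra[:, i1] * spectra[:, i2] * np.conj(spectra[:, i3])
        num = np.abs(triple.mean()) ** 2
        den = np.mean(np.abs(spectra[:, i1] * spectra[:, i2]) ** 2) * np.mean(np.abs(spectra[:, i3]) ** 2)
        return float(np.sqrt(num / den))

    rng = np.random.default_rng(0)
    epochs = rng.standard_normal((100, 256))  # stand-in for EEG epochs at one offset
    print(bicoherence(epochs, fs=256.0, f1=8.0, f2=4.0))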
Abstract:
Twitter is the focus of much research attention, both in traditional academic circles and in commercial market and media research, as analytics give increasing insight into the performance of the platform in areas as diverse as political communication, crisis management, television audiencing and other industries. While methods for tracking Twitter keywords and hashtags have developed apace and are well documented, the make-up of the Twitter user base and its evolution over time have been less understood to date. Recent research efforts have taken advantage of functionality provided by Twitter's Application Programming Interface to develop methodologies to extract information that allows us to understand the growth of Twitter, its geographic spread and the processes by which particular Twitter users have attracted followers. From politicians to sporting teams, and from YouTube personalities to reality television stars, this technique enables us to gain an understanding of what prompts users to follow others on Twitter. This article outlines how we came upon this approach, describes the method we adopted to produce accession graphs and discusses their use in Twitter research. It also addresses the wider ethical implications of social network analytics, particularly in the context of a detailed study of the Twitter user base.
Abstract:
Australian forest industries have a long history of export trade in a wide range of products, from woodchips (for paper manufacturing) and sandalwood (essential oils, carving and incense) to high-value musical instruments, flooring and outdoor furniture. For the high-value group, fluctuating environmental conditions brought on by changes in temperature and relative humidity can lead to performance problems due to consequential swelling, shrinkage and/or distortion of the wood elements. A survey determined the types of value-added products exported, including species and dimensions, packaging used and export markets. Data loggers were installed with shipments to monitor temperature and relative humidity conditions. These data were converted to timber equilibrium moisture content values to provide an indication of the environment to which the wood elements would be acclimatising. The results of the initial survey indicated that the primary high-value wood export products included guitars, flooring, decking and outdoor furniture. The destination markets were mainly located in the northern hemisphere, particularly the United States of America, China, Hong Kong, Europe (including the United Kingdom), Japan, Korea and the Middle East. Other regions importing Australian-made wooden articles were south-east Asia, New Zealand and South Africa. Different timber species have differing rates of swelling and shrinkage, so the types of timber were also recorded during the survey. Results from this work determined that the major species were ash-type eucalypts from south-eastern Australia (commonly referred to in the market as Tasmanian oak), jarrah from Western Australia, and spotted gum, hoop pine, white cypress, blackbutt, brush box and Sydney blue gum from Queensland and New South Wales. The environmental conditions data indicated that microclimates in shipping containers can fluctuate extensively during shipping. Conditions at the time of manufacturing were usually between 10 and 12% equilibrium moisture content; however, conditions during shipping could range from 5% (very dry) to 20% (very humid). The packaging systems incorporated were reported to be efficient at protecting the wooden articles from damage during transit. The research highlighted the potential risk for wood components to 'move' in response to periods of drier or more humid conditions than those at the time of manufacturing, and the importance of engineering a packaging system that can account for the environmental conditions experienced in shipping containers. Examples of potential dimensional changes in wooden components were calculated based on published unit shrinkage data for key species and the climatic data returned from the logging equipment. The information highlighted the importance of good design to account for possible timber movement during shipping. A timber movement calculator was developed to allow designers to input component species, dimensions, site of manufacture and destination, in order to validate their product design. This calculator forms part of the free interactive website www.timbers.com.au.
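A hedged sketch of the dimensional-change calculation behind such a timber movement calculator: movement is estimated as the component dimension multiplied by the species' unit shrinkage (percent dimensional change per 1% change in moisture content) and the expected change in equilibrium moisture content. The unit-shrinkage value and example figures below are placeholders, not published species data.

    # Toy dimensional-change estimate: dimension x unit shrinkage (% per 1% EMC change)
    # x change in equilibrium moisture content. The unit-shrinkage value is a placeholder.

    def estimated_movement_mm(dimension_mm: float, unit_shrinkage_pct_per_pct: float,
                              emc_at_manufacture: float, emc_at_destination: float) -> float:
        delta_emc = emc_at_destination - emc_at_manufacture
        return dimension_mm * (unit_shrinkage_pct_per_pct / 100.0) * delta_emc

    # Hypothetical example: a 90 mm wide board with unit shrinkage of 0.3% per 1% EMC change,
    # manufactured at 11% EMC and shipped into a 16% EMC environment.
    print(f"{estimated_movement_mm(90, 0.3, 11, 16):.2f} mm expansion")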
Abstract:
In this paper, we present an analysis of the bit error rate (BER) performance of space-time block codes (STBC) from generalized complex orthogonal designs for M-PSK modulation. In STBCs from complex orthogonal designs (COD), the norms of the column vectors are the same (e.g., the Alamouti code). However, in generalized COD (GCOD), the norms of the column vectors may not necessarily be the same (e.g., the rate-3/5 and rate-7/11 codes by Su and Xia in [1]). STBCs from GCOD are of interest because of the high rates that they can achieve (in [2], it has been shown that the maximum achievable rate for STBCs from GCOD is bounded by 4/5). While the BER performance of STBCs from COD (e.g., the Alamouti code) can be simply obtained from existing analytical expressions for receive diversity with the same diversity order by appropriately scaling the SNR, this cannot be done for STBCs from GCOD because of the unequal norms of the column vectors. Our contribution in this paper is that we derive analytical expressions for the BER performance of any STBC from GCOD. Our BER analysis for GCOD captures the performance of STBCs from COD as special cases. We validate our results with two STBCs from GCOD reported by Su and Xia in [1], for 5 and 6 transmit antennas (G(5) and G(6) in [1]) with rates 7/11 and 3/5, respectively.
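For the COD special case mentioned above (the Alamouti code with one receive antenna and BPSK over flat Rayleigh fading), the scaled-SNR receive-diversity argument yields the familiar textbook closed form below; this is given for illustration only and is not the paper's GCOD analysis.

    P_b = \left(\frac{1-\mu}{2}\right)^{2}\left(2 + \mu\right), \qquad
    \mu = \sqrt{\frac{\bar{\gamma}/2}{1 + \bar{\gamma}/2}}

where \bar{\gamma} is the average receive SNR and the factor of 1/2 accounts for splitting the transmit power over the two transmit antennas; this is the two-branch maximal-ratio-combining expression evaluated at half the SNR.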