187 results for Loaders (Machines)


Relevance:

10.00%

Publisher:

Abstract:

As the graphics race subsides and gamers grow weary of predictable and deterministic game characters, game developers must put aside their “old faithful” finite state machines and look to more advanced techniques that give users the gaming experience they crave. The next industry breakthrough will come from characters that behave realistically and that can learn and adapt, rather than from more polygons, higher-resolution textures and more frames per second. This paper explores the artificial intelligence techniques currently used by game developers, as well as techniques that are new to the industry. The techniques covered are finite state machines, scripting, agents, flocking, fuzzy logic and fuzzy state machines, decision trees, neural networks, genetic algorithms and extensible AI. The paper introduces each of these techniques, explains how they can be applied to games and how commercial games currently make use of them. Finally, the effectiveness of these techniques and their future role in the industry are evaluated.
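The finite state machines that the paper treats as the industry's “old faithful” can be illustrated with a minimal sketch; the guard character, its states (patrol, chase, flee) and the triggering events below are invented for illustration and are not drawn from any game discussed in the paper.

```python
# Minimal finite state machine for a game character (illustrative states only).
class GuardFSM:
    def __init__(self):
        self.state = "patrol"
        # Transition table: (current state, event) -> next state.
        self.transitions = {
            ("patrol", "enemy_spotted"): "chase",
            ("chase", "low_health"): "flee",
            ("chase", "enemy_lost"): "patrol",
            ("flee", "safe"): "patrol",
        }

    def handle(self, event):
        # Unknown (state, event) pairs leave the state unchanged.
        self.state = self.transitions.get((self.state, event), self.state)
        return self.state


if __name__ == "__main__":
    guard = GuardFSM()
    for event in ["enemy_spotted", "low_health", "safe"]:
        print(event, "->", guard.handle(event))
```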

Relevance:

10.00%

Publisher:

Abstract:

We have used microarray gene expression profiling and machine learning to predict the presence of BRAF mutations in a panel of 61 melanoma cell lines. The BRAF gene was found to be mutated in 42 samples (69%), and intragenic mutations of the NRAS gene were detected in seven samples (11%). No cell line carried mutations of both genes. Using support vector machines, we built a classifier that differentiates between melanoma cell lines on the basis of BRAF mutation status. As few as 83 genes are able to discriminate between BRAF-mutant and BRAF wild-type samples, with clear separation observed using hierarchical clustering. Multidimensional scaling was used to visualize the relationship between the BRAF mutation signature and that of generalized mitogen-activated protein kinase (MAPK) activation (either BRAF or NRAS mutation) in the context of the discriminating gene list. We observed that samples carrying NRAS mutations lie between those with and without BRAF mutations. These observations suggest that, in addition to a common MAPK activation signal, there are gene-specific mutation signals resulting from the pleiotropic effects of BRAF or NRAS on other signaling pathways, leading to measurably different transcriptional changes.
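As a hedged sketch of the kind of classifier described, the following trains a linear support vector machine on expression profiles labelled by mutation status; the synthetic data, gene count and cross-validation setup are placeholders, not the study's 61 cell lines or its 83-gene signature.

```python
# Sketch: linear SVM separating BRAF-mutant from wild-type expression profiles.
# Data here are synthetic stand-ins for microarray measurements.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import LinearSVC

rng = np.random.default_rng(0)
n_samples, n_genes = 61, 500            # placeholder dimensions
X = rng.normal(size=(n_samples, n_genes))
y = rng.integers(0, 2, size=n_samples)  # 1 = BRAF mutant, 0 = wild type (synthetic)
X[y == 1, :50] += 1.0                   # pretend 50 genes carry the mutation signal

clf = make_pipeline(StandardScaler(), LinearSVC(C=1.0, dual=False))
scores = cross_val_score(clf, X, y, cv=5)
print("cross-validated accuracy: %.2f +/- %.2f" % (scores.mean(), scores.std()))
```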

Relevance:

10.00%

Publisher:

Abstract:

Communication disorders are usually accommodated in information retrieval systems, such as those found on the Web, through interfaces that rely on modalities which do not involve reading and writing. Few applications exist to help users who struggle with the textual modality itself. We propose taking phonological awareness into account in order to assist users who have difficulty writing queries (dysorthographia) or reading documents (dyslexia). First, a system for rewriting and interpreting the queries typed by the user is proposed: drawing on the causes of dysorthographia and on the examples at our disposal, it appeared that a system combining an editorial approach (in the style of a spelling checker) with an oral approach (an automatic transcription system) was more appropriate. Second, a machine learning method uses specific criteria, such as grapho-phonemic cohesion, to estimate the readability of a sentence, and then of a text. Most applications intend to help disabled users in the information retrieval process by proposing non-textual modalities. This paper introduces specific parameters linked to phonological awareness in the textual modality. This enhances the ability of systems to deal with orthographic issues and with the adaptation of results to the reader, for example when the reader is dyslexic. We propose a phonology-based, sentence-level rewriting system that combines spelling correction, speech synthesis and automatic speech recognition; it has been evaluated on a corpus of questions collected from dyslexic children. We also propose a sentence readability measure that involves phonetic parameters such as grapho-phonemic cohesion, learned on a corpus of reading times of sentences read by dyslexic children.
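The readability measure is described only at a high level; as an assumed illustration, the sketch below fits a simple regularised linear model that predicts reading time from sentence-level features such as a grapho-phonemic cohesion score. The feature set and values are invented and do not reproduce the paper's criteria.

```python
# Sketch: predicting reading time from phonetic/lexical sentence features.
# Features and values are placeholders for the paper's grapho-phonemic criteria.
import numpy as np
from sklearn.linear_model import Ridge

# Each row: [grapho-phonemic cohesion, mean word length, sentence length in words]
X = np.array([
    [0.9, 4.2, 7],
    [0.6, 5.1, 12],
    [0.4, 6.3, 15],
    [0.8, 4.8, 9],
])
reading_time_s = np.array([3.1, 6.4, 9.8, 4.0])  # hypothetical reading times

model = Ridge(alpha=1.0).fit(X, reading_time_s)
new_sentence = np.array([[0.5, 5.5, 13]])
print("predicted reading time (s):", model.predict(new_sentence)[0])
```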

Relevance:

10.00%

Publisher:

Abstract:

This research explores music in space, as experienced through performing and music-making with interactive systems. It explores how musical parameters may be presented spatially and displayed visually with a view to their exploration by a musician during performance. Spatial arrangements of musical components, especially pitches and harmonies, have been widely studied in the literature, but the current capabilities of interactive systems allow the improvisational exploration of these musical spaces as part of a performance practice. This research focuses on quantised spatial organisation of musical parameters that can be categorised as grid music systems (GMSs), and interactive music systems based on them. The research explores and surveys existing and historical uses of GMSs, and develops and demonstrates the use of a novel grid music system designed for whole-body interaction. Grid music systems allow spatialised input to be plotted on a two-dimensional grid layout to construct patterned music. GMSs are navigated to construct a sequence of parametric steps, for example a series of pitches, rhythmic values, a chord sequence, or terraced dynamic steps. While they are conceptually simple when only controlling one musical dimension, grid systems may be layered to enable complex and satisfying musical results. These systems have proved a viable, effective, accessible and engaging means of music-making for the general user as well as the musician. GMSs have been widely used in electronic and digital music technologies, where they have generally been applied to small portable devices and software systems such as step sequencers and drum machines. This research shows that by scaling up a grid music system, music-making and musical improvisation are enhanced, gaining several advantages: (1) Full-body location becomes the spatial input to the grid. The system becomes a partially immersive one in four related ways: spatially, graphically, sonically and musically. (2) Detection of body location by tracking enables hands-free operation, thereby allowing the playing of a musical instrument in addition to “playing” the grid system. (3) Visual information regarding musical parameters may be enhanced so that the performer may fully engage with existing spatial knowledge of musical materials. The result is that existing spatial knowledge is overlaid on, and combined with, music-space. Music-space is a new concept produced by the research, and is similar to notions of other musical spaces including soundscape, acoustic space, Smalley's “circumspace” and “immersive space” (2007, 48-52), and Lotis's “ambiophony” (2003), but is rather more textural and “alive”, and therefore very conducive to interaction. Music-space is that space occupied by music, set within normal space, which may be perceived by a person located within, or moving around in, that space. Music-space has a perceivable “texture” made of tensions and relaxations, and contains spatial patterns of these formed by musical elements such as notes, harmonies, and sounds, changing over time. The music may be performed by live musicians, created electronically, or be prerecorded. Large-scale GMSs have the capability not only to interactively display musical information as music-representative space, but to allow music-space to co-exist with it. Moving around the grid, the performer may interact in real time with musical materials in music-space, as they form over squares or move in paths.
Additionally, the performer may sense the textural matrix of the music-space while being immersed in surround sound covering the grid. The HarmonyGrid is a new computer-based interactive performance system developed during this research that provides a generative music-making system intended to accompany, or play along with, an improvising musician. This large-scale GMS employs full-body motion tracking over a projected grid. Playing with the system creates an enhanced performance employing live interactive music, along with graphical and spatial activity. Although one other experimental system provides certain aspects of immersive music-making, currently only the HarmonyGrid provides an environment in which to explore and experience music-space in a GMS.
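To make the grid idea concrete, here is a minimal step-sequencer-style sketch in which a tracked floor position selects a grid cell and the cell's row is mapped to a pitch; the grid dimensions, scale and floor size are assumptions, not the HarmonyGrid's actual design.

```python
# Sketch: mapping a tracked 2D position to a cell on a pitch grid (step-sequencer style).
# Grid size, scale and coordinate ranges are illustrative, not the HarmonyGrid's.
C_MAJOR = [60, 62, 64, 65, 67, 69, 71, 72]   # MIDI note numbers, one per grid row

GRID_COLS, GRID_ROWS = 16, 8                  # 16 time steps x 8 pitches
FLOOR_W, FLOOR_H = 4.0, 2.0                   # tracked floor area in metres (assumed)

def position_to_cell(x, y):
    """Quantise a floor position (metres) to a (column, row) grid cell."""
    col = min(int(x / FLOOR_W * GRID_COLS), GRID_COLS - 1)
    row = min(int(y / FLOOR_H * GRID_ROWS), GRID_ROWS - 1)
    return col, row

def cell_to_note(col, row):
    """Return (time step, MIDI pitch) for a grid cell."""
    return col, C_MAJOR[row]

if __name__ == "__main__":
    step, pitch = cell_to_note(*position_to_cell(1.3, 0.9))
    print("trigger step", step, "with MIDI note", pitch)
```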

Relevance:

10.00%

Publisher:

Abstract:

Due to their large surface area, complex chemical composition and high alveolar deposition rate, ultrafine particles (UFPs) (< 0.1 µm) pose a significant risk to human health, and their toxicological effects have been acknowledged by the World Health Organisation. Since people spend most of their time indoors, there is a growing concern about the UFPs present in some indoor environments. Recent studies have shown that office machines, in particular laser printers, are a significant indoor source of UFPs. The majority of printer-generated UFPs are organic carbon and it is unlikely that these particles are emitted directly from the printer or its supplies (such as paper and toner powder). Thus, it was hypothesised that these UFPs are secondary organic aerosols (SOA). Considering the widespread use of printers and human exposure to these particles, understanding the processes involved in particle formation is of critical importance. However, few studies have investigated the nature (e.g. volatility, hygroscopicity, composition, size distribution and mixing state) and formation mechanisms of these particles. In order to address this gap in scientific knowledge, a comprehensive study including state-of-the-art instrumental methods was conducted to characterise the real-time emissions from modern commercial laser printers, including particles, volatile organic compounds (VOCs) and ozone (O3). The morphology, elemental composition, volatility and hygroscopicity of generated particles were also examined. The large set of experimental results was analysed and interpreted to provide insight into: (1) Emission profiles of laser printers: The results showed that UFPs dominated the number concentrations of generated particles, with a quasi-unimodal size distribution observed for all tests. These particles were volatile, non-hygroscopic and mixed both externally and internally. Particle microanalysis indicated that semi-volatile organic compounds occupied the dominant fraction of these particles, with only trace quantities of particles containing Ca and Fe. Furthermore, almost all laser printers tested in this study emitted measurable concentrations of VOCs and O3. A positive correlation between submicron particles and O3 concentrations, as well as a contrasting negative correlation between submicron particles and total VOC concentrations, were observed during printing for all tests. These results proved that UFPs generated from laser printers are mainly SOAs. (2) Sources and precursors of generated particles: In order to identify the possible particle sources, the particle formation potentials of both the printer components (e.g. fuser roller and lubricant oil) and supplies (e.g. paper and toner powder) were investigated using furnace tests. The VOCs emitted during the experiments were sampled and identified to provide information about particle precursors. The results suggested that all of the tested materials had the potential to generate particles upon heating. Nine unsaturated VOCs were identified from the emissions produced by paper and toner, which may contribute to the formation of UFPs through oxidation reactions with ozone. (3) Factors influencing particle emission: The factors influencing particle emissions were also investigated by comparing two popular laser printers, one showing particle emissions three orders of magnitude higher than the other.
The effects of toner coverage, printing history, type of paper and toner, and working temperature of the fuser roller on particle number emissions were examined. The results showed that the temperature of the fuser roller was a key factor driving the emission of particles. Based on the results for 30 different types of laser printers, a systematic positive correlation was observed between temperature and particle number emissions for printers that used the same heating technology and had a similar structure and fuser material. It was also found that temperature fluctuations were associated with intense bursts of particles and, therefore, may have an impact on particle emissions. Furthermore, the results indicated that the type of paper and toner powder contributed to particle emissions, while no apparent relationship was observed between toner coverage and levels of submicron particles. (4) Mechanisms of SOA formation, growth and ageing: The overall hypothesis that UFPs are formed by reactions between the VOCs and O3 emitted from laser printers was examined. The results proved this hypothesis and suggested that O3 may also play a role in particle ageing. In addition, knowledge about the mixing state of generated particles was utilised to explore the detailed processes of particle formation for different printing scenarios, including warm-up, normal printing, and printing without toner. The results indicated that polymerisation may have occurred on the surface of the generated particles to produce thermoplastic polymers, which may account for the expandable characteristics of some particles. Furthermore, toner and other particle residues on the idling belt from previous print jobs were a very clear contributing factor in the formation of laser printer-emitted particles. In summary, this study not only improves scientific understanding of the nature of printer-generated particles, but also provides significant insight into the formation and ageing mechanisms of SOAs in the indoor environment. The outcomes will also be beneficial to governments, industry and individuals.

Relevance:

10.00%

Publisher:

Abstract:

Software forms an important part of the interface between citizens and their government. An increasing number of government functions are being performed, controlled, or delivered electronically. This software, like all language, is never value-neutral, but must, to some extent, reflect the values of the coder and proprietor. The move that many governments are making towards e-governance, and the increasing reliance being placed upon software in government, necessitate a rethinking of the relationships of power and control that are embodied in software.

Relevance:

10.00%

Publisher:

Abstract:

The main aim of this thesis is to analyse and optimise a public hospital Emergency Department. The Emergency Department (ED) is a complex system with limited resources and a high demand for these resources. Adding to the complexity is the stochastic nature of almost every element and characteristic in the ED. The interaction with other functional areas also complicates the system, as these areas have a huge impact on the ED and the ED is powerless to change them. It is therefore imperative that Operations Research (OR) be applied to the ED to improve its performance within the constraints of the system. The main characteristics of the system to optimise included tardiness, adherence to waiting time targets, access block and length of stay. A validated and verified simulation model was built to model the real-life system. This enabled detailed analysis of resources and flow without disruption to the actual ED. A wide range of different policies for the ED and a variety of resources could be investigated. Of particular interest were the number and type of beds in the ED and the shift times of physicians. Notably, neither of these resources works in isolation, so both need to be investigated in tandem to optimise the system. The ED was likened to a flow shop scheduling problem, with the patients and beds being synonymous with the jobs and machines typically found in manufacturing problems. This enabled an analytic scheduling approach. Constructive heuristics were developed to reactively schedule the system in real time, and these were able to improve the performance of the system. Metaheuristics that optimised the system were also developed and analysed. An innovative hybrid Simulated Annealing and Tabu Search algorithm was developed that outperformed both simulated annealing and tabu search algorithms by combining some of their features. The new algorithm achieves a better solution and does so in a shorter time.
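The hybrid metaheuristic itself is not specified in the abstract; the sketch below shows one plausible way to combine simulated annealing's acceptance rule with a tabu list, applied to a toy tardiness-minimisation objective. The neighbourhood move, tabu tenure and cooling schedule are assumptions rather than the thesis's algorithm.

```python
# Sketch: simulated annealing with a tabu list on a toy permutation-scheduling problem.
# Objective, neighbourhood and parameters are illustrative assumptions.
import math
import random

random.seed(1)
processing_times = [7, 3, 9, 2, 5, 8]          # toy job durations

def total_tardiness(order, due_date=15):
    t, tardiness = 0, 0
    for job in order:
        t += processing_times[job]
        tardiness += max(0, t - due_date)
    return tardiness

def hybrid_sa_tabu(iters=2000, temp=10.0, cooling=0.995, tenure=20):
    current = list(range(len(processing_times)))
    best = current[:]
    tabu = []                                   # recently used swap moves
    for _ in range(iters):
        i, j = random.sample(range(len(current)), 2)
        move = (min(i, j), max(i, j))
        if move in tabu:
            continue                            # tabu move: skip it
        candidate = current[:]
        candidate[i], candidate[j] = candidate[j], candidate[i]
        delta = total_tardiness(candidate) - total_tardiness(current)
        # Simulated-annealing acceptance rule on non-tabu moves.
        if delta < 0 or random.random() < math.exp(-delta / temp):
            current = candidate
            tabu.append(move)
            tabu[:] = tabu[-tenure:]            # keep only the most recent moves
        if total_tardiness(current) < total_tardiness(best):
            best = current[:]
        temp *= cooling
    return best, total_tardiness(best)

print(hybrid_sa_tabu())
```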

Relevance:

10.00%

Publisher:

Abstract:

It is generally understood that the patent system exists to encourage the conception and disclosure of new and useful inventions embodied in machines and other physical devices, along with new methods that physically transform matter from one state to another. What is not well understood is whether, and to what extent, the patent system is to encourage and protect the conception and disclosure of inventions that are non-physical methods – namely those that do not result in a physical transformation of matter. This issue was considered in Grant v Commissioner of Patents. In that case the Full Court of the Federal Court of Australia held that an invention must involve a physical effect or transformation to be patentable subject matter. In doing so, it introduced a physicality requirement into Australian law. What this article seeks to establish is whether the court’s decision is consistent with the case law on point. It does so by examining the key common law cases that followed the High Court’s watershed decision in National Research Development Corporation v Commissioner of Patents, the undisputed authoritative statement of principle in regard to the patentable subject matter standard in Australia. This is done with a view to determining whether there is anything in those cases that supports the view that the Australian patentable subject matter test contains a physicality requirement.

Relevance:

10.00%

Publisher:

Abstract:

Contamination of packaged foods due to micro-organisms entering through air leaks can cause serious public health issues and cost companies large amounts of money due to product recalls, consumer impact and subsequent loss of market share. The main source of contamination is leaks in packaging, which allow air, moisture and micro-organisms to enter the package. In the food processing and packaging industry worldwide, there is an increasing demand for cost-effective, state-of-the-art inspection technologies capable of reliably detecting leaky seals and delivering products at six-sigma. This project develops non-destructive testing technology using digital imaging and sensing combined with a differential vacuum technique to assess the seal integrity of food packages on a high-speed production line. The cost of leaky packages to Australian food industries is estimated at close to AUD $35 million per year. Flexible plastic packages are widely used, and are the least expensive form of retaining the quality of the product. These packages can be used to seal, and therefore maximise, the shelf life of both dry and moist products. The seals of food packages need to be airtight so that the food content is not contaminated through contact with micro-organisms that enter as a result of air leakage. Airtight seals also extend the shelf life of packaged foods, and manufacturers attempt to prevent food products with leaky seals being sold to consumers. There are many current NDT (non-destructive testing) methods of testing the seals of flexible packages, but they are best suited to random sampling and laboratory purposes. The three most commonly used methods are vacuum/pressure decay, the bubble test, and helium leak detection. Although these methods can detect very fine leaks, they are limited by their high processing time and are not viable on a production line. Two non-destructive in-line packaging inspection machines are currently available and are discussed in the literature review. The detailed design and development of the High-Speed Sensing and Detection System (HSDS) is the fundamental requirement of this project and of the future prototype and production unit. Successful laboratory testing was completed, and a methodical design procedure was needed for a successful concept. The mechanical tests confirmed the vacuum hypothesis and seal integrity with good, consistent results. Electrically, the testing also provided solid results, enabling the researcher to move the project forward with a degree of confidence. The laboratory design testing allowed the researcher to confirm theoretical assumptions before moving into the detailed design phase. Discussion of the development of the alternative concepts in both the mechanical and electrical disciplines enables the researcher to make an informed decision. Each major mechanical and electrical component is detailed through the research and design process. The design procedure methodically works through the various major functions, from both a mechanical and an electrical perspective.
It opens up alternative ideas for the major components which, although sometimes not practical in this application, show that the researcher has exhausted all engineering and functionality options. Further concepts were then designed and developed for the entire HSDS unit based on previous practice and theory. It is envisaged that both the prototype and production versions of the HSDS would use standard, industrially available components, manufactured and distributed locally. Future research and testing of the prototype unit could result in a successful trial unit being incorporated into a working food processing production environment. Recommendations and future work are discussed, along with options in other food processing and packaging disciplines, and other areas of the non-food processing industry.
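As a back-of-envelope illustration of the pressure-decay principle mentioned among the existing NDT methods, the calculation below estimates a leak rate from the pressure rise in an evacuated package over a hold period; the volume, pressures and threshold idea are invented numbers, not HSDS design values.

```python
# Sketch: pressure-decay leak estimation for a sealed package (illustrative numbers only).
# Leak rate Q ~ V * dP / dt, with V the free volume and dP the pressure rise over dt.
PACKAGE_VOLUME_L = 0.5          # free headspace volume, litres (assumed)
P_START_KPA = 20.0              # absolute pressure after evacuation (assumed)
P_END_KPA = 23.5                # pressure after the hold period (assumed)
HOLD_TIME_S = 10.0

leak_rate = PACKAGE_VOLUME_L * (P_END_KPA - P_START_KPA) / HOLD_TIME_S
print("approximate leak rate: %.3f kPa*L/s" % leak_rate)
# A rate above a calibrated threshold would flag the seal as leaky on the line.
```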

Relevance:

10.00%

Publisher:

Abstract:

This presentation explores molarization and overcoding of social machines and relationality within an assemblage consisting of empirical data from immigrant families in Australia. Immigration is key to the sustainable development of Western societies such as Australia and Canada. Newly arrived immigrants enter a country and are literally taken over by the Ministry of Immigration with regard to housing, health, education and accessing job possibilities. If the immigrants do not know the official language(s) of the country, they enroll in language classes for new immigrants. Language classes do more than simply teach language. Language is presented in local contexts (celebrating the national day, what to do to get a job) and, in control societies, language classes foreground the values of a nation state in order for immigrants to integrate. In the current project, policy documents from Australia reveal that while immigration is the domain of government, the subject/immigrant is nevertheless at the core of policy. While support is provided, it is the transcendent view of the subject/immigrant that prevails. The onus remains on the immigrant to “succeed”. My perspective lies within transcendental empiricism and deploys Deleuzian ontology (how one might live) in order to examine how segmentary lines of power (pouvoir), reflected in policy documents and operationalized in language classes, rupture into lines of flight of nomad immigrants. The theoretical framework is Multiple Literacies Theory (MLT); reading is intensive and immanent. The participants are one Korean and one Sudanese family and their children, who have recently immigrated to Australia. Classroom observations were carried out, followed by interviews based on the observations. The families also borrowed small video cameras and filmed places, people and things relevant to them in terms of becoming citizen and immigrating to and living in a different country. Interviews followed. Rhizoanalysis informs the process of reading data. Rhizoanalysis is a research event performed with an assemblage (MLT, data/vignettes, researcher, etc.). It is a way to work with transgressive data. Based on the concept of the rhizome, a bloc of data has no beginning and no ending. A researcher enters in the middle and exits somewhere in the middle, an intermezzo, suggesting that the challenges to molar immigration lie in experimenting and creating molecular processes of becoming citizen.

Relevance:

10.00%

Publisher:

Abstract:

Vibration analysis has been a prime tool in condition monitoring of rotating machines; however, its application to internal combustion engines remains a challenge because engine vibration signatures are highly non-stationary and therefore unsuited to popular spectrum-based analysis. Signal-to-noise ratio is a main concern in engine signature analysis due to the severe background noise generated by consecutive mechanical events, such as combustion and valve opening and closing, especially in multi-cylinder engines. Acoustic Emission (AE) has been found to give an excellent signal-to-noise ratio, allowing discrimination of fine details of normal or abnormal events during a given cycle. AE has been used to detect faults such as exhaust valve leakage, fuel injection behaviour, and aspects of the combustion process. This paper presents a review of AE applications to diesel engine monitoring and a preliminary investigation of AE signatures measured on an 18-cylinder diesel engine. AE is compared with vibration acceleration for varying operating conditions: load and speed. Frequency characteristics of AE from those events are analysed in the time-frequency domain via the short-time Fourier transform. The results show the great potential of AE analysis for the detection of various defects in diesel engines.
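As an illustration of the time-frequency analysis mentioned, the sketch below computes a short-time Fourier transform of a synthetic burst-like signal using SciPy; the sampling rate, burst frequency and window length are placeholders, not values from the 18-cylinder engine measurements.

```python
# Sketch: short-time Fourier transform of a synthetic AE-like burst signal.
# Sampling rate, burst parameters and window length are illustrative only.
import numpy as np
from scipy.signal import stft

fs = 1_000_000                          # 1 MHz sampling rate (assumed)
t = np.arange(0, 0.01, 1 / fs)          # 10 ms record
signal = 0.01 * np.random.randn(t.size) # background noise
burst = (t > 0.004) & (t < 0.0045)      # a short burst around 4 ms
signal[burst] += np.sin(2 * np.pi * 150e3 * t[burst]) * np.hanning(burst.sum())

f, times, Zxx = stft(signal, fs=fs, nperseg=256)
peak = np.unravel_index(np.abs(Zxx).argmax(), Zxx.shape)
print("strongest component: %.0f kHz at %.2f ms" % (f[peak[0]] / 1e3, times[peak[1]] * 1e3))
```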

Relevance:

10.00%

Publisher:

Abstract:

A Multimodal Seaport Container Terminal (MSCT) is a complex system which requires careful planning and control in order to operate efficiently. It consists of a number of subsystems that require optimisation of the operations within them, as well as synchronisation of machines and containers between the various subsystems. Inefficiency in the terminal can delay ships from their scheduled timetables, as well as cause delays in delivering containers to their inland destinations, both of which can be very costly to their operators. The purpose of this PhD thesis is to use Operations Research methodologies to optimise and synchronise these subsystems as an integrated application. An initial model is developed for the overall MSCT; however, due to the large number of assumptions that had to be made, as well as other issues, it is found to be too inaccurate and infeasible for practical use. Instead, a method is proposed for developing models of each subsystem that can then be integrated with one another. Mathematical models are developed for the Storage Area System (SAS) and the Intra-terminal Transportation System (ITTS). The SAS deals with the movement and assignment of containers to stacks within the storage area, both when they arrive and when they are rehandled to retrieve containers below them. The ITTS deals with scheduling the movement of containers and machines between the storage areas and other sections of the terminal, such as the berth and road/rail terminals. Various constructive heuristics are explored and compared for these models to produce good initial solutions for large-sized problems, which are otherwise impractical to compute by exact methods. These initial solutions are further improved through the use of an innovative hyper-heuristic algorithm that integrates the SAS and ITTS solutions and optimises them through meta-heuristic techniques. The method by which the two models interact with each other as an integrated system is discussed, as well as how this method can be extended to the other subsystems of the MSCT.
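The constructive heuristics themselves are not given in the abstract; as an assumed illustration of the storage-area problem, the sketch below applies a simple greedy rule that places each arriving container on a stack whose top container departs after it, to reduce later rehandles. Stack capacity, departure times and the tie-breaking rule are invented.

```python
# Sketch: greedy stack assignment for arriving containers (illustrative rule only).
# Place a container on a stack whose top departs after it, to avoid future rehandles.
MAX_HEIGHT = 4

def assign_stack(stacks, departure):
    """stacks: list of lists of departure times (bottom..top). Returns chosen stack index."""
    candidates = [
        i for i, s in enumerate(stacks)
        if len(s) < MAX_HEIGHT and (not s or s[-1] >= departure)
    ]
    if candidates:
        # Prefer the stack whose top departure is closest to (but not before) ours.
        best = min(candidates, key=lambda i: stacks[i][-1] if stacks[i] else float("inf"))
    else:
        # Fall back to the shortest non-full stack (a rehandle may be needed later).
        best = min((i for i, s in enumerate(stacks) if len(s) < MAX_HEIGHT),
                   key=lambda i: len(stacks[i]))
    stacks[best].append(departure)
    return best

if __name__ == "__main__":
    yard = [[], [], []]
    for dep in [40, 10, 35, 5, 50]:
        print("container departing at t=%d -> stack %d" % (dep, assign_stack(yard, dep)))
    print("yard:", yard)
```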

Relevance:

10.00%

Publisher:

Abstract:

Large margin learning approaches, such as support vector machines (SVM), have been successfully applied to numerous classification tasks, especially automatic facial expression recognition. The risk with such approaches, however, is their sensitivity to large margin losses caused by noisy training examples and outliers, which are a common problem in affective computing (i.e., manual coding at the frame level is tedious, so coarse labels are normally assigned). In this paper, we leverage the relaxation of the parallel-hyperplanes constraint and propose the use of modified correlation filters (MCF). The MCF is similar in spirit to SVMs and correlation filters, but with the key difference of optimizing only a single hyperplane. We demonstrate the superiority of MCF over current techniques in a battery of experiments.
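The MCF formulation is not given here; as a rough stand-in for the single-hyperplane idea, the sketch below fits one hyperplane in closed form by regularised least squares against ±1 targets (a correlation-filter-like solution) and compares it with a standard linear SVM on synthetic data. The data and regularisation constant are placeholders, and this closed-form fit is not the authors' MCF.

```python
# Sketch: a single hyperplane fit in closed form (ridge regression to +/-1 targets),
# alongside a standard linear SVM for comparison. Data are synthetic placeholders.
import numpy as np
from sklearn.svm import LinearSVC

rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-1, 1, (50, 20)), rng.normal(+1, 1, (50, 20))])
y = np.array([-1] * 50 + [+1] * 50)

lam = 1.0
# Closed-form single-hyperplane solution: w = (X^T X + lam I)^{-1} X^T y
w = np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ y)
ridge_acc = np.mean(np.sign(X @ w) == y)

svm_acc = LinearSVC(C=1.0, dual=False).fit(X, y).score(X, y)
print("single-hyperplane (ridge) accuracy: %.2f, SVM accuracy: %.2f" % (ridge_acc, svm_acc))
```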

Relevance:

10.00%

Publisher:

Abstract:

A new system is described for estimating volume from a series of multiplanar 2D ultrasound images. Ultrasound images are captured using a personal computer video digitizing card and an electromagnetic localization system is used to record the pose of the ultrasound images. The accuracy of the system was assessed by scanning four groups of ten cadaveric kidneys on four different ultrasound machines. Scan image planes were oriented either radially, in parallel or slanted at 30° to the vertical. The cross-sectional images of the kidneys were traced using a mouse and the outline points transformed to 3D space using the Fastrak position and orientation data. Points on adjacent region of interest outlines were connected to form a triangle mesh and the volume of the kidneys estimated using the ellipsoid, planimetry, tetrahedral and ray tracing methods. There was little difference between the results for the different scan techniques or volume estimation algorithms, although, perhaps as expected, the ellipsoid results were the least precise. For radial scanning and ray tracing, the mean and standard deviation of the percentage errors for the four different machines were as follows: Hitachi EUB-240, −3.0 ± 2.7%; Tosbee RM3, −0.1 ± 2.3%; Hitachi EUB-415, 0.2 ± 2.3%; Acuson, 2.7 ± 2.3%.
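The tetrahedral volume method named in the abstract can be illustrated with the standard signed-tetrahedron (divergence theorem) formula for a closed triangle mesh; the unit cube below is a toy stand-in for a reconstructed kidney surface.

```python
# Sketch: volume of a closed triangle mesh from signed tetrahedra (divergence theorem).
# The unit cube here stands in for a reconstructed organ surface.
import numpy as np

def mesh_volume(vertices, triangles):
    """Sum the signed volumes of tetrahedra formed by each face and the origin."""
    v = np.asarray(vertices, dtype=float)
    total = 0.0
    for a, b, c in triangles:
        total += np.dot(v[a], np.cross(v[b], v[c])) / 6.0
    return abs(total)

# Unit cube: 8 vertices, 12 triangles with consistent outward winding.
verts = [(0,0,0),(1,0,0),(1,1,0),(0,1,0),(0,0,1),(1,0,1),(1,1,1),(0,1,1)]
tris = [(0,2,1),(0,3,2),(4,5,6),(4,6,7),(0,1,5),(0,5,4),
        (1,2,6),(1,6,5),(2,3,7),(2,7,6),(3,0,4),(3,4,7)]
print("volume:", mesh_volume(verts, tris))   # expected 1.0 for the unit cube
```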

Relevance:

10.00%

Publisher:

Abstract:

Cities have long held a fascination for people – as they grow and develop, there is a desire to know and understand the intricate interplay of elements that makes cities ‘live’. In part, this is a need for even greater efficiency in urban centres, yet the underlying quest is for a sustainable urban form. In order to make sense of the complex entities that we recognise cities to be, they have been compared to buildings, organisms and, more recently, machines. However, the search for better and more elegant urban centres is hardly new: healthier and more efficient settlements were the aim of Modernism’s rational sub-division of functions, which has been translated into horizontal distribution through zoning, or vertical organisation through high-rise developments. Both of these approaches, however, have been found to be unsustainable, as too many resources are required to maintain this kind of urbanisation, and the social consequences of either horizontal or vertical isolation must also be considered. From being absolute consumers of resources, of energy and of technology, cities need to change, to become sustainable in order to be more resilient and more efficient in supporting culture and society as well as the economy. Our urban centres need to be re-imagined, re-conceptualised and re-defined, to match our changing society. One approach is to re-examine the compartmentalised, mono-functional approach of urban Modernism and to begin to investigate cities as ecologies, where every element supports and incorporates another, fulfilling more than just one function. This manner of seeing the city suggests a framework to guide the re-mixing of urban settlements. Beginning to understand the relationships between supporting elements and the nature of the connecting ‘web’ offers an invitation to investigate the often ignored, remnant spaces of cities. This ‘negative space’ is the residual from which space and place are carved out in the contemporary city, providing the link between elements of urban settlement. Like all successful ecosystems, cities need to evolve and change over time in order to respond effectively to different lifestyles and developments in culture and society, as well as to meet environmental challenges. This paper seeks to investigate the role that negative space could have in the reorganisation of the re-mixed city. The space ‘in-between’ is analysed as an opportunity for infill development or re-development which provides the urban settlement with the variety that is a pre-requisite for ecosystem resilience. An analysis of the urban form is suggested as an empirical tool to map the opportunities already present in the urban environment, and negative space is evaluated as a key element in achieving a positive development able to distribute diverse environmental and social facilities in the city.