325 results for Borel Sets


Relevance: 10.00%

Abstract:

Jacques Rancière's work on aesthetics has received a great deal of attention in recent years. Given that his work has enormous range – covering art and literature, political theory, historiography, pedagogy and workers' history – Andrew McNamara and Toni Ross (UNSW) explore his wider critical ambitions in this interview, while showing how they lead to alternative insights into aesthetics. Rancière sets aside the core suppositions linking the medium to aesthetic judgment, which have informed many definitions of modernism. He is emphatic in freeing aesthetic judgment from issues of medium-specificity, arguing that the idea of autonomy associated with medium-specificity – or 'truth to the medium' – was 'a very late one' in modernism, and that post-medium trends were already evident in early modernism. While not stressing a simple continuity between early modernism and contemporary art, Rancière nonetheless emphasizes the ongoing ethical and political ramifications of maintaining an a-disciplinary stance.

Relevance: 10.00%

Abstract:

A simple and effective down-sampling algorithm, the Peak-Hold-Down-Sample (PHDS) algorithm, is developed in this paper to enable rapid and efficient data transfer in remote condition monitoring applications. The algorithm is particularly useful for high-frequency Condition Monitoring (CM) techniques and for low-speed machine applications, since the combination of a high sampling frequency and a low rotating speed generally leads to unwieldy data sizes. The effectiveness of the algorithm was evaluated and tested on four data sets in the study. One set was extracted from the condition monitoring signal of a practical industrial application. Another was acquired from a low-speed machine test rig in the laboratory. The remaining two sets were computer-simulated bearing defect signals containing either a single defect or multiple defects. The results show that the PHDS algorithm can substantially reduce the size of the data while preserving the critical bearing defect information for all data sets used in this work, even when a large down-sample ratio is used (e.g., 500:1). In contrast, a conventional down-sampling technique eliminates useful and critical information, such as bearing defect frequencies, when the same ratio is employed, and also introduces noise and artificial frequency components, thus limiting its usefulness for machine condition monitoring applications.
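The abstract does not give the PHDS implementation details, but the peak-hold idea can be sketched as follows: split the signal into blocks of `ratio` samples and keep the largest-magnitude (signed) sample of each block, so that short impulsive bearing-defect events survive the reduction. Everything below (the function name, signal parameters) is illustrative, not taken from the paper:

```python
import numpy as np

def peak_hold_downsample(signal, ratio):
    """Down-sample by keeping the peak (largest-magnitude) sample of each block."""
    n_blocks = len(signal) // ratio
    blocks = np.asarray(signal[: n_blocks * ratio]).reshape(n_blocks, ratio)
    idx = np.argmax(np.abs(blocks), axis=1)      # position of the peak in each block
    return blocks[np.arange(n_blocks), idx]      # signed peak values

# Impulses every 1 ms in a 1 MHz record survive a 500:1 reduction:
fs = 1_000_000
x = 0.01 * np.random.randn(fs)
x[::1000] += 1.0                                 # simulated defect impulses
y = peak_hold_downsample(x, 500)
assert len(y) == 2000 and y.max() > 0.5
```

A plain decimation (`x[::500]`) would miss most of these impulses entirely, which mirrors the comparison the abstract draws against conventional down-sampling.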

Relevance: 10.00%

Abstract:

A significant issue encountered when fusing data received from multiple sensors is the accuracy of the timestamp associated with each piece of data. This is particularly important in applications such as Simultaneous Localisation and Mapping (SLAM), where vehicle velocity forms an important part of the mapping algorithms; on fast-moving vehicles, even millisecond inconsistencies in data timestamping can produce errors which need to be compensated for. The timestamping problem is compounded in a robot swarm environment due to the use of non-deterministic readily-available hardware (such as 802.11-based wireless) and inaccurate clock synchronisation protocols (such as the Network Time Protocol (NTP)). As a result, the synchronisation of the clocks between robots can be out by tens to hundreds of milliseconds, making correlation of data difficult and preventing the units from performing synchronised actions such as triggering cameras or intricate swarm manoeuvres. In this thesis, a complete data fusion unit is designed, implemented and tested. The unit, named BabelFuse, is able to accept sensor data from a number of low-speed communication buses (such as RS232, RS485 and CAN Bus) and also timestamp events that occur on General Purpose Input/Output (GPIO) pins, referencing a submillisecond-accurate wirelessly-distributed "global" clock signal. In addition to its timestamping capabilities, it can also be used to trigger an attached camera at a predefined start time and frame rate. This functionality enables the creation of a wirelessly-synchronised distributed image acquisition system over a large geographic area; a real-world application of this functionality is a platform to facilitate wirelessly-distributed 3D stereoscopic vision. A 'best-practice' design methodology is adopted within the project to ensure the final system operates according to its requirements.
Initially, requirements are generated, from which a high-level architecture is distilled. This architecture is then converted into a hardware specification and low-level design, which is then manufactured. The manufactured hardware is then verified to ensure it operates as designed, and firmware and Linux Operating System (OS) drivers are written to provide the features and connectivity required of the system. Finally, integration testing is performed to ensure the unit functions as per its requirements. The BabelFuse system comprises a single Grand Master unit, which is responsible for maintaining the absolute value of the "global" clock. Slave nodes then determine their local clock offset from that of the Grand Master via synchronisation events which occur multiple times per second. The mechanism used for wirelessly synchronising the clocks between the boards makes use of specific hardware and a firmware protocol based on elements of the IEEE-1588 Precision Time Protocol (PTP). With the key requirement of the system being submillisecond-accurate clock synchronisation (as a basis for timestamping and camera triggering), automated testing is carried out to monitor the offsets between each Slave and the Grand Master over time. A common strobe pulse is also sent to each unit for timestamping; the correlation between the timestamps of the different units is used to validate the clock offset results. Analysis of the automated test results shows that the BabelFuse units are almost three orders of magnitude more accurate than their requirement; the clocks of the Slave and Grand Master units do not differ by more than three microseconds over a running time of six hours, and the mean clock offset of Slaves to the Grand Master is less than one microsecond. The common strobe pulse used to verify the clock offset data yields a positive result, with a maximum variation between units of less than two microseconds and a mean value of less than one microsecond.
The camera triggering functionality is verified by connecting the trigger pulse output of each board to a four-channel digital oscilloscope and setting each unit to output a 100 Hz periodic pulse with a common start time. The resulting waveform shows a maximum variation between the rising edges of the pulses of approximately 39 µs, well below the target of 1 ms.
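The thesis describes a firmware protocol "based on elements of" IEEE-1588 PTP. The BabelFuse firmware details are not given in the abstract, but the classic two-way time-transfer arithmetic underlying PTP, sketched below, recovers the slave-clock offset and path delay from four timestamps, assuming a symmetric path delay:

```python
def ptp_offset_delay(t1, t2, t3, t4):
    """Two-way time-transfer arithmetic used by IEEE-1588 PTP.
    t1: master sends Sync; t2: slave receives it (slave clock);
    t3: slave sends Delay_Req (slave clock); t4: master receives it.
    Assumes the wireless path delay is the same in both directions."""
    offset = ((t2 - t1) - (t4 - t3)) / 2.0   # slave clock minus master clock
    delay = ((t2 - t1) + (t4 - t3)) / 2.0    # one-way path delay
    return offset, delay

# Illustrative numbers: slave clock 250 µs ahead, 40 µs symmetric path delay.
offset, delay = ptp_offset_delay(t1=0.0, t2=0.000290, t3=0.001000, t4=0.000790)
```

Repeating this exchange "multiple times per second" and filtering the resulting offsets, as the thesis describes, is what drives each Slave's correction toward the Grand Master's clock.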

Relevance: 10.00%

Abstract:

The CCI-Creative City Index was commissioned in 2010 by the Beijing Academy of Science & Technology's Beijing Research Center for the Science of Science. John Hartley was asked to develop a new creative global city index. The brief was to improve on the existing indexes with a specific focus on creative industries and the sources of creative development. This report, by John Hartley, Jason Potts, Trent MacDonald, with Chris Erkunt and Carl Kufleitner, sets out the new model we have developed, which we call the CCI Creative City Index (CCI-CCI). It presents the results of a pilot application of the index to six cities: London, Cardiff, Berlin, Bremen, Melbourne and Brisbane. The index incorporates many elements from other global and creative city indexes, but also adds several new dimensions relating to creative industries scope, micro-productivity, and the economy of attention. The report and Excel spreadsheets of index calculations can be found on this site.

Relevance: 10.00%

Abstract:

Cartilage defects heal imperfectly and osteoarthritic changes develop frequently as a result. Although the existence of specific behaviours of chondrocytes derived from various depth-related zones in vitro has been known for over 20 years, only a relatively small body of in vitro studies has been performed with zonal chondrocytes and current clinical treatment strategies do not reflect these native depth-dependent (zonal) differences. This is surprising since mimicking the zonal organization of articular cartilage in neo-tissue by the use of zonal chondrocyte subpopulations could enhance the functionality of the graft. Although some research groups including our own have made considerable progress in tailoring culture conditions using specific growth factors and biomechanical loading protocols, we conclude that an optimal regime has not yet been determined. Other unmet challenges include the lack of specific zonal cell sorting protocols and limited amounts of cells harvested per zone. As a result, the engineering of functional tissue has not yet been realized and no long-term in vivo studies using zonal chondrocytes have been described. This paper critically reviews the research performed to date and outlines our view of the potential future significance of zonal chondrocyte populations in regenerative approaches for the treatment of cartilage defects. Secondly, we briefly discuss the capabilities of additive manufacturing technologies that can not only create patient-specific grafts directly from medical imaging data sets but could also more accurately reproduce the complex 3D zonal extracellular matrix architecture using techniques such as hydrogel-based cell printing.

Relevance: 10.00%

Abstract:

The average structure (C1̄) of a volcanic plagioclase megacryst with composition Ano, from the Hogarth Ranges, Australia, has been determined using three-dimensional, single-crystal neutron and X-ray diffraction data. Least-squares refinements, incorporating anisotropic thermal motion of all atoms and an extinction correction, resulted in weighted R factors (based on intensities) of 0.076 and 0.056 for the neutron and X-ray data, respectively. Very weak e reflections could be detected in long-exposure X-ray and electron diffraction photographs of this crystal, but the refined average structure is believed to be unaffected by the presence of such a weak superstructure. The ratio of the scattering power of Na to that of Ca differs between X-ray and neutron radiation, and this radiation dependence of scattering power has been used to determine the distribution of Na and Ca over a split-atom M site (two sites, designated M' and M") in this plagioclase. Relative peak-height ratios M'/M", revealed in difference Fourier sections calculated from the neutron and X-ray data, formed the basis for the cation-distribution analysis. As the neutron and X-ray data sets were directly compared in this analysis, it was important to demonstrate the absence of systematic bias between refined neutron and X-ray positional parameters. In summary, with an M-site model constrained only by the electron-microprobe-determined bulk composition of the crystal, the following M-site occupancies were obtained: Na(M') = 0.29(7), Na(M") = 0.23(7), Ca(M') = 0.15(4), and Ca(M") = 0.33(4). These results indicate that the restrictive assumptions about M sites on which previous plagioclase refinements have been based are not applicable to this composition, and possibly not to the entire compositional range. T-site ordering determined from (T-O) bond-length variation (t1o = 0.51(1), t1m = t2o = t2m = 0.32(1)) is weak, as might be expected from the volcanic origin of this megacryst.

Relevance: 10.00%

Abstract:

Deciding the appropriate population size and number of islands for distributed island-model genetic algorithms is often critical to the algorithm's success. This paper outlines a method that automatically searches for good combinations of island population sizes and the number of islands. The method is based on a race between competing parameter sets and collaborative seeding of new parameter sets. It is applicable to any problem and makes distributed genetic algorithms easier to use by reducing the number of user-set parameters. The experimental results show that the proposed method robustly and reliably finds population and island settings comparable to those found with traditional trial-and-error approaches.
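The racing-and-seeding mechanism is not specified in detail in the abstract; one generic sketch of the idea is below, with an illustrative surrogate scoring function standing in for short GA runs (in the real method, each candidate's score would come from actually running the island-model GA for a limited budget):

```python
import random

def race(candidates, evaluate, rounds=5):
    """Race competing (pop_size, n_islands) settings: each round, every
    candidate is scored by a short run; the worst half is replaced by
    mutated copies of the winners ("collaborative seeding")."""
    for _ in range(rounds):
        scored = sorted(candidates, key=evaluate, reverse=True)
        winners = scored[: len(scored) // 2]
        candidates = winners + [
            (max(2, p + random.choice((-10, 10))),   # perturb population size
             max(1, n + random.choice((-1, 1))))     # perturb island count
            for p, n in winners]
    return max(candidates, key=evaluate)

# Illustrative surrogate: pretend ~100 individuals on 4 islands score best.
def evaluate(cfg):
    p, n = cfg
    return -abs(p - 100) - 10 * abs(n - 4)

best = race([(random.randint(10, 200), random.randint(1, 8)) for _ in range(8)], evaluate)
```

The appeal over trial-and-error tuning is that losing configurations are discarded early, so the evaluation budget concentrates on promising region of the parameter space.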

Relevance: 10.00%

Abstract:

The behaviour of single installations of solar energy systems is well understood; however, what happens at an aggregated location, such as a distribution substation, when the outputs of groups of installations accumulate is not so well understood. This paper considers groups of installations attached to distribution substations on which the load is primarily commercial and industrial. Agent-based modelling has been used to model the physical electrical distribution system and the behaviour of equipment outputs towards the consumer end of the network. The paper reports the approach used to simulate both the electricity consumption of groups of consumers and the output of solar systems subject to weather variability, with the inclusion of cloud data from the Bureau of Meteorology (BOM). The data sets currently used are for Townsville, North Queensland. The initial characteristics that indicate whether solar installations are cost-effective from an electricity distribution perspective are discussed.

Relevance: 10.00%

Abstract:

While academic interest in destination branding has been gathering momentum since the field commenced in the late 1990s, one important gap in this literature that has received relatively little attention to date is the measurement of destination brand performance. This paper sets out one method for assessing the performance of a destination brand over time. The intent is to present an approach that will appeal to marketing practitioners and which is also conceptually sound. The method is underpinned by Decision Set Theory and the concept of Consumer-Based Brand Equity (CBBE), while the key variables mirror the branding objectives used by many destination marketing organisations (DMOs). The approach is demonstrated in this paper by measuring brand performance for Australia in the New Zealand market. It is suggested that the findings provide indicators of both i) the success of previous marketing communications and ii) future performance, which can be easily communicated to a DMO's stakeholders.

Relevance: 10.00%

Abstract:

Despite the compelling case for moving towards cloud computing, the upstream oil & gas industry faces several technical challenges—most notably, a pronounced emphasis on data security, a reliance on extremely large data sets, and significant legacy investments in information technology (IT) infrastructure—that make a full migration to the public cloud difficult at present. Private and hybrid cloud solutions have consequently emerged within the industry to yield as much benefit from cloud-based technologies as possible while working within these constraints. This paper argues, however, that the move to private and hybrid clouds will very likely prove only to be a temporary stepping stone in the industry’s technological evolution. By presenting evidence from other market sectors that have faced similar challenges in their journey to the cloud, we propose that enabling technologies and conditions will probably fall into place in a way that makes the public cloud a far more attractive option for the upstream oil & gas industry in the years ahead. The paper concludes with a discussion about the implications of this projected shift towards the public cloud, and calls for more of the industry’s services to be offered through cloud-based “apps.”