13 results for Commercial Applications

in CentAUR: Central Archive University of Reading - UK


Relevance:

60.00%

Publisher:

Abstract:

Light Detection And Ranging (LIDAR) data for terrain and land surveying has contributed to many environmental, engineering and civil applications. However, the analysis of Digital Surface Models (DSMs) from complex LIDAR data is still challenging. Commonly, the first task in investigating LIDAR point clouds is to separate ground and object points as a preparatory step for further object classification. In this paper, the authors present a novel unsupervised segmentation algorithm, skewness balancing, to separate object and ground points efficiently from high-resolution LIDAR point clouds by exploiting statistical moments. The results presented in this paper demonstrate its robustness and its potential for commercial applications.
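
The idea behind skewness balancing can be sketched briefly: the elevation distribution of pure ground returns tends toward symmetry, so points are peeled from the top of the sorted elevations until the skewness of the remainder drops to zero, and the removed points are treated as object points. The following is a minimal illustrative sketch in Python, assuming a one-dimensional list of elevations and a zero-skew stopping rule (the paper's algorithm operates on full point clouds; function names here are hypothetical):

```python
import statistics

def skewness(values):
    """Sample skewness: the third standardized moment of the values."""
    n = len(values)
    mean = sum(values) / n
    sd = statistics.pstdev(values)
    if sd == 0:
        return 0.0
    return sum((v - mean) ** 3 for v in values) / (n * sd ** 3)

def skewness_balancing(elevations):
    """Peel off the highest points until the remaining elevation
    distribution is no longer right-skewed; removed points are taken
    as object points, the rest as ground points."""
    ground = sorted(elevations)
    objects = []
    # small tolerance guards against floating-point residue near zero skew
    while len(ground) > 2 and skewness(ground) > 1e-9:
        objects.append(ground.pop())  # highest remaining elevation
    return ground, objects

# Mostly flat terrain plus two tall object returns (illustrative values)
pts = [1.0, 1.1, 0.9, 1.2, 1.0, 1.1, 12.0, 15.0]
ground, objects = skewness_balancing(pts)
print(sorted(objects))  # the two object returns: [12.0, 15.0]
```

The attraction of the approach, as the abstract notes, is that it is unsupervised: no training data or scene-specific threshold is needed beyond the statistical stopping condition.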

Relevance:

60.00%

Publisher:

Abstract:

One of the important goals of intelligent buildings, especially in commercial applications, is not only to minimise energy consumption but also to enhance occupant comfort. However, most current development in intelligent buildings focuses on implementing automatic building control systems that support an energy-efficiency approach; consideration of occupants' preferences is not adequate. To improve occupant wellbeing and energy efficiency in intelligent environments, we develop four types of agent, combined to form a multi-agent system that controls the intelligent building. Users' preferential conflicts are discussed. Furthermore, a negotiation mechanism for conflict resolution has been proposed in order to reach an agreement, and has been represented in syntax directed translation schemes for future implementation and testing. Keywords: conflict resolution, intelligent buildings, multi-agent systems (MAS), negotiation strategy, syntax directed translation schemes (SDTS).
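
The trade-off the agents must negotiate, between occupant preferences and the energy-efficient operating point, can be illustrated with a deliberately simple toy sketch. This is not the paper's SDTS-based negotiation mechanism; the function, its weighting scheme and the values are all illustrative assumptions:

```python
def negotiate_setpoint(preferences, energy_optimum, weight=0.5):
    """Toy conflict-resolution sketch: agree on a temperature setpoint
    between the occupants' mean preference and the energy-efficient
    optimum. `weight` trades occupant comfort (1.0) against energy
    efficiency (0.0); an equal split is assumed here."""
    occupant_mean = sum(preferences) / len(preferences)
    return weight * occupant_mean + (1 - weight) * energy_optimum

# Two occupants prefer 21 and 23 degrees C; the efficient setpoint is 26
agreed = negotiate_setpoint([21.0, 23.0], energy_optimum=26.0)
print(agreed)  # 24.0
```

A real multi-agent negotiation would iterate offers and counter-offers between agents rather than compute a one-shot weighted mean, but the sketch shows the shape of the compromise being sought.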

Relevance:

60.00%

Publisher:

Abstract:

Recent activity in the development of future weather data for building performance simulation follows recognition of the limitations of traditional methods, which have been based on a stationary (observed) climate. In the UK, such developments have followed on from the availability of regional climate models as delivered in UKCIP02 and, recently, the probabilistic projections released under UKCP09. One major area of concern is the future performance and adaptability of buildings which employ exclusively passive or low-energy cooling systems. One such method, which can be employed in an integral or retrofit situation, is direct or indirect evaporative cooling. The effectiveness of evaporative cooling is most strongly influenced by the wet-bulb depression of the ambient air, and hence it is generally regarded as most suited to hot, dry climates. However, this technology has been shown to be effective in the UK, primarily in mixed-mode buildings or as a retrofit to industrial/commercial applications. Climate projections for the UK generally indicate an increase in the summer wet-bulb depression, suggesting an enhanced potential for the application of evaporative cooling. The paper illustrates this potential through an analysis of the probabilistic scenarios released under UKCP09, together with a detailed building/plant simulation of a case study building located in the South-East of England. The results indicate a high probability that evaporative cooling will still be a viable low-energy technique in the 2050s.
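
The central role of wet-bulb depression can be made concrete with the standard relation for a direct evaporative cooler: the supply air temperature is the dry-bulb temperature reduced by a fraction (the cooler's effectiveness) of the wet-bulb depression. A short sketch, where the effectiveness of 0.8 and the temperatures are illustrative assumptions rather than values from the paper:

```python
def evap_cooling_supply_temp(t_db, t_wb, effectiveness=0.8):
    """Direct evaporative cooler: supply temperature equals the dry-bulb
    temperature minus `effectiveness` times the wet-bulb depression
    (t_db - t_wb). A larger depression means more cooling potential."""
    depression = t_db - t_wb
    return t_db - effectiveness * depression

# A warm, fairly dry summer afternoon (illustrative values, deg C)
print(evap_cooling_supply_temp(30.0, 19.0))  # 30 - 0.8 * 11 = 21.2
```

This is why a projected increase in summer wet-bulb depression translates directly into an enhanced potential for the technique: the same effectiveness delivers a lower supply temperature.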

Relevance:

60.00%

Publisher:

Abstract:

Automatic generation of classification rules has become an increasingly popular technique in commercial applications such as Big Data analytics, rule-based expert systems and decision-making systems. However, a principal problem that arises with most methods for the generation of classification rules is the overfitting of training data. When dealing with Big Data, this may result in the generation of a large number of complex rules, which may not only increase computational cost but also lower the accuracy in predicting further unseen instances. This has led to the necessity of developing pruning methods for the simplification of rules. In addition, once generated, classification rules are used to make predictions. Where efficiency is concerned, it is desirable to find the first rule that fires as quickly as possible when searching through a rule set; thus a suitable structure is required to represent the rule set effectively. In this chapter, the authors introduce a unified framework for the construction of rule-based classification systems consisting of three operations on Big Data: rule generation, rule simplification and rule representation. The authors also review some existing methods and techniques used for each of the three operations and highlight their limitations. They introduce some novel methods and techniques they have developed recently. These methods and techniques are also discussed in comparison to existing ones with respect to efficient processing of Big Data.
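
The "first rule that fires" idea can be shown with a minimal sketch of prediction over an ordered rule list: each rule is a set of attribute-value conditions plus a class label, and the first rule whose conditions all hold determines the prediction. The rule encoding and attribute names below are hypothetical, not the chapter's representation:

```python
def predict(rule_list, instance, default="unknown"):
    """Ordered rule list: return the class of the first rule whose
    conditions all hold for the instance (the first rule that fires).
    With a long rule set, the cost of this linear scan is what makes
    an efficient rule-set representation matter."""
    for conditions, label in rule_list:
        if all(instance.get(attr) == value for attr, value in conditions):
            return label
    return default

rules = [
    ([("outlook", "sunny"), ("humidity", "high")], "no"),
    ([("outlook", "overcast")], "yes"),
    ([], "yes"),  # empty condition list: default rule, fires for anything left
]
print(predict(rules, {"outlook": "sunny", "humidity": "high"}))  # no
```

Pruning shortens the rules (fewer conditions per rule, fewer rules), which both reduces this search cost and, as the abstract argues, can improve accuracy on unseen instances.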

Relevance:

60.00%

Publisher:

Abstract:

The induction of classification rules from previously unseen examples is one of the most important data mining tasks in science as well as commercial applications. In order to reduce the influence of noise in the data, ensemble learners are often applied. However, most ensemble learners are based on decision tree classifiers, which are affected by noise. The Random Prism classifier has recently been proposed as an alternative to the popular Random Forests classifier, which is based on decision trees. Random Prism is based on the Prism family of algorithms, which is more robust to noise. However, like most ensemble classification approaches, Random Prism also does not scale well on large training data. This paper presents a thorough discussion of Random Prism and a recently proposed parallel version of it called Parallel Random Prism. Parallel Random Prism is based on the MapReduce programming paradigm. The paper provides, for the first time, a novel theoretical analysis of the proposed technique and an in-depth experimental study showing that Parallel Random Prism scales well on a large number of training examples, a large number of data features and a large number of processors. The expressiveness of the decision rules that our technique produces makes it a natural choice for Big Data applications where informed decision making increases the user's trust in the system.
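
Why an ensemble of independently trained base learners maps naturally onto MapReduce can be sketched in a few lines: the "map" step trains each base learner on its own bootstrap sample (embarrassingly parallel), and the "reduce" step combines their votes. The base learner below is a trivial majority-class stump standing in for a Prism rule learner; everything about this sketch is an illustrative assumption, not the Parallel Random Prism implementation:

```python
import random
from collections import Counter

def train_stump(sample):
    """Toy base learner standing in for a Prism rule learner:
    always predicts the majority class of its bootstrap sample."""
    majority = Counter(label for _, label in sample).most_common(1)[0][0]
    return lambda x: majority

def map_phase(data, n_learners, seed=0):
    """'Map' step: each learner trains independently on its own
    bootstrap sample, so this loop parallelises across workers."""
    rng = random.Random(seed)
    return [train_stump([rng.choice(data) for _ in data])
            for _ in range(n_learners)]

def reduce_phase(learners, x):
    """'Reduce' step: combine the base learners' predictions by
    majority vote."""
    votes = Counter(clf(x) for clf in learners)
    return votes.most_common(1)[0][0]

data = [([0], "a"), ([1], "a"), ([2], "b")]
learners = map_phase(data, n_learners=5)
prediction = reduce_phase(learners, [0])
```

The structure is the point: because no base learner depends on another, adding processors in the map phase scales training almost linearly, which is what the paper's experimental study examines.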

Relevance:

60.00%

Publisher:

Abstract:

The utility of the decimal growth stage (DGS) scoring system for cereals is reviewed. The DGS is the most widely used scale in academic and commercial applications because of its comprehensive coverage of cereal developmental stages, the ease of use and definition provided and adoption by official agencies. The DGS has demonstrable and established value in helping to optimise the timing of agronomic inputs, particularly with regard to plant growth regulators, herbicides, fungicides and soluble nitrogen fertilisers. In addition, the DGS is used to help parameterise crop models, and also in understanding the response and adaptation of crops to the environment. The value of the DGS for increasing precision relies on it indicating, to some degree, the various stages in the development of the stem apex and spike. Coincidence of specific growth stage scores with the transition of the apical meristem from a vegetative to a reproductive state, and also with the period of meiosis, is unreliable. Nonetheless, in pot experiments it is shown that the broad period of booting (DGS 41–49) appears adequate for covering the duration when the vulnerability of meiosis to drought and heat stress is exposed. Similarly, the duration of anthesis (61–69) is particularly susceptible to abiotic stresses: initially from a fertility perspective, but increasingly from a mean grain weight perspective as flowering progresses to DGS 69 and then milk development. These associations with DGS can have value at the crop level of organisation: for interpreting environmental effects, and in crop modelling. However, genetic, biochemical and physiological analysis to develop greater understanding of stress acclimation during the vegetative state, and tolerance at meiosis, does require more precision than DGS can provide. Similarly, individual floret analysis is needed to further understand the genetic basis of stress tolerance during anthesis.
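
The two stress-sensitivity windows identified above, booting (DGS 41-49) covering the period when meiosis is vulnerable, and anthesis (DGS 61-69) with its fertility and grain-weight effects, can be encoded as a simple lookup. This is an illustrative sketch of the windows as stated in the abstract, not an official scoring tool:

```python
def stress_sensitive_window(dgs):
    """Map a decimal growth stage (DGS) score onto the stress-sensitivity
    windows discussed above: booting (DGS 41-49) broadly covers the
    vulnerability of meiosis to drought and heat; anthesis (DGS 61-69)
    is sensitive first for fertility, then for mean grain weight."""
    if 41 <= dgs <= 49:
        return "booting"
    if 61 <= dgs <= 69:
        return "anthesis"
    return "other"

print(stress_sensitive_window(45))  # booting
print(stress_sensitive_window(65))  # anthesis
```

As the abstract cautions, this crop-level precision is adequate for timing agronomic inputs and interpreting environmental effects, but not for work that needs to pinpoint meiosis or individual floret development.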

Relevance:

40.00%

Publisher:

Abstract:

Hydrogels have become very popular due to their unique properties such as high water content, softness, flexibility and biocompatibility. Natural and synthetic hydrophilic polymers can be physically or chemically cross-linked in order to produce hydrogels. Their resemblance to living tissue opens up many opportunities for applications in biomedical areas. Currently, hydrogels are used for manufacturing contact lenses, hygiene products, tissue engineering scaffolds, drug delivery systems and wound dressings. This review provides an analysis of their main characteristics and biomedical applications. From Wichterle’s pioneering work to the most recent hydrogel-based inventions and products on the market, it provides the reader with a detailed introduction to the topic and perspective on further potential developments.

Relevance:

30.00%

Publisher:

Abstract:

Thin slices of soft flexible solids have negligible bending resistance and hence store negligible elastic strain energy; furthermore such offcuts are rarely permanently deformed after slicing. Cutting forces thus depend only on work of separation (toughness work) and friction. These simplifying assumptions are not as restrictive as it might seem, and the mechanics are found to apply to a wide variety of foodstuffs and biological materials. The fracture toughness of such materials may be determined from cutting experiments: the use of scissors instrumented for load and displacement is a popular method where toughness is obtained from the work areas beneath load–displacement plots. Surprisingly, there is no analysis for the variation of forces with scissor blade opening and this paper provides the theory. Comparison is made with experimental results in cutting with scissors. The analysis is generalised to cutting with blades of variable curvature and applied to a commercial food cutting device having a rotating spiral plan form blade. The strong influence of the ‘slice/push ratio’ (blade tangential speed to blade edge normal speed) on the cutting forces is revealed. Small cutting forces are important in food cutting machinery as damage to slices is minimised. How high slice/push ratios may be achieved by choice of blade profile is discussed.
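
The slice/push ratio itself is simply the blade's tangential (edge-wise) speed divided by its normal speed. The force law below is an illustrative assumption, the 1/sqrt(1 + xi**2) scaling is not the paper's derivation, but it captures the qualitative trend the abstract describes: higher slice/push ratio, lower cutting force:

```python
import math

def slice_push_ratio(tangential_speed, normal_speed):
    """Slice/push ratio xi: blade speed along its own edge divided by
    the speed normal to the edge."""
    return tangential_speed / normal_speed

def cutting_force(toughness, thickness, xi):
    """Illustrative only: an assumed scaling in which the normal cutting
    force falls as 1/sqrt(1 + xi**2) with slice/push ratio xi. The paper
    derives the actual dependence on blade geometry; this sketch shows
    only the qualitative trend (higher xi, smaller force on the slice)."""
    return toughness * thickness / math.sqrt(1.0 + xi ** 2)

f_push = cutting_force(toughness=1000.0, thickness=0.02, xi=0.0)   # pure push cut
f_slice = cutting_force(toughness=1000.0, thickness=0.02, xi=5.0)  # strong slicing
```

A rotating spiral blade of the kind described achieves a high, roughly constant xi along its edge, which is why such profiles minimise damage to the slices.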

Relevance:

30.00%

Publisher:

Abstract:

The paper describes the implementation of an offline, low-cost Brain Computer Interface (BCI) alternative to more expensive commercial models. Using inexpensive general purpose clinical EEG acquisition hardware (Truscan32, Deymed Diagnostic) as the base unit, a synchronisation module was constructed to allow the EEG hardware to be operated precisely in time, permitting the recording of automatically time-stamped EEG signals. The synchronising module allows the EEG recordings to be aligned in a stimulus time-locked fashion for further processing by the classifier to establish the class of the stimulus, sample by sample. This allows the acquisition of signals from the subject's brain for a goal-oriented BCI application based on the oddball paradigm. An appropriate graphical user interface (GUI) was constructed and implemented as the method to elicit the required responses (in this case Event Related Potentials, or ERPs) from the subject.
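
The stimulus time-locked alignment that the synchronisation module enables is the basis of ERP analysis: windows are cut around each stimulus onset and averaged sample by sample, so stimulus-locked activity survives while uncorrelated background EEG averages out. A minimal sketch, assuming a single-channel signal as a list of samples and known onset indices (names and values are hypothetical):

```python
def epoch_average(signal, stimulus_onsets, pre, post):
    """Time-locked (ERP) averaging: extract a window of `pre` samples
    before and `post` samples after each stimulus onset, then average
    the windows sample by sample. Windows falling off either end of
    the recording are skipped."""
    epochs = []
    for onset in stimulus_onsets:
        if onset - pre >= 0 and onset + post <= len(signal):
            epochs.append(signal[onset - pre:onset + post])
    n = len(epochs)
    return [sum(samples) / n for samples in zip(*epochs)]

# Two stimuli with responses of amplitude 5 and 3 one sample after onset
sig = [0, 0, 5, 0, 0, 0, 3, 0, 0]
avg = epoch_average(sig, stimulus_onsets=[2, 6], pre=1, post=2)
print(avg)  # [0.0, 4.0, 0.0]
```

Accurate time stamps are what make this work: any jitter between the recorded onsets and the true stimulus times smears the averaged waveform, which is why the hardware synchronisation module matters.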

Relevance:

30.00%

Publisher:

Abstract:

In this paper we report the degree of reliability of image sequences taken by off-the-shelf TV cameras for modeling camera rotation and reconstructing 3D structure using computer vision techniques. This is done in spite of the fact that computer vision systems usually use imaging devices that are specifically designed for human vision. Our scenario consists of a static scene and a mobile camera moving through the scene. The scene is any long axial building dominated by features along the three principal orientations and with at least one wall containing prominent repetitive planar features such as doors, windows, bricks, etc. The camera is an ordinary commercial camcorder moving along the axial axis of the scene and is allowed to rotate freely within the range of +/- 10 degrees in all directions. This makes it possible for the camera to be held by a walking, non-professional cameraman with normal gait, or to be mounted on a mobile robot. The system has been tested successfully on sequences of images of a variety of structured, but fairly cluttered, scenes taken by different walking cameramen. The potential application areas of the system include medicine, robotics and photogrammetry.

Relevance:

30.00%

Publisher:

Abstract:

The nature of private commercial real estate markets presents difficulties for monitoring market performance. Assets are heterogeneous and spatially dispersed, trading is infrequent and there is no central marketplace in which prices and cash flows of properties can be easily observed. Appraisal-based indices represent one response to these issues. However, these have been criticised on a number of grounds: that they may understate volatility, lag turning points and be affected by client influence issues. Thus, this paper reports econometrically derived transaction-based indices of the UK commercial real estate market using Investment Property Databank (IPD) data, comparing them with published appraisal-based indices. The method is similar to that presented by Fisher, Geltner, and Pollakowski (2007) and used by the Massachusetts Institute of Technology (MIT) on National Council of Real Estate Investment Fiduciaries (NCREIF) data, although it employs value rather than equal weighting. The results show stronger growth from the transaction-based indices in the run-up to the peak in the UK market in 2007. They also show that returns from these series are more volatile and less autocorrelated than their appraisal-based counterparts, but, surprisingly, differences in turning points were not found. The conclusion then debates the applications and limitations these series have as measures of market performance.
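
The value-versus-equal weighting distinction mentioned above is easy to illustrate: a value-weighted index return divides aggregate end-of-period value by aggregate start value, so large assets dominate, while an equal-weighted return averages each asset's own return. A minimal sketch with hypothetical asset values (not the paper's econometric index construction):

```python
def value_weighted_return(values_start, values_end):
    """Value-weighted period return: total end-of-period value over
    total start-of-period value, so larger assets move the index more."""
    return sum(values_end) / sum(values_start) - 1.0

def equal_weighted_return(values_start, values_end):
    """Equal-weighted comparison: simple mean of the individual
    asset returns, regardless of asset size."""
    rets = [e / s - 1.0 for s, e in zip(values_start, values_end)]
    return sum(rets) / len(rets)

# Two assets: a large one up 10%, a small one down 10%
vw = value_weighted_return([900.0, 100.0], [990.0, 90.0])
ew = equal_weighted_return([900.0, 100.0], [990.0, 90.0])
```

Here the value-weighted return is +8% while the equal-weighted return is zero; with heterogeneous, spatially dispersed assets of very different sizes, the choice of weighting can materially change the measured market performance.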

Relevance:

30.00%

Publisher:

Abstract:

Melt-polycondensation of succinic acid anhydride with oxazoline-based diol monomers gave hyperbranched polymers with carboxylic acid terminal groups. ¹H NMR and quantitative ¹³C NMR spectroscopy, coupled with a DEPT-135 ¹³C NMR experiment, showed high degrees of branching (over 60%). Esterification of the acid end groups by addition of citronellol at 160 °C produced novel white-spirit-soluble resins, which were characterised by Fourier transform infrared (FTIR) spectroscopy, gel permeation chromatography (GPC), differential scanning calorimetry (DSC) and thermogravimetric analysis (TGA). Blends of the new hyperbranched materials with commercial alkyd resins resulted in a dramatic, concentration-dependent drop in viscosity. Solvent-borne coatings were formulated containing the hyperbranched polymers. Dynamic mechanical analysis studies revealed that the air drying rates of the new coating systems were enhanced compared with identical formulations containing only commercial alkyd resins.