527 results for key scheduling algorithm
Abstract:
3D printing (3Dp) has long been used in the manufacturing sector as a way to automate and accelerate production and to reduce waste materials. It can build a wide variety of objects if the necessary specifications are provided to the printer and the limited range of available materials presents no problems. With 3Dp becoming cheaper, more reliable and, as a result, more prevalent in the world at large, it may soon make inroads into the construction industry. Little is known, however, of 3Dp currently in use in the construction industry and of its potential for the future, and this paper seeks to rectify this situation by providing a review of the relevant literature. In doing this, the three main 3Dp methods of contour crafting, concrete printing and D-shape 3Dp are described; as opposed to the traditional construction method of cutting materials down to size, these deliver only what is needed for completion, vastly reducing waste. Also identified is 3Dp’s potential to enable buildings to be constructed many times faster and with significantly reduced labour costs. In addition, it is clear that construction 3Dp can allow the further inclusion of Building Information Modelling into the construction process, streamlining and improving the scheduling requirements of a project. However, current 3Dp processes are known to be costly, unsuited to large-scale products and conventional design approaches, and limited in the range of materials that can be used. Moreover, the only successful examples of construction 3Dp in action to date have occurred in controlled laboratory environments and, as real-world trials have yet to be completed, it remains to be seen whether it can be equally proficient in practical situations. Key Words: 3D Printing; Contour Crafting; Concrete Printing; D-shape; Building Automation.
Abstract:
Terra Preta is a site-specific bio-energy project which aims to create a synergy between the public and the pre-existing engineered landscape of Freshkills Park on Staten Island, New York. The project challenges traditional paradigms of public space by proposing a dynamic and ever-changing landscape. The initiative allows the public to self-organise the landscape and to engage in 'algorithmic processes' of growth, harvest and space creation.
Abstract:
Objective The main aim of this study was to identify young drivers' underlying beliefs (i.e., behavioral, normative, and control) regarding initiating, monitoring/reading, and responding to social interactive technology (i.e., functions on a Smartphone that allow the user to communicate with other people). Method This qualitative study was a beliefs elicitation study in accordance with the Theory of Planned Behavior and sought to elicit young drivers' behavioral (i.e., advantages, disadvantages), normative (i.e., who approves, who disapproves), and control beliefs (i.e., barriers, facilitators) which underpin social interactive technology use while driving. Young drivers (N = 26) aged 17 to 25 years took part in an interview or focus group discussion. Results While differences emerged between the three behaviors of initiating, monitoring/reading, and responding for each of the behavioral, normative, and control belief categories, the strongest distinction was within the behavioral beliefs category (e.g., communicating with the person that they were on the way to meet was an advantage of initiating; being able to determine whether to respond was an advantage of monitoring/reading; and communicating with important people was an advantage of responding). Normative beliefs were similar for initiating and responding behaviors (e.g., friends and peers more likely to approve than other groups) and differences emerged for monitoring/reading (e.g., parents were more likely to approve of this behavior than initiating and responding). 
For control beliefs, there were differences between the beliefs regarding facilitators of these behaviors (e.g., familiar roads and conditions facilitated initiating; having audible notifications of an incoming communication facilitated monitoring/reading; and receiving a communication of immediate importance facilitated responding); however, the control beliefs that presented barriers were consistent across the three behaviors (e.g., difficult traffic/road conditions). Conclusion The current study provides an important addition to the extant literature and supports emerging research which suggests initiating, monitoring/reading, and responding may indeed be distinct behaviors with different underlying motivations.
Abstract:
In vitro studies and mathematical models are now being widely used to study the underlying mechanisms driving the expansion of cell colonies. This can improve our understanding of cancer formation and progression. Although much progress has been made in terms of developing and analysing mathematical models, far less progress has been made in terms of understanding how to estimate model parameters using experimental in vitro image-based data. To address this issue, a new approximate Bayesian computation (ABC) algorithm is proposed to estimate key parameters governing the expansion of melanoma cell (MM127) colonies, including cell diffusivity, D, cell proliferation rate, λ, and cell-to-cell adhesion, q, in two experimental scenarios, namely with and without a chemical treatment to suppress cell proliferation. Even when little prior biological knowledge about the parameters is assumed, all parameters are precisely inferred with a small posterior coefficient of variation, approximately 2–12%. The ABC analyses reveal that the posterior distributions of D and q depend on the experimental elapsed time, whereas the posterior distribution of λ does not. The posterior mean values of D and q lie in the ranges 226–268 µm²h⁻¹ and 0.23–0.39 for the experimental period of 0–24 h, and 311–351 µm²h⁻¹ and 0.32–0.61 for 24–48 h. Furthermore, we found that the posterior distribution of q also depends on the initial cell density, whereas the posterior distributions of D and λ do not. The ABC approach also enables information from the two experiments to be combined, resulting in greater precision for all estimates of D and λ.
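The abstract above names approximate Bayesian computation but gives no mechanics. A minimal rejection-ABC sketch shows the accept/reject core; the forward model, prior ranges, and tolerance below are illustrative stand-ins, not the paper's lattice-based colony simulation:

```python
import random

def simulate_colony(D, lam, t, n0=100):
    # Toy stand-in forward model: proliferation-driven growth plus a
    # diffusive spreading term (NOT the paper's actual colony model).
    return n0 * (1 + lam) ** t + D * t

def abc_rejection(observed, prior, simulate, eps, n_draws=10000, seed=0):
    """Rejection ABC: keep prior draws whose simulated summary statistic
    lies within a relative tolerance eps of the observed data."""
    rng = random.Random(seed)
    accepted = []
    for _ in range(n_draws):
        theta = prior(rng)
        if abs(simulate(theta) - observed) / observed < eps:
            accepted.append(theta)
    return accepted  # samples from the approximate posterior
```

Run against synthetic data generated with known parameters, the accepted draws concentrate around the true values; tightening eps trades acceptance rate for posterior accuracy, which is why practical ABC variants like the paper's are more sophisticated than plain rejection.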
Abstract:
In this paper, we propose a highly reliable fault diagnosis scheme for incipient low-speed rolling element bearing failures. The scheme consists of fault feature calculation, discriminative fault feature analysis, and fault classification. The proposed approach first computes wavelet-based fault features, including the respective relative wavelet packet node energy and entropy, by applying a wavelet packet transform to an incoming acoustic emission signal. The most discriminative fault features are then filtered from the originally produced feature vector by using discriminative fault feature analysis based on a binary bat algorithm (BBA). Finally, the proposed approach employs one-against-all multiclass support vector machines to identify multiple low-speed rolling element bearing defects. This study compares the proposed BBA-based dimensionality reduction scheme with four other dimensionality reduction methodologies in terms of classification performance. Experimental results show that the proposed methodology is superior to other dimensionality reduction approaches, yielding an average classification accuracy of 94.9%, 95.8%, and 98.4% under bearing rotational speeds at 20 revolutions-per-minute (RPM), 80 RPM, and 140 RPM, respectively.
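As a rough illustration of the binary bat algorithm (BBA) used above for discriminative feature selection, here is a heavily simplified pure-Python variant. The sigmoid transfer function and velocity update follow common binary-metaheuristic formulations, but loudness and pulse-rate dynamics are omitted, and the fitness function in the usage example is a toy stand-in rather than a classifier score:

```python
import math
import random

def bba_feature_select(fitness, n_features, n_bats=20, n_iter=50, seed=1):
    """Simplified binary bat algorithm sketch for feature selection.

    `fitness` maps a 0/1 feature mask (tuple) to a score to maximize.
    Velocities are nudged by each bat's disagreement with the best mask
    and squashed through a sigmoid to give per-bit flip probabilities.
    """
    rng = random.Random(seed)
    pos = [[rng.randint(0, 1) for _ in range(n_features)] for _ in range(n_bats)]
    vel = [[0.0] * n_features for _ in range(n_bats)]
    best = max(pos, key=lambda p: fitness(tuple(p)))[:]
    best_fit = fitness(tuple(best))
    for _ in range(n_iter):
        for i in range(n_bats):
            freq = rng.random()  # random pulse frequency in [0, 1]
            for j in range(n_features):
                vel[i][j] += (pos[i][j] - best[j]) * freq
                s = 1.0 / (1.0 + math.exp(-vel[i][j]))  # sigmoid transfer
                if rng.random() < s:
                    pos[i][j] = 1 - pos[i][j]  # probabilistic bit flip
            f = fitness(tuple(pos[i]))
            if f > best_fit:
                best, best_fit = pos[i][:], f
    return best, best_fit
```

In the paper the fitness would be classification performance of the feature subset; here any mask-scoring function works.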
Abstract:
The proliferation of the web presents an unsolved problem: automatically analyzing billions of pages of natural language. We introduce a scalable algorithm that clusters hundreds of millions of web pages into hundreds of thousands of clusters. It does this on a single mid-range machine using efficient algorithms and compressed document representations. It is applied to two web-scale crawls covering tens of terabytes: ClueWeb09 and ClueWeb12 contain 500 and 733 million web pages and were clustered into 500,000 to 700,000 clusters. To the best of our knowledge, such fine-grained clustering has not been previously demonstrated. Previous approaches clustered a sample, which limits the maximum number of discoverable clusters. The proposed EM-tree algorithm uses the entire collection in clustering and produces several orders of magnitude more clusters than existing algorithms. Fine-grained clustering is necessary for meaningful clustering in massive collections, where the number of distinct topics grows linearly with collection size. These fine-grained clusters show improved cluster quality when assessed with two novel evaluations using ad hoc search relevance judgments and spam classifications for external validation. These evaluations solve the problem of assessing cluster quality where categorical labeling is unavailable or infeasible.
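The abstract does not spell out the EM-tree internals, but as its name suggests it iterates expectation (assignment) and maximisation (mean-update) steps over a tree of cluster centroids. The flat two-step loop it generalises is ordinary k-means, sketched below with a deterministic initialisation for clarity; this is illustrative only, not the EM-tree implementation:

```python
def kmeans(points, k, n_iter=20):
    """Plain k-means: alternate an assignment (E-like) step and a
    mean-update (M-like) step. Tree-structured variants such as EM-tree
    repeat this node by node so the search cost per point is logarithmic
    in the number of clusters."""
    means = list(points[:k])  # deterministic initialisation for clarity
    for _ in range(n_iter):
        clusters = [[] for _ in range(k)]
        for p in points:
            # E-step: assign each point to its nearest centroid
            i = min(range(k),
                    key=lambda c: sum((a - b) ** 2 for a, b in zip(p, means[c])))
            clusters[i].append(p)
        # M-step: move each centroid to the mean of its cluster
        means = [tuple(sum(xs) / len(cl) for xs in zip(*cl)) if cl else means[i]
                 for i, cl in enumerate(clusters)]
    return means
```

At web scale the key engineering is elsewhere (compressed document representations, streaming the full collection), but this loop is the clustering core being scaled.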
Abstract:
Purpose The role of fine lactose in the dispersion of salmeterol xinafoate (SX) from lactose mixtures was studied by modifying the fine lactose concentration on the surface of the lactose carriers using wet decantation. Methods Fine lactose was removed from lactose carriers by wet decantation using ethanol saturated with lactose. Particle sizing was achieved by laser diffraction. Fine particle fractions (FPFs) were determined by Twin Stage Impinger using a 2.5% SX mixture, and SX was analyzed by a validated high-performance liquid chromatography method. Adhesion forces between probes of SX and silica and the lactose surfaces were determined by atomic force microscopy. Results FPFs of SX were related to fine lactose concentration in the mixture for inhalation grade lactose samples. Reductions in FPF (2-4-fold) of Aeroflo 95 and 65 were observed after removing fine lactose by wet decantation; FPFs reverted to original values after addition of micronized lactose to decanted mixtures. FPFs of SX of sieved and decanted fractions of Aeroflo carriers were significantly different (p < 0.001). The relationship between FPF and fine lactose concentration was linear. Decanted lactose demonstrated surface modification through increased SX-lactose adhesion forces; however, any surface modification other than removal of fine lactose only slightly influenced FPF. Conclusions Fine lactose played a key and dominating role in controlling FPF. SX to fine lactose ratios influenced dispersion of SX with maximum dispersion occurring as the ratio approached unity.
Abstract:
This paper discusses three different ways of applying a single-objective binary genetic algorithm (GA) to wind farm design. The different applications are introduced by altering the binary encoding methods in the GA code. The first encoding method is the traditional one with fixed wind turbine positions. The second varies the initial positions from the results of the first method, using binary digits to represent the coordinates of a wind turbine on the X or Y axis. The third mixes the first encoding method with another in which four additional binary digits represent one of the unavailable plots. The goal of this paper is to demonstrate how the single-objective binary GA can be applied and how the wind turbines are distributed with best fitness under various conditions. The main emphasis of the discussion is on the scenario of wind direction varying from 0° to 45°. Results show that choosing the appropriate positions of wind turbines is more significant than choosing the number of turbines, since the former has a bigger influence on overall farm fitness than the latter. The farm achieves its best fitness values, farm efficiency, and total power with wind directions between 20° and 30°.
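To make the first encoding method concrete, here is a toy single-objective binary GA in which each bit marks whether a grid cell holds a turbine. The fitness function (unit power per turbine minus a penalty for adjacent turbines standing in each other's wake) is a deliberately crude stand-in for the wake models used in real wind farm studies:

```python
import random

def ga_layout(fitness, n_cells, pop_size=30, n_gen=60, p_mut=0.05, seed=2):
    """Toy single-objective binary GA: each bit marks whether a grid
    cell holds a turbine (the paper's first encoding method)."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(n_cells)] for _ in range(pop_size)]
    for _ in range(n_gen):
        scored = sorted(pop, key=fitness, reverse=True)
        pop = scored[:2]  # elitism: carry the two best forward
        while len(pop) < pop_size:
            a, b = rng.sample(scored[:10], 2)  # mate among the fittest
            cut = rng.randrange(1, n_cells)    # one-point crossover
            child = a[:cut] + b[cut:]
            for j in range(n_cells):
                if rng.random() < p_mut:
                    child[j] = 1 - child[j]    # bit-flip mutation
            pop.append(child)
    return max(pop, key=fitness)

def farm_fitness(bits):
    """Illustrative objective: one power unit per turbine, minus a wake
    penalty of 0.6 for each pair of turbines in adjacent cells."""
    power = sum(bits)
    wake = sum(0.6 for j in range(len(bits) - 1) if bits[j] and bits[j + 1])
    return power - wake
```

Even on this one-dimensional toy, the GA discovers sparse layouts that trade turbine count against wake losses, echoing the paper's finding that turbine position matters more than turbine number.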
Abstract:
Structural identification (St-Id) can be considered as the process of updating a finite element (FE) model of a structural system to match the measured response of the structure. This paper presents the St-Id of a laboratory-based steel through-truss cantilevered bridge with suspended span. There are a total of 600 degrees of freedom (DOFs) in the superstructure plus additional DOFs in the substructure. The St-Id of the bridge model used the modal parameters from a preliminary modal test in the objective function of a global optimisation technique using a layered genetic algorithm with patternsearch step (GAPS). Each layer of the St-Id process involved grouping of the structural parameters into a number of updating parameters and running parallel optimisations. The number of updating parameters was increased at each layer of the process. In order to accelerate the optimisation and ensure improved diversity within the population, a patternsearch step was applied to the fittest individuals at the end of each generation of the GA. The GAPS process was able to replicate the mode shapes for the first two lateral sway modes and the first vertical bending mode to a high degree of accuracy and, to a lesser degree, the mode shape of the first lateral bending mode. The mode shape and frequency of the torsional mode did not match very well. The frequencies of the first lateral bending mode, the first longitudinal mode and the first vertical mode matched very well. The frequency of the first sway mode was lower and that of the second sway mode was higher than the true values, indicating a possible problem with the FE model. Improvements to the model and the St-Id process will be presented at the upcoming conference and compared to the results presented in this paper. These improvements will include the use of multiple FE models in a multi-layered, multi-solution, GAPS St-Id approach.
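The "patternsearch step" applied to the fittest individuals is, in its simplest form, a compass search: poll a fixed step along each coordinate direction and halve the step when no move improves the objective. A minimal sketch follows, exercised on an arbitrary quadratic rather than the modal-updating objective of the paper:

```python
def pattern_search(f, x, step=0.5, tol=1e-6):
    """Compass/pattern-search sketch for minimising f: poll +/-step along
    each coordinate, accept the first improving move, and shrink the
    step when a full poll fails."""
    fx = f(x)
    while step > tol:
        improved = False
        for j in range(len(x)):
            for d in (step, -step):
                trial = list(x)
                trial[j] += d
                ft = f(trial)
                if ft < fx:
                    x, fx, improved = trial, ft, True
                    break
        if not improved:
            step *= 0.5  # refine the mesh and poll again
    return x, fx
```

Applied to the fittest GA individuals each generation, as in GAPS, this derivative-free local refinement accelerates convergence and injects diversity that the GA alone may lack.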
Abstract:
The Bruneau-Jarbidge eruptive center (BJEC) in the central Snake River Plain, Idaho, USA consists of the Cougar Point Tuff (CPT), a series of ten high-temperature (900-1000°C), voluminous ignimbrites produced over the explosive phase of volcanism (12.8-10.5 Ma) and more than a dozen equally high-temperature rhyolite lava flows produced during the effusive phase (10.5-8 Ma). Spot analyses by ion microprobe of oxygen isotope ratios in 210 zircons demonstrate that all of the eruptive units of the BJEC are characterized by zircon δ¹⁸O values ≤ 2.5‰, thus documenting the largest low δ¹⁸O silicic volcanic province known on Earth (>10⁴ km³). There is no evidence for voluminous normal δ¹⁸O magmatism at the BJEC that precedes generation of low δ¹⁸O magmas as there is at other volcanic centers that generate low δ¹⁸O magmas such as Heise and Yellowstone. At these younger volcanic centers of the hotspot track, such low δ¹⁸O magmas represent ~45% and ~20%, respectively, of total eruptive volumes. Zircons in all BJEC tuffs and lavas studied (23 units) document strong δ¹⁸O depletion (median CPT δ¹⁸OZrc = 1.0‰, post-CPT lavas = 1.5‰) with the third member of the CPT recording an excursion to minimum δ¹⁸O values (δ¹⁸OZrc = -1.8‰) in a supereruption > 2‰ lower than other voluminous low δ¹⁸O rhyolites known worldwide (δ¹⁸OWR ≤ 0.9 vs. 3.4‰). Subsequent units of the CPT and lavas record a progressive recovery in δ¹⁸OZrc to ~2.5‰ over a ~4 m.y. interval (12 to 8 Ma). We present detailed evidence of unit-to-unit systematic patterns in O isotopic zoning in zircons (i.e. direction and magnitude of Δcore-rim), spectrum of δ¹⁸O in individual units, and zircon inheritance patterns established by re-analysis of spots for U-Th-Pb isotopes by LA-ICPMS and SHRIMP. In conjunction with mineral thermometry and magma compositions, these patterns are difficult to reconcile with the well-established model for "cannibalistic" low δ¹⁸O magma genesis at Heise and Yellowstone. 
We present an alternative model for the central Snake River Plain using the modeling results of Leeman et al. (2008) for ¹⁸O depletion as a function of depth in a mid-upper crustal protolith that was hydrothermally altered by infiltrating meteoric waters prior to the onset of silicic magmatism. The model proposes that BJEC silicic magmas were generated in response to the propagation of a melting front, driven by the incremental growth of a vast underlying mafic sill complex, over a ~5 m.y. interval through a crustal volume in which a vertically asymmetric δ¹⁸OWR gradient had previously developed that was sharply inflected from ~ -1 to 10‰ at mid-upper crustal depths. Within the context of the model, data from BJEC zircons are consistent with incremental melting and mixing events in roof zones of magma reservoirs that accompany surfaceward advance of the coupled mafic-silicic magmatic system.
Abstract:
In this paper, we develop and validate a new Statistically Assisted Fluid Registration Algorithm (SAFIRA) for brain images. A non-statistical version of this algorithm was first implemented in [2] and re-formulated using Lagrangian mechanics in [3]. Here we extend this algorithm to 3D: given 3D brain images from a population, vector fields and their corresponding deformation matrices are computed in a first round of registrations using the non-statistical implementation. Covariance matrices for both the deformation matrices and the vector fields are then obtained and incorporated (separately or jointly) in the regularizing (i.e., the non-conservative Lagrangian) terms, creating four versions of the algorithm. We evaluated the accuracy of each algorithm variant using the manually labeled LPBA40 dataset, which provides us with ground truth anatomical segmentations. We also compared the power of the different algorithms using tensor-based morphometry (a technique to analyze local volumetric differences in brain structure) applied to 46 3D brain scans from healthy monozygotic twins.
Abstract:
The increasing integration of Renewable Energy Resources (RER) and the role of Electric Energy Storage (EES) in distribution systems have created interest in energy management strategies. EES has become a suitable resource for managing energy consumption and generation in the smart grid. Optimized scheduling of EES can also maximize a retailer's profit by introducing energy time-shift opportunities. This paper proposes a new strategy for scheduling EES in order to reduce the impact of electricity market price and load uncertainty on retailers' profit. The proposed strategy optimizes the cost of purchasing energy with the objective of minimizing surplus energy cost in a hedging contract. A case study demonstrates the impact of the proposed strategy on retailers' financial benefit.
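The abstract leaves the scheduling mechanics implicit; a deterministic greedy arbitrage sketch shows the basic time-shift idea (charge when energy is cheap, discharge when it is dear). It ignores round-trip losses, charge-before-discharge ordering, and the price and load uncertainty that motivates the paper's hedging strategy, so treat it as an illustrative baseline only:

```python
def schedule_storage(prices, capacity, rate):
    """Greedy price-arbitrage sketch for an EES unit.

    Charges at `rate` during the cheapest hours and discharges during
    the dearest ones, up to one full `capacity` cycle. Returns per-hour
    actions (+rate = charge, -rate = discharge) and the profit relative
    to doing nothing.
    """
    hours = sorted(range(len(prices)), key=lambda h: prices[h])
    n_slots = int(capacity / rate)            # hours needed for a full cycle
    charge_hours = set(hours[:n_slots])       # cheapest hours -> buy
    discharge_hours = set(hours[-n_slots:])   # dearest hours -> sell
    actions, profit = [], 0.0
    for h, p in enumerate(prices):
        if h in charge_hours:
            actions.append(rate)
            profit -= rate * p                # pay to charge
        elif h in discharge_hours:
            actions.append(-rate)
            profit += rate * p                # earn by discharging
        else:
            actions.append(0.0)
    return actions, profit
```

A retailer-facing formulation like the paper's would replace the known price vector with price and load scenarios and optimize expected profit subject to the hedging contract, typically via stochastic programming rather than this greedy rule.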
Abstract:
The development of Electric Energy Storage (EES) integrated with Renewable Energy Resources (RER) has increased the use of optimum scheduling strategies in distribution systems. Optimum scheduling of EES can reduce the cost of energy purchased by retailers while improving reliability for customers in the distribution system. This paper proposes an optimum scheduling strategy for EES and evaluates its impact on the reliability of a distribution system. A case study shows the impact of the proposed strategy on the reliability indices of a distribution system.
Abstract:
Purpose The purpose of this study is to compare quality perceptions of virtual servicescapes and physical service encounters among buyers and renters of real estate. Design/methodology/approach Qualitative data from a sample of 27 professionals engaged in higher education in the USA are gathered by recorded interview before being transcribed and imported into MAXQDA 2007 software for analytical coding. Findings Particular differences are found to exist between renters and buyers with regard to specific service attributes – for example, description of properties and type of visuals during the pre‐purchase stage, knowledge/experience and honest behavior of realtors during the service encounter stage and a continuous relationship with the realtor in the post‐encounter stage. Research limitations/implications Generalization of the results is limited because the study utilizes data from only one industry (real estate) and from only one demographic segment (professionals in higher education). Practical implications Real‐estate firms need to pay attention to both the training of agents and the design and content of their websites. Originality/value This paper contributes to knowledge regarding virtual servicescapes in professional services.