996 results for Mixture design
Abstract:
The β-strand conformation is unknown for short peptides in aqueous solution, yet it is a fundamental building block in proteins and the crucial recognition motif for proteolytic enzymes that enable formation and turnover of all proteins. To create a generalized scaffold as a peptidomimetic that is preorganized in a β-strand, we individually synthesized a series of 15-22-membered macrocyclic analogues of tripeptides and analyzed their structures. Each cycle is highly constrained by two trans amide bonds and a planar aromatic ring with a short nonpeptidic linker between them. A measure of this ring strain is the restricted rotation of the component tyrosinyl aromatic ring (ΔG(rot) 76.7 kJ mol⁻¹ for the 16-membered ring, 46.1 kJ mol⁻¹ for the 17-membered ring) evidenced by variable-temperature ¹H NMR spectra (DMF-d7, 200-400 K). Unusually large amide coupling constants (³J(NH-CHα) 9-10 Hz) corresponding to large dihedral angles were detected in both protic and aprotic solvents for these macrocycles, consistent with a high degree of structure in solution. The temperature dependence of all amide NH chemical shifts (Δδ/ΔT 7-12 ppb/deg) precluded the presence of transannular hydrogen bonds that define alternative turn structures. Whereas similar-sized conventional cyclic peptides usually exist in solution as an equilibrium mixture of multiple conformers, these macrocycles adopt a well-defined β-strand structure even in water, as revealed by 2-D NMR spectral data and by a structure calculation for the smallest (15-membered) and most constrained macrocycle. Macrocycles that are sufficiently constrained to exclusively adopt a β-strand-mimicking structure in water may be useful pre-organized and generic templates for the design of compounds that interfere with β-strand recognition in biology.
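The abstract reports the barriers but not how they were extracted; assuming the standard coalescence analysis of the VT-NMR spectra (an assumption on our part, not stated in the abstract), the Eyring-based working equation would be:

```latex
% Rate of exchange at the coalescence temperature T_c, for a slow-exchange
% peak separation \Delta\nu (Hz):  k_c = \pi \Delta\nu / \sqrt{2}.
% Substituting k_c into the Eyring equation gives the rotational barrier:
\Delta G_{\mathrm{rot}}^{\ddagger} = R\,T_c\!\left[\,22.96 + \ln\!\left(\frac{T_c}{\Delta\nu}\right)\right]
```

On this reading, the 30.6 kJ mol⁻¹ gap between the 16- and 17-membered rings reflects the substantially greater strain of the smaller cycle.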
Abstract:
Using benthic habitat data from the Florida Keys (USA), we demonstrate how siting algorithms can help identify potential networks of marine reserves that comprehensively represent target habitat types. We applied a flexible optimization tool, simulated annealing, to represent a fixed proportion of different marine habitat types within a geographic area. We investigated the relative influence of spatial information, planning-unit size, detail of habitat classification, and magnitude of the overall conservation goal on the resulting network scenarios. With this method, we were able to identify many adequate reserve systems that met the conservation goals, e.g., representing at least 20% of each conservation target (i.e., habitat type) while fulfilling the overall aim of minimizing the system area and perimeter. One of the most useful types of information provided by this siting algorithm comes from an irreplaceability analysis, which is a count of the number of times unique planning units were included in reserve system scenarios. This analysis indicated that many different combinations of sites produced networks that met the conservation goals. While individual 1-km² areas were fairly interchangeable, the irreplaceability analysis highlighted larger areas within the planning region that were chosen consistently to meet the goals incorporated into the algorithm. Additionally, we found that reserve systems designed with a high degree of spatial clustering tended to have considerably less perimeter and larger overall areas in reserve, a configuration that may be preferable particularly for sociopolitical reasons. This exercise illustrates the value of using the simulated annealing algorithm to help site marine reserves: the approach makes efficient use of available resources, can be used interactively by conservation decision makers, and offers biologically suitable alternative networks from which an effective system of marine reserves can be crafted.
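As a rough illustration of the siting approach (a minimal sketch, not the authors' implementation; the data layout, penalty weight, and cooling schedule are all assumptions):

```python
import math
import random

# Minimal simulated-annealing sketch for reserve siting (illustrative only).
# units: {id: {"area": float, "habitats": {habitat: amount}}}
# boundaries: {(id_a, id_b): shared boundary length between neighbouring units}
# targets: {habitat: minimum amount to represent, e.g. 20% of its total}

def objective(selected, units, boundaries, targets, penalty=1000.0):
    """Cost = system area + exposed boundary + penalty for unmet targets."""
    area = sum(units[u]["area"] for u in selected)
    perimeter = sum(length for (a, b), length in boundaries.items()
                    if (a in selected) != (b in selected))
    held = {}
    for u in selected:
        for hab, amt in units[u]["habitats"].items():
            held[hab] = held.get(hab, 0.0) + amt
    shortfall = sum(max(0.0, need - held.get(hab, 0.0))
                    for hab, need in targets.items())
    return area + perimeter + penalty * shortfall

def anneal(units, boundaries, targets, steps=50_000, t0=100.0):
    ids = list(units)
    selected = set(random.sample(ids, max(1, len(ids) // 4)))
    cost = objective(selected, units, boundaries, targets)
    for step in range(steps):
        temp = t0 * (1.0 - step / steps)         # linear cooling
        trial = selected ^ {random.choice(ids)}  # flip one unit in or out
        trial_cost = objective(trial, units, boundaries, targets)
        # always accept improvements; accept worse moves with Boltzmann probability
        if (trial_cost < cost
                or random.random() < math.exp((cost - trial_cost) / max(temp, 1e-9))):
            selected, cost = trial, trial_cost
    return selected, cost
```

Re-running anneal() many times and counting how often each planning unit appears in the resulting solutions yields the selection-frequency counts behind the irreplaceability analysis described above.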
Abstract:
Like many states and territories, South Australia has a legacy of marine reserves considered to be inadequate to meet current conservation objectives. In this paper we configured exploratory marine reserve systems, using the software MARXAN, to examine how efficiently South Australia's existing marine reserves contribute to quantitative biodiversity conservation targets. Our aim was to compare marine reserve systems that retain South Australia's existing marine reserves with reserve systems that are free to either ignore or incorporate them. We devised a new interpretation of irreplaceability to identify planning units selected more often than could be expected from chance alone. This is measured by comparing the observed selection frequency for an individual planning unit with a predicted selection frequency distribution. Knowing which sites make a valuable contribution to efficient marine reserve system design allows us to determine how well South Australia's existing reserves contribute to reservation goals when representation targets are set at 5, 10, 15, 20, 30 and 50% of conservation features. Existing marine reserves that fail to contribute to efficient marine reserve systems constitute 'opportunity costs'. We found that despite spanning less than 4% of South Australian state waters, locking in the existing ad hoc marine reserves presented considerable opportunity costs. Even with representation targets set at 50%, more than half of South Australia's existing marine reserves were selected no more often than expected by chance in efficient marine reserve systems. Hence, ad hoc marine reserve systems are likely to be inefficient and may compromise effective conservation of marine biodiversity.
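The comparison of observed selection frequency against a chance expectation could be sketched as follows (illustrative only; the abstract does not give the form of the predicted selection frequency distribution, so a simple binomial null is assumed here):

```python
from scipy.stats import binom

# Sketch: is a planning unit selected more often than chance across repeated
# reserve-selection runs? Under the assumed null, a unit enters any one
# solution with probability p = (mean solution size) / (number of units).

def selected_more_than_chance(obs_count, n_runs, mean_solution_size,
                              n_units, alpha=0.05):
    p_null = mean_solution_size / n_units
    # one-sided probability of obs_count or more selections under the null
    p_value = binom.sf(obs_count - 1, n_runs, p_null)
    return p_value < alpha

# hypothetical unit chosen in 72 of 100 runs, solutions averaging 200 of 1000 units
print(selected_more_than_chance(72, 100, 200, 1000))  # True: beyond chance
```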
Abstract:
In spite of their wide application in comminution circuits, hydrocyclones have at least one significant disadvantage in that their operation inherently tends to return the fine denser liberated minerals to the grinding mill. This results in unnecessary overgrinding, which adds to the milling cost and can adversely affect the efficiency of downstream processes. In an attempt to solve this problem, a three-product cyclone has been developed at the Julius Kruttschnitt Mineral Research Centre (JKMRC) to generate a second overflow in which the fine dense liberated minerals can be selectively concentrated for further treatment. In this paper, the design and operation of the three-product cyclone are described. The influence of the length of the second vortex finder on the performance of a 150-mm unit treating a mixture of magnetite and silica is investigated. Conventional cyclone tests were also conducted under similar conditions. Using the operational performance data of the three-product and conventional cyclones, it is shown that by optimising the length of the second vortex finder, the amount of fine dense mineral particles that report to the three-product cyclone underflow can be reduced. In addition, the three-product cyclone can be used to generate a middlings stream that may be more suitable for flash flotation than the conventional cyclone underflow, or, alternatively, could be classified with a microscreen to separate the valuables from the gangue. At the same time, a fines stream having similar properties to those of the conventional overflow can be obtained. Hence, if the middlings stream was used as feed for flash flotation or microscreening, the fines stream could be used in lieu of the conventional overflow without compromising the feed requirements for the conventional flotation circuit. Some of the other potential applications of the new cyclone are described. (C) 2003 Elsevier Science B.V. All rights reserved.
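Classification performance of this kind is conventionally summarised with a partition (Tromp) curve; a minimal sketch of that calculation, with entirely hypothetical stream data, is:

```python
# Partition (Tromp) curve sketch: for each size class, the fraction of feed
# reporting to underflow is U*u_i / (U*u_i + O*o_i), where U and O are the
# solids rates to underflow and overflow and u_i, o_i are the size-class
# mass fractions in each stream. All numbers below are made up.

sizes_um       = [38, 53, 75, 106, 150, 212]   # geometric mean sizes
underflow_frac = [0.05, 0.08, 0.12, 0.20, 0.25, 0.30]
overflow_frac  = [0.30, 0.25, 0.20, 0.15, 0.07, 0.03]
U, O = 40.0, 60.0                               # t/h solids to each stream

for d, u_i, o_i in zip(sizes_um, underflow_frac, overflow_frac):
    recovery = U * u_i / (U * u_i + O * o_i)
    print(f"{d:>4} um: {100 * recovery:5.1f}% of feed reports to underflow")
```

Comparing such curves for the second overflow against the conventional overflow is one way the middlings and fines streams described above could be characterised.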
Abstract:
Computer Science is a subject which has difficulty in marketing itself. Further, pinning down a standard curriculum is difficult: there are many preferences which are hard to accommodate. This paper argues the case that part of the problem is the fact that, unlike more established disciplines, the subject does not clearly distinguish the study of principles from the study of artifacts. This point was raised in Curriculum 2001 discussions, and debate needs to start in good time for the next curriculum standard. This paper provides a starting point for debate by outlining a process by which principles and artifacts may be separated, and presents a sample curriculum to illustrate the possibilities. This sample curriculum has some positive points, though these positive points are incidental to the need to start debating the issue. Other models, with a less rigorous ordering of principles before artifacts, would still gain from making it clearer whether a specific concept was fundamental or a property of a specific technology. (C) 2003 Elsevier Ltd. All rights reserved.
Abstract:
We consider a mixture model approach to the regression analysis of competing-risks data. Attention is focused on inference concerning the effects of factors on both the probability of occurrence and the hazard rate conditional on each of the failure types. These two quantities are specified in the mixture model using the logistic model and the proportional hazards model, respectively. We propose a semi-parametric mixture method to estimate the logistic and regression coefficients jointly, whereby the component-baseline hazard functions are completely unspecified. Estimation is based on maximum likelihood using the full likelihood, implemented via an expectation-conditional maximization (ECM) algorithm. Simulation studies are performed to compare the performance of the proposed semi-parametric method with a fully parametric mixture approach. The results show that when the component-baseline hazard is monotonic increasing, the semi-parametric and fully parametric mixture approaches are comparable for mildly and moderately censored samples. When the component-baseline hazard is not monotonic increasing, the semi-parametric method consistently provides less biased estimates than a fully parametric approach and is comparable in efficiency in the estimation of the parameters for all levels of censoring. The methods are illustrated using a real data set of prostate cancer patients treated with different dosages of the drug diethylstilbestrol. Copyright (C) 2003 John Wiley & Sons, Ltd.
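A sketch of the fully parametric comparator may help fix ideas (the paper's semi-parametric method leaves the component baselines unspecified; here two-component Weibull baselines and all variable names are illustrative assumptions):

```python
import numpy as np
from scipy.optimize import minimize
from scipy.special import expit

# Two-component competing-risks mixture: logistic model for the probability
# of eventual type-1 failure, Weibull proportional-hazards components.
# theta packs: p logistic coefs, p + p PH coefs, then log-shapes and log-rates.

def neg_log_lik(theta, t, delta, cause, X):
    """t: times; delta: 1 if failed, 0 if censored; cause: 1 or 2; X: (n, p)."""
    p = X.shape[1]
    gamma = theta[:p]                                  # logistic coefficients
    b1, b2 = theta[p:2 * p], theta[2 * p:3 * p]        # PH coefficients
    k1, k2 = np.exp(theta[-4]), np.exp(theta[-3])      # Weibull shapes > 0
    l1, l2 = np.exp(theta[-2]), np.exp(theta[-1])      # Weibull rates > 0
    pi1 = expit(X @ gamma)                             # P(type 1 | x)

    def haz_surv(k, l, beta):
        hr = np.exp(X @ beta)
        H = l * t**k * hr                              # cumulative hazard
        h = l * k * t**(k - 1) * hr                    # hazard
        return h, np.exp(-H)

    h1, S1 = haz_surv(k1, l1, b1)
    h2, S2 = haz_surv(k2, l2, b2)
    ll = np.where(
        delta == 0,
        np.log(pi1 * S1 + (1 - pi1) * S2),                   # censored
        np.where(cause == 1,
                 np.log(pi1) + np.log(h1) + np.log(S1),      # type-1 failure
                 np.log(1 - pi1) + np.log(h2) + np.log(S2))) # type-2 failure
    return -ll.sum()

# fit (x0 is a starting vector of length 3*p + 4):
# res = minimize(neg_log_lik, x0, args=(t, delta, cause, X), method="BFGS")
```

The semi-parametric method replaces the Weibull pieces with unspecified baselines estimated within the ECM iterations, which is what buys the robustness to non-monotonic baseline hazards reported above.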
Abstract:
Measurement while drilling (MWD) techniques can provide a useful tool to aid drill and blast engineers in open cut mining. By avoiding time-consuming tasks such as scan-lines and rock sample collection for laboratory tests, MWD techniques can not only save time but also improve the reliability of the blast design by providing the drill and blast engineer with information specifically tailored for this use. While most mines use a standard blast pattern and charge per blasthole, based on a single rock factor for the entire bench or blast region, information derived from the MWD parameters can improve the blast design by providing more accurate rock properties for each individual blasthole. From this, decisions can be made on the most appropriate type and amount of explosive charge to place in each blasthole, or to optimise the inter-hole detonation timing of different decks and blastholes. Where real-time calculations are feasible, the system could extend beyond the present blast design and even be used to determine the placement of subsequent holes, moving towards a more appropriate blasthole pattern design such as asymmetrical blasting.
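The abstract does not name a particular mapping from MWD parameters to rock properties; one widely used possibility is Teale's (1965) specific energy, sketched below with hypothetical drilling values:

```python
import math

# Per-blasthole rock-strength proxy from MWD logs via Teale's specific
# energy (an illustrative choice, not necessarily the authors' method).
# SE = F/A + 2*pi*N*T / (A*u), in J/m^3 (equivalently Pa).

def specific_energy(thrust_N, torque_Nm, rpm, penetration_m_per_min, bit_diameter_m):
    area = math.pi * (bit_diameter_m / 2) ** 2   # bit cross-section, m^2
    rot_speed = rpm / 60.0                       # revolutions per second
    pen_rate = penetration_m_per_min / 60.0      # metres per second
    return thrust_N / area + (2 * math.pi * rot_speed * torque_Nm) / (area * pen_rate)

# hypothetical log entry: 50 kN thrust, 800 N*m torque, 90 rpm,
# 0.6 m/min penetration rate, 229 mm bit
se = specific_energy(50e3, 800.0, 90, 0.6, 0.229)
print(f"specific energy ~ {se / 1e6:.1f} MPa")
```

Logged against depth for every hole, a quantity like this gives the per-blasthole rock characterisation that the charging and timing decisions above would draw on.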
Abstract:
Blasting has been the most frequently used method for rock breakage since black powder was first used to fragment rocks more than two hundred years ago. This paper is an attempt to reassess standard design techniques used in blasting by providing an alternative approach to blast design. The new approach has been termed asymmetric blasting. Based on providing real-time rock recognition through the capacity of measurement while drilling (MWD) techniques, asymmetric blasting is an approach to deal with rock properties as they occur in nature, i.e., randomly and asymmetrically spatially distributed. It is well accepted that the performance of basic mining operations, such as excavation and crushing, relies on a broken rock mass which has been pre-conditioned by the blast. By pre-conditioned we mean well fragmented, sufficiently loose and with an adequate muckpile profile. These muckpile characteristics affect loading and hauling [1]. The influence of blasting does not end there. Under the Mine to Mill paradigm, blasting has a significant leverage on downstream operations such as crushing and milling. There is a body of evidence that blasting affects mineral liberation [2]. Thus, the importance of blasting has increased from simply fragmenting and loosening the rock mass to a broader role that encompasses many aspects of mining, which affects the cost of the end product. A new approach is proposed in this paper which facilitates this trend: 'to treat non-homogeneous media (rock mass) in a non-homogeneous manner (an asymmetrical pattern) in order to achieve an optimal result (in terms of muckpile size distribution).' It is postulated that there are no logical reasons (besides the current lack of means to infer rock mass properties in the blind zones of the bench, and onsite precedents) for drilling a regular blast pattern over a rock mass that is inherently heterogeneous. Real and theoretical examples of such a method are presented.
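As a bare illustration of treating the rock mass non-homogeneously (the linear scaling rule and every number below are assumptions for the sketch, not the authors' design method), the per-hole charge could follow an MWD-derived rock index instead of one bench-wide value:

```python
# Per-hole charging from a per-hole rock-strength index (illustrative rule:
# powder factor in kg/m^3 scales linearly with the index).

def charge_per_hole(rock_index, burden_m, spacing_m, bench_height_m,
                    base_powder_factor=0.6, reference_index=50.0):
    """Explosive mass (kg) for one blasthole."""
    volume = burden_m * spacing_m * bench_height_m   # rock volume per hole, m^3
    powder_factor = base_powder_factor * rock_index / reference_index
    return volume * powder_factor

# harder rock (index 70) vs softer rock (index 35) on the same 5 x 6 x 10 m layout
print(charge_per_hole(70, 5, 6, 10))   # -> 252.0 kg
print(charge_per_hole(35, 5, 6, 10))   # -> 126.0 kg
```

Extending the same idea to hole placement, rather than only charging, is what distinguishes the asymmetric patterns proposed in the paper from a regular grid.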