990 results for Architecture, Domestic -- Australia -- Designs and plans


Relevance:

100.00%

Publisher:

Abstract:

This thesis deals with two problems. The first is the determination of λ-designs, combinatorial configurations which are essentially symmetric block designs with the condition that each subset be of the same cardinality negated. We construct an infinite family of such designs from symmetric block designs and obtain some basic results about their structure. These results enable us to solve the problem for λ = 3 and λ = 4. The second problem deals with configurations related to both λ-designs and (v, k, λ)-configurations. We have (n-1) k-subsets of {1, 2, ..., n}, S_1, ..., S_{n-1}, such that S_i ∩ S_j is a λ-set for i ≠ j. We obtain the replication numbers of such a design in terms of n, k, and λ, with one exceptional class which we determine explicitly. In certain special cases we settle the problem entirely.
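The pairwise intersection condition at the heart of this abstract is easy to check computationally. The following is a minimal sketch (not from the thesis) that tests whether every two distinct sets in a family meet in exactly λ points, illustrated on the Fano plane, the symmetric (7, 3, 1)-design in which every two lines share exactly one point:

```python
from itertools import combinations

def pairwise_intersection_sizes(family):
    """Return the set of |Si ∩ Sj| over all distinct pairs in the family."""
    return {len(a & b) for a, b in combinations(family, 2)}

def is_lambda_family(family, lam):
    """True if every pair of distinct sets meets in exactly lam points."""
    return pairwise_intersection_sizes(family) == {lam}

# Fano plane: a symmetric (7, 3, 1)-design; every two lines meet in one point.
fano = [frozenset(s) for s in
        [{1, 2, 3}, {1, 4, 5}, {1, 6, 7},
         {2, 4, 6}, {2, 5, 7}, {3, 4, 7}, {3, 5, 6}]]
print(is_lambda_family(fano, 1))  # True
```

Note that the Fano plane has all blocks of the same size, so it is a symmetric design rather than a λ-design proper; a λ-design would satisfy the same intersection test while containing blocks of differing cardinalities.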

Relevance:

100.00%

Publisher:

Abstract:

This paper advances the proposition that in many electronic products, the partitioning scheme adopted and the interconnection system used to interconnect the sub-assemblies or components are intimately related to the economic benefits, and hence the attractiveness, of reuse of these items. An architecture has been developed in which the residual values of the connectors, components and sub-assemblies are maximized, and opportunities for take-back and reuse of redundant items are greatly enhanced. The system described also offers significant manufacturing cost benefits in terms of ease of assembly, compactness and robustness.

Relevance:

100.00%

Publisher:

Abstract:

Domestic cats and dogs are important companion animals and model animals in biomedical research. The cat has a highly conserved karyotype, closely resembling the ancestral karyotype of mammals, while the dog has one of the most extensively rearranged mammalian karyotypes investigated so far. We have constructed the first detailed comparative chromosome map of the domestic dog and cat by reciprocal chromosome painting. Dog paints specific for the 38 autosomes and the X chromosomes delineated 68 conserved chromosomal segments in the cat, while reverse painting of cat probes onto red fox and dog chromosomes revealed 65 conserved segments. Most conserved segments on cat chromosomes also show a high degree of conservation in G-banding patterns compared with their canine counterparts. At least 47 chromosomal fissions (breaks), 25 fusions and one inversion are needed to convert the cat karyotype to that of the dog, confirming that extensive chromosome rearrangements differentiate the karyotypes of the cat and dog. Comparative analysis of the distribution patterns of conserved segments defined by dog paints on cat and human chromosomes has refined the human/cat comparative genome map and, most importantly, has revealed 15 cryptic inversions in seven large chromosomal regions of conserved synteny between humans and cats.

Relevance:

100.00%

Publisher:

Abstract:

We have made a complete set of painting probes for the domestic horse by degenerate oligonucleotide-primed PCR amplification of flow-sorted horse chromosomes. The horse probes, together with a full set of those available for human, were hybridized onto metaphase chromosomes of human, horse and mule. Based on the hybridization results, we have generated genome-wide comparative chromosome maps involving the domestic horse, donkey and human. These maps define the overall distribution and boundaries of evolutionarily conserved chromosomal segments in the three genomes. Our results shed further light on the karyotypic relationships among these species and, in particular, the chromosomal rearrangements that underlie hybrid sterility and the occasional fertility of mules.

Relevance:

100.00%

Publisher:

Abstract:

It is estimated that the quantity of digital data being transferred, processed or stored at any one time currently stands at 4.4 zettabytes (4.4 × 2^70 bytes), and this figure is expected to grow by a factor of 10, to 44 zettabytes, by 2020. Exploiting this data is, and will remain, a significant challenge. At present there is the capacity to store 33% of the digital data in existence at any one time; by 2020 this capacity is expected to fall to 15%. These statistics suggest that, in the era of Big Data, the identification of important, exploitable data will need to be done in a timely manner. Systems for the monitoring and analysis of data, e.g. stock markets, smart grids and sensor networks, can be made up of massive numbers of individual components. These components can be geographically distributed yet may interact with one another via continuous data streams, which in turn may affect the state of the sender or receiver. This introduces a dynamic causality, which further complicates the overall system by introducing a temporal constraint that is difficult to accommodate. Practical approaches to realising such systems have led to a multiplicity of analysis techniques, each of which concentrates on specific characteristics of the system being analysed and treats these characteristics as the dominant component affecting the results being sought. This multiplicity of techniques introduces another layer of heterogeneity, that is, heterogeneity of approach, partitioning the field to the extent that results from one domain are difficult to exploit in another. This raises the question: can a generic solution for the monitoring and analysis of data be identified that accommodates temporal constraints, bridges the gap between expert knowledge and raw data, and enables data to be effectively interpreted and exploited in a transparent manner?
The approach proposed in this dissertation acquires, analyses and processes data in a manner that is free of the constraints of any particular analysis technique, while at the same time facilitating these techniques where appropriate. Constraints are applied by defining a workflow based on the production, interpretation and consumption of data. This supports the application of different analysis techniques to the same raw data without the danger of incorporating hidden bias. To illustrate and realise this approach, a software platform has been created that allows for the transparent analysis of data, combining analysis techniques with a maintainable record of provenance so that independent third-party analysis can be applied to verify any derived conclusions. To demonstrate these concepts, a complex real-world example involving the near real-time capture and analysis of neurophysiological data from a neonatal intensive care unit (NICU) was chosen. A system was engineered to gather raw data, analyse that data using different analysis techniques, uncover information, incorporate that information into the system and curate the evolution of the discovered knowledge. The application domain was chosen for three reasons: firstly, because it is complex and no comprehensive solution exists; secondly, because it requires tight interaction with domain experts, and thus the handling of subjective knowledge and inference; and thirdly, because, given the dearth of neurophysiologists, there is a real-world need to provide a solution for this domain.
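The provenance idea described above, recording which analysis produced which result from which raw data so that a third party can independently re-run and verify it, can be sketched minimally as follows. This is not the dissertation's platform; the class and method names are hypothetical, and content hashing stands in for whatever identifier scheme the real system uses:

```python
import hashlib
import json

def fingerprint(obj):
    """Content hash linking derived results back to their raw inputs."""
    return hashlib.sha256(json.dumps(obj, sort_keys=True).encode()).hexdigest()[:12]

class ProvenanceStore:
    """Toy record of which analysis produced which result from which data."""
    def __init__(self):
        self.records = []

    def run(self, name, analysis, raw):
        """Apply an analysis and log (analysis, input, output) fingerprints."""
        result = analysis(raw)
        self.records.append({
            "analysis": name,
            "input": fingerprint(raw),
            "output": fingerprint(result),
        })
        return result

    def verify(self, analysis, raw, claimed):
        """Independent re-run: does the analysis reproduce the claimed result?"""
        return fingerprint(analysis(raw)) == fingerprint(claimed)

store = ProvenanceStore()
raw = [3.1, 2.9, 3.4, 3.0]
mean = store.run("mean", lambda xs: sum(xs) / len(xs), raw)
print(store.verify(lambda xs: sum(xs) / len(xs), raw, mean))  # True
```

Because verification only needs the raw data and the named analysis, any third party holding the provenance record can reproduce and check the derived conclusion, which is the transparency property the abstract emphasises.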

Relevance:

100.00%

Publisher:

Abstract:

This short position paper considers issues in developing a Data Architecture for the Internet of Things (IoT) through the medium of an exemplar project, Domain Expertise Capture in Authoring and Development Environments (DECADE). A brief discussion sets the background for IoT and the development of the distinction between things and computers. The paper makes a strong argument to avoid reinventing the wheel: to reuse approaches to distributed heterogeneous data architectures and the lessons learned from that work, and to apply them to this situation. DECADE requires an autonomous recording system, local data storage, a semi-autonomous verification model, a sign-off mechanism, and qualitative and quantitative analysis carried out when and where required through a web-service architecture, based on ontology and analytic agents, with a self-maintaining ontology model. To develop this, we describe a web-service architecture combining a distributed data warehouse, web services for analysis agents, ontology agents and a verification engine, with a centrally verified outcome database maintained by a certifying body for qualification/professional status.
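The agent-plus-verification pattern the paper describes can be sketched in a few lines. Everything here is hypothetical illustration, not the DECADE implementation: the class names, the "edit-rate" metric and its threshold are invented to show how an analysis agent's report might pass through a sign-off step before entering an outcome database:

```python
class AnalysisAgent:
    """Hypothetical analysis agent, as might sit behind a web-service endpoint."""
    def __init__(self, name, fn):
        self.name, self.fn = name, fn

    def analyse(self, record):
        return {"agent": self.name, "result": self.fn(record)}

class VerificationEngine:
    """Semi-autonomous sign-off check before a result enters the outcome database."""
    def __init__(self, threshold):
        self.threshold = threshold

    def sign_off(self, report):
        return report["result"] >= self.threshold

# Invented session record from an autonomous recording system.
session = {"keystrokes": 420, "edits": 37, "reviews": 3}
quant = AnalysisAgent("edit-rate", lambda r: r["edits"] / r["keystrokes"])
report = quant.analyse(session)
verifier = VerificationEngine(threshold=0.05)
print(verifier.sign_off(report))  # 37/420 ≈ 0.088, above threshold, so True
```

In the architecture the paper proposes, many such agents (quantitative, qualitative, ontology-driven) would run as independent web services over the distributed data warehouse, with only signed-off outcomes reaching the centrally verified database.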

Relevance:

100.00%

Publisher:

Abstract:

Largely used as a natural biological tag in studies of dispersal/connectivity of fish, otolith elemental fingerprinting is usually analyzed by laser ablation-inductively coupled plasma-mass spectrometry (LA-ICP-MS). LA-ICP-MS produces an elemental fingerprint at a discrete time-point in the life of a fish and can generate data on within-otolith variability of that fingerprint. The presence of within-otolith variability has been previously acknowledged but not incorporated into experimental designs on the presumed, but untested, grounds of both its negligibility compared to among-otolith variability and of spatial autocorrelation among multiple ablations within an otolith. Here, using a hierarchical sampling design of spatial variation at multiple scales in otolith chemical fingerprints for two Mediterranean coastal fishes, we explore: 1) whether multiple ablations within an otolith can be used as independent replicates for significance tests among otoliths, and 2) the implications of incorporating within-otolith variability when assessing spatial variability in otolith chemistry at a hierarchy of spatial scales (different fish, from different sites, at different locations on the Apulian Adriatic coast). We find that multiple ablations along the same daily rings do not necessarily exhibit spatial dependency within the otolith and can be used to estimate residual variability in a hierarchical sampling design. Inclusion of within-otolith measurements reveals that individuals at the same site can show significant variability in elemental uptake. Within-otolith variability examined across the spatial hierarchy identifies differences between the two fish species investigated, and this finding leads to discussion of the potential for within-otolith variability to be used as a marker for fish exposure to stressful conditions. 
We also demonstrate that a 'cost'-optimal allocation of sampling effort should typically include some level of within-otolith replication in the experimental design. Our findings provide novel evidence to aid the design of future sampling programs and improve our general understanding of the mechanisms regulating elemental fingerprints.
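The partitioning of within-otolith versus among-otolith variability that drives this design question can be illustrated with a standard one-way random-effects variance-component calculation. The sketch below is not the authors' analysis; the Sr/Ca values are invented, and the balanced two-level layout (several ablations per otolith) stands in for the full spatial hierarchy of fish, sites and locations:

```python
from statistics import mean

def variance_components(groups):
    """One-way random-effects ANOVA: among- vs within-otolith variance.

    groups: replicate measurements (ablations) per otolith; assumed
    balanced, i.e. the same number of ablations on every otolith.
    """
    a = len(groups)       # number of otoliths
    n = len(groups[0])    # ablations per otolith
    grand = mean(x for g in groups for x in g)
    # Within-group mean square: residual (within-otolith) variance.
    ms_within = sum(sum((x - mean(g)) ** 2 for x in g)
                    for g in groups) / (a * (n - 1))
    # Among-group mean square, converted to the among-otolith component.
    ms_among = n * sum((mean(g) - grand) ** 2 for g in groups) / (a - 1)
    return {"within": ms_within,
            "among": max((ms_among - ms_within) / n, 0.0)}

# Invented Sr/Ca ratios: 3 ablations on each of 3 otoliths.
otoliths = [[2.1, 2.0, 2.2], [2.6, 2.5, 2.7], [1.9, 2.0, 2.1]]
vc = variance_components(otoliths)
print(vc["among"] > vc["within"])  # True for this invented data
```

Estimating both components is exactly what multiple ablations per otolith buy you: with a single ablation per otolith the within-otolith term is confounded with measurement error, which is why the abstract argues that a cost-optimal design should typically include some within-otolith replication.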