54 results for Header
Abstract:
The University's thesis regulations give very specific guidance about margins and page layout. This manual explains how to set up margins to work with double-sided printing; how to ensure chapters start on the right-hand page; and how to create running headers and footers in a thesis written in MS Word 2011.
Abstract:
The University's thesis regulations give very specific guidance about margins and page layout. This manual explains how to set up margins to work with double-sided printing; how to ensure chapters start on the right-hand page; and how to create running headers and footers in a thesis written in MS Word 2010.
Abstract:
This video shows how to create a landscape section in a Word 2010 file. Users of Word 2013 will find the steps almost identical. Users of Word 2011 for Mac will also find this useful; the corresponding steps can be found in the Word 2011 Sections manual. See how to: • Create a landscape section • Set up margins for the landscape section • Create headers and footers that retain the orientation of the text in the portrait section
Abstract:
The University's thesis regulations give very specific guidance about margins and page layout. This manual explains how to set up margins to work with double-sided printing; how to ensure chapters start on the right-hand page; and how to create running headers and footers in a thesis written in MS Word 2013.
Abstract:
The Bloom filter is a space-efficient randomized data structure for representing a set and supporting membership queries. Bloom filters intrinsically allow false positives. However, the space savings they offer outweigh this disadvantage if the false positive rates are kept sufficiently low. Inspired by the recent application of the Bloom filter in a novel multicast forwarding fabric, this paper proposes a variant of the Bloom filter, the optihash. The optihash optimizes the false positive rate at the stage of filter formation, using the same amount of space as the classic Bloom filter at the cost of slightly more processing. Bloom filters are often used in situations where a fixed amount of space is the primary constraint. We present the optihash as a good alternative to Bloom filters, since the amount of space is the same and the improvement in false positives can justify the additional processing. Specifically, we show via simulations and numerical analysis that, using the optihash, false positive occurrences can be reduced and controlled at the cost of a small amount of additional processing. The simulations are carried out for in-packet forwarding. In this framework, the Bloom filter is used as a compact link/route identifier and is placed in the packet header to encode the route. At each node, the Bloom filter is queried for membership in order to make forwarding decisions. A false positive in the forwarding decision translates into packets forwarded along an unintended outgoing link. By using the optihash, false positives can be reduced. The optimization processing is carried out in an entity termed the Topology Manager, which is part of the control plane of the multicast forwarding fabric. This processing is only carried out on a per-session basis, not for every packet. The aim of this paper is to present the optihash and evaluate its false positive performance via simulations, in order to measure the influence of different parameters on the false positive rate. The false positive rate of the optihash is then compared with the false positive probability of the classic Bloom filter.
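For reference, the following is a minimal sketch of the classic Bloom filter described above (not of the proposed optihash, whose construction is specific to the paper). The bit-array size, number of hashes, and the SHA-256-based position derivation are illustrative assumptions; the usage example mirrors the in-packet forwarding scenario, where the filter encodes the set of outgoing links of a route.

```python
import hashlib
import math

class BloomFilter:
    """Minimal classic Bloom filter: an m-bit array probed by k hash positions."""

    def __init__(self, m_bits: int, k_hashes: int):
        self.m = m_bits
        self.k = k_hashes
        self.bits = [False] * m_bits

    def _positions(self, item: str):
        # Derive k bit positions from one SHA-256 digest (illustrative only; k <= 8 here).
        digest = hashlib.sha256(item.encode()).digest()
        for i in range(self.k):
            chunk = digest[4 * i:4 * i + 4]
            yield int.from_bytes(chunk, "big") % self.m

    def add(self, item: str) -> None:
        for pos in self._positions(item):
            self.bits[pos] = True

    def might_contain(self, item: str) -> bool:
        # True may be a false positive; False is always correct.
        return all(self.bits[pos] for pos in self._positions(item))

def false_positive_rate(m: int, k: int, n: int) -> float:
    """Classic approximation (1 - e^(-kn/m))^k after inserting n elements."""
    return (1.0 - math.exp(-k * n / m)) ** k

# Usage: encode a route as the set of its outgoing-link identifiers.
bf = BloomFilter(m_bits=256, k_hashes=4)
for link in ["linkA", "linkB", "linkC"]:
    bf.add(link)
print(bf.might_contain("linkA"))       # True
print(bf.might_contain("linkZ"))       # usually False; True would be a false positive
print(false_positive_rate(256, 4, 3))  # expected false positive rate for these parameters
```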
Abstract:
Graduate program in Agronomy (Energy in Agriculture) - FCA
Abstract:
Graduate program in Agronomy (Soil Science) - FCAV
Abstract:
Graduate program in Electrical Engineering - FEIS
Abstract:
The Internet of Things is a new paradigm in which smart embedded devices and systems are connected to the Internet. In this context, Wireless Sensor Networks (WSN) are becoming an important alternative for sensing and actuating in critical applications such as industrial automation, remote patient monitoring and domotics. The IEEE 802.15.4 protocol has been adopted as a standard for WSN, and the 6LoWPAN protocol has been proposed to overcome the challenges of integrating WSN and Internet protocols. In this paper, the header compression and fragmentation mechanisms for IPv6 datagrams proposed in the 6LoWPAN standard were evaluated through field experiments using a gateway prototype and IEEE 802.15.4 nodes.
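As a rough illustration of the fragmentation mechanism evaluated above, the sketch below estimates how many 6LoWPAN fragments an IPv6 datagram needs. The 4-byte FRAG1 and 5-byte FRAGN header sizes and the 8-octet offset granularity follow RFC 4944; the 81-byte per-frame payload is only an assumed figure, since the space actually available in an IEEE 802.15.4 frame depends on addressing and security overhead, and the paper's experimental setup may differ.

```python
FRAG1_HEADER = 4   # first-fragment header size in bytes (RFC 4944)
FRAGN_HEADER = 5   # subsequent-fragment header size in bytes (RFC 4944)

def fragment_count(datagram_size: int, link_payload: int = 81) -> int:
    """Estimate how many 6LoWPAN fragments an IPv6 datagram needs.

    link_payload is the space left in an IEEE 802.15.4 frame after the MAC
    header and security fields; 81 bytes is an illustrative assumption.
    Fragment payloads (except the last) must be multiples of 8 octets.
    """
    if datagram_size <= link_payload:
        return 1  # fits in a single frame, no fragmentation header needed

    # First fragment: payload rounded down to a multiple of 8 octets.
    first_payload = (link_payload - FRAG1_HEADER) // 8 * 8
    # Subsequent fragments carry the 5-byte FRAGN header instead.
    next_payload = (link_payload - FRAGN_HEADER) // 8 * 8

    remaining = datagram_size - first_payload
    # Ceiling division for the remaining bytes.
    return 1 + -(-remaining // next_payload)

# Example: a 1280-byte IPv6 datagram (the IPv6 minimum MTU).
print(fragment_count(1280))
```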
Abstract:
Environmental decay in porous masonry materials, such as brick and mortar, is a widespread problem concerning both new and historic masonry structures. The decay mechanisms are quite complex, depending on several interconnected parameters and on the interaction with the specific micro-climate. Materials undergo aesthetic and substantial changes in character; yet, while many studies have been carried out, the mechanical aspect remains largely understudied even though it bears true importance from the structural viewpoint. A quantitative assessment of masonry material degradation and of how it affects the load-bearing capacity of masonry structures appears to be missing. The research work carried out, limiting the attention to brick masonry, addresses this issue through an experimental laboratory approach via different integrated testing procedures, both non-destructive and mechanical, together with monitoring methods. Attention was focused on the transport of moisture and salts and on the damaging effects caused by the crystallization of two different salts, sodium chloride and sodium sulphate. Many series of masonry specimens, very different in size and purpose, were used to track the damage process from its beginning and to monitor its evolution over a number of years. At the same time, suitable testing techniques (non-destructive, mini-invasive, analytical and monitoring) were validated for these purposes. The specimens were exposed to different aggressive agents (in terms of type of salt, brine concentration, artificial vs. open-air natural ageing, …), tested by different means (qualitative vs. quantitative, non-destructive vs. mechanical testing, punctual vs. wide areas, …), and had different sizes (1-, 2-, 3-header-thick walls, full-scale walls vs. small-size specimens, brick columns and triplets vs. small walls, masonry specimens vs. single units of brick and mortar prisms, …). Different advanced testing methods and novel monitoring techniques were applied in an integrated holistic approach for a quantitative assessment of the masonry health state.
Abstract:
A vast amount of temporal information is provided on the Web. Even though many facts expressed in documents are time-related, the temporal properties of Web presentations have not received much attention. In database research, temporal databases have become a mainstream topic in recent years. In Web documents, temporal data may exist as metadata in the header and as user-directed data in the body of a document. Whereas temporal data can easily be identified in the semi-structured metadata, it is more difficult to determine temporal data and its role in the body. We propose procedures for maintaining the temporal integrity of Web pages and outline different approaches to applying bitemporal data concepts to Web documents. In particular, we consider desirable functionalities of Web repositories and other Web-related tools that may support Webmasters in managing the temporal data of their Web documents. Some properties of a prototype environment are described.
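To make the bitemporal idea concrete, here is a minimal sketch of a bitemporal record for a Web page, distinguishing valid time (when the stated facts hold in the real world) from transaction time (when the version was recorded). The class and field names are illustrative assumptions, not the prototype environment mentioned in the abstract.

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class BitemporalPageVersion:
    """One version of a Web page tracked along two time dimensions."""
    url: str
    content_hash: str
    valid_from: date      # when the stated facts start to hold in the real world
    valid_to: date        # when they stop holding (or a far-future sentinel)
    recorded_from: date   # when this version was published/stored
    recorded_to: date     # when it was superseded in the repository

# Example: a page announcing an event, published before the event window.
version = BitemporalPageVersion(
    url="https://example.org/conference",
    content_hash="ab12cd34",
    valid_from=date(2024, 6, 1),
    valid_to=date(2024, 6, 5),
    recorded_from=date(2024, 1, 15),
    recorded_to=date(9999, 12, 31),  # sentinel: current version, not yet superseded
)
```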
Abstract:
Paleomagnetic measurements of sediment samples provide the magnetostratigraphy at Deep Sea Drilling Sites 582, 583, and 584 in the Nankai Trough and the Japan Trench. Drastic changes in the rate of sediment accumulation are documented by the magnetostratigraphic and biostratigraphic correlations. The changes in the accumulation rate correspond to the supply of sediments and variations in the accretionary process, which are directly related to the tectonic cycles in the geologic evolution of the Japanese island arc. Faults and folds within the drilled sedimentary sequences are oriented by paleomagnetic declination. Their directions and stress patterns are related to the relative plate motion along the trough and trench. The original remanent magnetization of the sediment was modified and remagnetized in the tectonic process of accretion by physical deformation, faulting, and intrusion of dewatering veinlets.
Abstract:
A detailed study has been made of the physical properties of core samples from Deep Sea Drilling Project Hole 395A. The properties include: density, porosity, compressional and shear wave velocity, thermal conductivity, thermal diffusivity, and electrical resistivity. Of particular importance are the relations among the parameters. Most of the variations in the basalt properties follow the porosity, with a smaller inferred dependence on pore structure, original mineralogy differences, and alteration. The sample measurements give very similar results to (and extend previous data from) Mid-Atlantic Ridge drillholes; the sample data from this site and previous data are used to estimate relations between porosity and other large-scale physical properties of the upper oceanic crust applicable to this area. These relations are important for the analysis and interpretation of downhole logging measurements and marine geophysical data.
Population genetic and dispersal modeling data for Bathymodiolus mussels from the Mid-Atlantic Ridge
Abstract:
The zip folder comprises a text file and a gzipped tar archive. 1) The text file contains individual genotype data for 90 SNPs, 9 microsatellites and the mitochondrial ND4 gene that were determined in deep-sea hydrothermal vent mussels from the Mid-Atlantic Ridge (genus Bathymodiolus). Mussel specimens are grouped according to the population (pop)/location from which they were sampled (first column). The remaining columns contain the respective allele/haplotype codes for the different genetic loci (names in the header line). The data file is in CONVERT format and can be directly transformed into different input files for population genetic statistics. 2) The tar archive contains NetCDF files with larval dispersal probabilities for simulated annual larval releases between 1998 and 2007. For each simulated vent location (Menez Gwen, Lucky Strike, Rainbow, Vent 1-10) two NetCDF files are given: one for an assumed pelagic larval duration of 1 year and the other for an assumed pelagic larval duration of 6 months (6m).