981 results for Log cabins.
Abstract:
Different seismic hazard components pertaining to Bangalore city, namely soil overburden thickness, effective shear-wave velocity, factor of safety against liquefaction potential, peak ground acceleration at the seismic bedrock, site response in terms of amplification factor, and predominant frequency, have been individually evaluated. The overburden thickness distribution, predominantly in the range of 5-10 m across the city, has been estimated through a sub-surface model built from geotechnical bore-log data. The effective shear-wave velocity distribution, established through a Multi-channel Analysis of Surface Waves (MASW) survey and subsequent dispersion analysis of the data, exhibits site class D (180-360 m/s), site class C (360-760 m/s), and site class B (760-1500 m/s) in compliance with the National Earthquake Hazards Reduction Program (NEHRP) nomenclature. The peak ground acceleration has been estimated through a deterministic approach, based on a maximum credible earthquake of M_W = 5.1 assumed to nucleate from the closest active seismic source (the Mandya-Channapatna-Bangalore Lineament). The 1-D site response factor, computed at each borehole through geotechnical analysis across the study region, ranges from an amplification of around one to as high as four. Correspondingly, the predominant frequency estimated from the Fourier spectrum lies mostly in the range of 3.5-5.0 Hz. The soil liquefaction hazard has been assessed in terms of the factor of safety against liquefaction potential, using standard penetration test data and the underlying soil properties, which indicates that 90% of the study region is non-liquefiable. The spatial distributions of the different hazard entities are placed on a GIS platform and subsequently integrated through the analytic hierarchy process. The resulting deterministic hazard map shows high hazard coverage in the western areas.
The microzonation thus achieved is envisaged as a first-cut assessment of the site-specific hazard, laying out a framework for higher-order seismic microzonation as well as a useful decision-support tool in overall land-use planning and hazard management. (C) 2010 Elsevier Ltd. All rights reserved.
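The factor of safety against liquefaction mentioned above is conventionally the ratio of cyclic resistance to cyclic stress. The abstract does not give the exact expressions used in the study; the sketch below assumes the standard Seed-Idriss simplified procedure, and all function names and parameter values are illustrative:

```python
# Sketch of the factor-of-safety computation (Seed-Idriss simplified
# procedure); names and numbers are illustrative, not the paper's.

def cyclic_stress_ratio(a_max_over_g, sigma_v, sigma_v_eff, r_d):
    """CSR = 0.65 * (a_max/g) * (sigma_v / sigma_v') * r_d,
    where sigma_v / sigma_v' are total/effective overburden stress
    and r_d is the depth-dependent stress reduction factor."""
    return 0.65 * a_max_over_g * (sigma_v / sigma_v_eff) * r_d

def factor_of_safety(crr, csr):
    """FS = CRR / CSR; FS > 1 is classified as non-liquefiable."""
    return crr / csr
```

The cyclic resistance ratio (CRR) would come from the SPT blow counts; here it is simply taken as an input.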
Abstract:
An atmospheric radio noise burst represents the radiation received from one complete lightning flash at the frequency to which a receiver is tuned and within the receiver bandwidth. At tropical latitudes, the principal source of interference in the frequency range from 0.1 to 10 MHz is the burst form of atmospheric radio noise. The structure of a burst shows several approximately rectangular pulses of random amplitude, duration, and frequency of recurrence. The influence of the noise on data communication can only be examined when the number of pulses of the noise burst crossing a certain amplitude threshold per unit time is known. A pulse rate counter designed for this purpose has been used at Bangalore (12°58′N, 77°35′E) to investigate the pulse characteristics of noise bursts at 3 MHz with a receiver bandwidth of 3.3 kHz at 6 dB. The results show that the number of pulses lying in the amplitude range between the peak and quasi-peak values of the noise bursts, and the burst duration corresponding to these pulses, follow log-normal distributions. The pulse rates deduced therefrom show a certain correlation between the number of pulses and the duration of the noise burst. The results are discussed with a view to furnishing the information necessary for data communication.
Abstract:
Magnetic susceptibility measurements were performed on freshly fallen Almahata Sitta meteorites. Most recovered samples are polymict ureilites. Those found in the first four months after the impact, before the meteorites were exposed to rain, have a magnetic susceptibility in the narrow range of 4.92 ± 0.08 log 10⁻⁹ Am²/kg, close to the range of other ureilite falls, 4.95 ± 0.14 log 10⁻⁹ Am²/kg, reported by Rochette et al. (2009). The Almahata Sitta samples collected one year after the fall have similar values (4.90 ± 0.06 log 10⁻⁹ Am²/kg), revealing that the effect of one year of terrestrial weathering was not yet severe. However, our reported values are higher than those derived from polymict (brecciated) ureilites, 4.38 ± 0.47 log 10⁻⁹ Am²/kg (Rochette et al. 2009), a set containing both falls and finds, confirming that the latter are significantly weathered. Additionally, other fresh-looking meteorites of non-ureilitic composition were collected in the Almahata Sitta strewn field. Magnetic susceptibility measurement proved to be a convenient non-destructive method for identifying non-ureilitic meteorites among those collected in the strewn field, even among fully crusted ones. Three such meteorites, nos. 16, 25, and 41, were analyzed and their compositions determined as EH6, H5, and EL6, respectively (Zolensky et al., 2010). A high scatter of magnetic susceptibility values among small (< 5 g) samples revealed high inhomogeneity within the 2008 TC3 material at scales below 1-2 cm.
Abstract:
Introduction This case study is based on experiences with the Electronic Journal of Information Technology in Construction (ITcon), founded in 1995. Development This journal is an example of a particular category of open access journals, which use neither author charges nor subscriptions to finance their operations, but rely largely on unpaid voluntary work in the spirit of the open source movement. The journal has, after some initial struggle, survived its first decade and is now established as one of half a dozen peer-reviewed journals in its field. Operations The journal publishes articles as they become ready, but creates virtual issues through alerting messages to “subscribers”. It has also started to publish special issues, since this helps in attracting submissions and in sharing the workload of review management. From the start the journal adopted a rather traditional layout for its articles. After the first few years the HTML version was dropped, and papers are now published only in PDF format. Performance The journal has recently been benchmarked against competing journals in its field. Its acceptance rate of 53% is slightly higher, and its average turnaround time of seven months almost a year faster, than those of the journals in the sample for which data could be obtained. The server log files for the past three years have also been studied. Conclusions Our overall experience demonstrates that it is possible to publish this type of OA journal, with a yearly publishing volume equal to that of a quarterly journal and involving the processing of some fifty submissions a year, using a networked volunteer-based organization.
Abstract:
This article reports on a cross-sectional case study of a large construction project in which electronic document management (EDM) was used. Attitudes towards EDM from the perspective of individual end users were investigated. Responses from a survey were combined with data from system usage log files to obtain an overview of the attitudes prevalent in different user segments of the total population of 334 users. The survey was followed by semi-structured interviews with representative users. A strong majority of users from all segments of the project group considered EDM a valuable aid in their work processes, despite certain functional limitations of the system used and the complexity of the information mass. Based on the study, a model describing the key factors affecting end-user EDM adoption is proposed. The model draws on insights from earlier studies of EDM-enabled projects, on theoretical frameworks on technology acceptance and information systems success, and on the insights gained from the case study.
Abstract:
Triggered by the very quick proliferation of Internet connectivity, electronic document management (EDM) systems are now rapidly being adopted for managing the documentation produced and exchanged in construction projects. Nevertheless, there are still substantial barriers to the efficient use of such systems, mainly of a psychological nature and related to insufficient training. This paper presents the results of empirical studies carried out during 2002 concerning the usage of EDM systems in the Finnish construction industry. The studies employed three different methods in order to provide a multifaceted view of the problem area, at both the industry and the individual project level. In order to provide an accurate measurement of overall usage volume in the industry as a whole, telephone interviews with key personnel from 100 randomly chosen construction projects were conducted. The interviews showed that while around one third of big projects have already adopted EDM, very few small projects have adopted the technology. The barriers to introduction were investigated through interviews with representatives of half a dozen providers of systems and ASP services. These interviews shed considerable light on the dynamics of the market for this type of service and illustrated the diversity of business strategies adopted by vendors. In the final study, log files from a project which had used an EDM system were analysed in order to determine usage patterns. The results illustrated that use is still incomplete in coverage and that only some of the individuals involved in the project used the system efficiently, either as information producers or as consumers. The study also provided feedback on the usefulness of the log files.
Abstract:
We develop an alternate characterization of the statistical distribution of the inter-cell interference power observed in the uplink of CDMA systems. We show that the lognormal distribution better matches the cumulative distribution and complementary cumulative distribution functions of the uplink interference than the conventionally assumed Gaussian distribution and variants based on it. This is in spite of the fact that many users together contribute to uplink interference, with the number of users and their locations both being random. Our observations hold even in the presence of power control and cell selection, which have hitherto been used to justify the Gaussian distribution approximation. The parameters of the lognormal are obtained by matching moments, for which detailed analytical expressions that incorporate wireless propagation, cellular layout, power control, and cell selection parameters are developed. The moment-matched lognormal model, while not perfect, is an order of magnitude better in modeling the interference power distribution.
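The paper's detailed analytical moment expressions are specific to the cellular model it develops; as a generic illustration of the moment-matching step itself, the sketch below (function name assumed) converts a measured interference mean and variance into the parameters of the matched lognormal:

```python
import math

def lognormal_from_moments(mean, var):
    """Moment-match a lognormal to an interference-power distribution:
    given the mean and variance of the power, return (mu, sigma) of the
    underlying normal so that exp(N(mu, sigma^2)) has those moments."""
    sigma2 = math.log(1.0 + var / mean**2)
    mu = math.log(mean) - 0.5 * sigma2
    return mu, math.sqrt(sigma2)
```

By construction the round trip is exact: the matched lognormal reproduces the given mean and variance.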
Abstract:
The objective of this paper is to investigate and model the characteristics of the prevailing volatility smiles and surfaces on the DAX- and ESX-index options markets. Continuing the line of Implied Volatility Function research, the Standardized Log-Moneyness model is introduced and fitted to historical data. The model replaces the constant volatility parameter of the Black & Scholes pricing model with a matrix of volatilities indexed by moneyness and maturity, and is tested out-of-sample. Regarding the dynamics, the results support the hypotheses put forward in this study, implying that the smile increases in magnitude as maturity and ATM volatility decrease, and that there is a negative/positive correlation between a change in the underlying asset/time to maturity and implied ATM volatility. Further, the Standardized Log-Moneyness model shows an improvement in pricing accuracy over previous Implied Volatility Function models, though the parameters of the models must be re-estimated continuously for the models to fully capture the changing dynamics of the volatility smiles.
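The volatility-matrix idea can be shown schematically: replace the single Black & Scholes volatility with a lookup in a (moneyness × maturity) matrix. The actual Standardized Log-Moneyness parameterization is not reproduced here; the grid, the nearest-bucket rule, and all names below are assumptions:

```python
import math
from bisect import bisect_right

def bs_call(S, K, T, r, sigma):
    """Plain Black & Scholes European call price."""
    sqT = math.sqrt(T)
    d1 = (math.log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * sqT)
    d2 = d1 - sigma * sqT
    Phi = lambda x: 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))
    return S * Phi(d1) - K * math.exp(-r * T) * Phi(d2)

def smile_vol(vol_matrix, moneyness_grid, maturity_grid, moneyness, T):
    """Nearest-upper-bucket lookup in a (moneyness x maturity) volatility
    matrix -- a crude stand-in for a fitted smile surface."""
    i = min(bisect_right(moneyness_grid, moneyness), len(moneyness_grid) - 1)
    j = min(bisect_right(maturity_grid, T), len(maturity_grid) - 1)
    return vol_matrix[i][j]
```

Pricing then chains the two calls: `bs_call(S, K, T, r, smile_vol(...))`, with the matrix re-estimated as the smile moves.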
Abstract:
A linear-time approximate maximum likelihood decoding algorithm on tail-biting trellises is presented that requires exactly two rounds on the trellis. It is an adaptation of an algorithm proposed earlier, with the advantage that it reduces the time complexity from O(m log m) to O(m), where m is the number of nodes in the tail-biting trellis. A necessary condition for the output of the algorithm to differ from that of the ideal ML decoder is deduced, and simulation results on an AWGN channel, using tail-biting trellises for two rate-1/2 convolutional codes with memory 4 and 6 respectively, are reported.
Abstract:
We propose two texture-based approaches, one involving Gabor filters and the other employing log-polar wavelets, for separating text from non-text elements in a document image. Both proposed algorithms compute local energy at a set of information-rich points, which are marked by the Harris corner detector. The advantage of this approach is that local energy is calculated only at the selected points and not throughout the image, saving considerable computation time. The algorithms have been tested on a large set of scanned text pages, and the results are better than those of existing algorithms. Of the two proposed schemes, the Gabor filter based scheme marginally outperforms the wavelet based scheme.
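The pipeline described (corners first, then Gabor local energy only at those points) can be sketched in pure NumPy. The kernel parameters, window sizes, and helper names are assumptions for illustration, not the paper's settings:

```python
import numpy as np

def harris_response(img, k=0.04):
    """Harris corner response from image gradients (pure-NumPy sketch)."""
    Iy, Ix = np.gradient(img.astype(float))
    Ixx, Iyy, Ixy = Ix * Ix, Iy * Iy, Ix * Iy
    def box(a):  # 3x3 box smoothing of the structure tensor
        p = np.pad(a, 1, mode="edge")
        return sum(p[i:i + a.shape[0], j:j + a.shape[1]]
                   for i in range(3) for j in range(3)) / 9.0
    Sxx, Syy, Sxy = box(Ixx), box(Iyy), box(Ixy)
    return Sxx * Syy - Sxy**2 - k * (Sxx + Syy)**2

def gabor_kernel(size=9, theta=0.0, lam=4.0, sigma=2.0):
    """Real Gabor kernel: oriented sinusoid under a Gaussian envelope."""
    r = np.arange(size) - size // 2
    x, y = np.meshgrid(r, r)
    xr = x * np.cos(theta) + y * np.sin(theta)
    return np.exp(-(x**2 + y**2) / (2 * sigma**2)) * np.cos(2 * np.pi * xr / lam)

def local_energy(img, points, kernels):
    """Sum of squared Gabor responses, computed only at the given points."""
    h = kernels[0].shape[0] // 2
    pad = np.pad(img.astype(float), h, mode="edge")
    return [sum(float((pad[i:i + 2 * h + 1, j:j + 2 * h + 1] * k).sum())**2
                for k in kernels)
            for (i, j) in points]
```

In use, one would threshold `harris_response` to pick the information-rich points and then classify each point as text or non-text from its `local_energy` over a bank of oriented kernels.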
Abstract:
Tanner graph representation of linear block codes is widely used by iterative decoding algorithms for recovering data transmitted across a noisy communication channel from the errors and erasures introduced by the channel. The stopping distance of a Tanner graph T for a binary linear block code C determines the number of erasures correctable by iterative decoding on T when data is transmitted across a binary erasure channel using the code C. We show that the problem of finding the stopping distance of a Tanner graph is hard to approximate within any positive constant approximation ratio in polynomial time unless P = NP. It is also shown, as a consequence, that there can be no approximation algorithm for the problem achieving an approximation ratio of 2^((log n)^(1−ε)) for any ε > 0 unless NP ⊆ DTIME(n^poly(log n)).
Abstract:
Extraction of text areas from document images with complex content and layout is a challenging task. A few texture-based techniques have already been proposed for extracting such text blocks, but most of them are greedy for computation time and hence far from realizable in real time. In this work, we propose a modification to two of the existing texture-based techniques to reduce the computation, accomplished with Harris corner detectors. The efficiency of the two texture-based algorithms, one based on Gabor filters and the other on a log-polar wavelet signature, is compared. Gabor-feature-based texture classification performed on a smaller set of Harris-corner-detected points is observed to deliver both accuracy and efficiency.
Abstract:
Template matching is concerned with measuring the similarity between patterns of two objects. This paper proposes a memory-based reasoning approach for pattern recognition of binary images with a large template set. Memory-based reasoning intrinsically requires a large database, and some binary image recognition problems inherently need large template sets; the recognition of Chinese characters, for instance, needs thousands of templates. The proposed algorithm is based on the Connection Machine, the most massively parallel machine to date, and uses a multiresolution method to search for the matching template. The approach uses the pyramid data structure for the multiresolution representation of both the templates and the input image pattern. For a given binary image it scans the template pyramid searching for the match. A binary image of N × N pixels can be matched in O(log N) time by our algorithm, independent of the number of templates. Implementation of the proposed scheme is described in detail.
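The coarse-to-fine pyramid search can be sketched in a serial form (the parallel Connection Machine mapping is beyond a few lines). The pruning threshold, the power-of-two size assumption, and the helper names below are illustrative, not the paper's:

```python
import numpy as np

def pyramid(img, levels):
    """Multiresolution pyramid: each level halves resolution by 2x2
    averaging (assumes power-of-two image sizes)."""
    levs = [img.astype(float)]
    for _ in range(levels - 1):
        a = levs[-1]
        levs.append(0.25 * (a[0::2, 0::2] + a[1::2, 0::2]
                            + a[0::2, 1::2] + a[1::2, 1::2]))
    return levs

def match_template(image, templates, levels=3, tol=0.1):
    """Coarse-to-fine search: discard templates whose coarse levels
    already disagree with the image, refining survivors level by level.
    Returns the index of a surviving template, or None."""
    img_pyr = pyramid(image, levels)
    tpl_pyrs = [pyramid(t, levels) for t in templates]
    candidates = list(range(len(templates)))
    for lev in range(levels - 1, -1, -1):  # coarsest level first
        candidates = [c for c in candidates
                      if np.abs(img_pyr[lev] - tpl_pyrs[c][lev]).mean() <= tol]
        if len(candidates) <= 1:
            break
    return candidates[0] if candidates else None
```

The coarse levels are tiny, so most templates are rejected after comparing only a handful of pixels, which is the intuition behind the sub-linear matching cost.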
Abstract:
The max-coloring problem is to compute a legal coloring of the vertices of a graph G = (V, E) with a non-negative weight function w on V such that ∑_{i=1}^{k} max_{v ∈ C_i} w(v) is minimized, where C_1, ..., C_k are the color classes. Max-coloring general graphs is as hard as the classical vertex coloring problem, a special case in which vertices have unit weight. In fact, in some cases it can be even harder: for example, no polynomial-time algorithm is known for max-coloring trees. In this paper we consider the problem of max-coloring paths and its generalization, max-coloring a broad class of trees, and show it can be solved in time O(|V| + time for sorting the vertex weights). When vertex weights belong to ℝ, we show a matching lower bound of Ω(|V| log |V|) in the algebraic computation tree model.
Abstract:
Candida species are an important cause of nosocomial bloodstream infections in hospitalized patients worldwide, with associated high mortality, excess length of stay, and costs. A main contributor to candidemia is profound immunosuppression due to a serious underlying condition or intensive treatment, leading to an increasing number of susceptible patients. The rank order of causative Candida species varies over time and across geographic locations. The aim of this study was to obtain information on the epidemiology of candidemia in Finland and to identify trends in incidence, causative species, and patient populations at risk. In order to reveal possible outbreaks and assess the value of one molecular typing method, restriction enzyme analysis (REA), in epidemiological study, we analyzed C. albicans bloodstream isolates from the Uusimaa region in Southern Finland over eight years. Data from the National Infectious Disease Register were used to assess the incidence and epidemiological features of candidemia cases. In Helsinki University Central Hospital (HUCH), all patients with a blood culture yielding any Candida spp. were identified from laboratory logbooks and from the Finnish Hospital Infection Program. All patients with a stored blood culture isolate of C. albicans were identified through microbiology laboratory logbooks, and the stored isolates were genotyped with REA at the National Institute for Health and Welfare (formerly KTL). The incidence of candidemia in Finland is globally relatively low, but increased between the 1990s and the 2000s. The incidence was highest in males >65 years of age, while incidence rates for patients <1-15 years were lower during the 2000s than during the 1990s. In HUCH the incidence of candidemia remained low and constant during our 18 years of observation, but a significant shift in the patient populations at risk was observed, associated with patients treated in intensive care units, such as premature neonates and surgical patients.
The predominant causative species in Finland and in HUCH is C. albicans, but the proportion of C. glabrata has increased considerably. The crude one-month case fatality remained high, at 28-33%. REA differentiated efficiently between C. albicans blood culture isolates, and no clusters were observed in the hospitals involved, despite abundant transfer of patients among them. Candida spp. are an important cause of nosocomial bloodstream infections in Finland, and continued surveillance is necessary to determine overall trends and patient groups at risk, and to reduce the impact of these infections in the future. Molecular methods provide an efficient tool for the investigation of suspected outbreaks and should remain available in Finland in the future.