240 results for Beam Search Method

Relevance: 40.00%

Abstract:

When used as floor joists, the new mono-symmetric LiteSteel beam (LSB) sections require web openings to provide access for inspections and various services. The LSBs consist of two rectangular hollow flanges connected by a slender web, and are subject to lateral distortional buckling effects in the intermediate span range. The member capacity design formulae developed for them to date are based on their elastic lateral buckling moments, and only limited research has been undertaken to predict the elastic lateral buckling moments of LSBs with web openings. This paper addresses this research gap by reporting the development of web opening modelling techniques based on an equivalent reduced web thickness concept and a numerical method for predicting the elastic buckling moments of LSBs with circular web openings. The proposed numerical method was based on a formulation of the total potential energy of LSBs with circular web openings. The accuracy of the proposed method, used with the aforementioned modelling techniques, was verified by comparing its results with those of finite strip and finite element analyses of various LSBs.
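The abstract does not reproduce the paper's equations, so the following is only a rough illustration of what an "equivalent reduced web thickness" could look like: it assumes, purely for the sketch, that the area removed by the circular holes is smeared uniformly over the web elevation so that a solid-web model keeps the same web volume. The function name, the proportional-area rule and the example dimensions are assumptions, not the paper's actual formulation.

# Hypothetical sketch: equivalent reduced web thickness for an LSB with
# circular web openings, assuming (for illustration only) that the removed
# hole area is smeared uniformly along the span.
import math

def equivalent_web_thickness(t_web, web_depth, span, hole_diameters):
    """Return a reduced web thickness giving the same web volume."""
    gross_area = web_depth * span                      # elevation area of the solid web
    hole_area = sum(math.pi * d**2 / 4 for d in hole_diameters)
    return t_web * (1.0 - hole_area / gross_area)

# Example: 2.0 mm web, 250 mm deep, 3000 mm span, three 127 mm holes (invented values)
print(equivalent_web_thickness(2.0, 250.0, 3000.0, [127.0] * 3))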

Relevance: 40.00%

Abstract:

The LiteSteel Beam (LSB) is an innovative cold-formed steel hollow flange section. When used as floor joists, the LSB sections require holes in the web to provide access for various services. In this study, a detailed investigation was undertaken into the elastic lateral distortional buckling behaviour of LSBs with circular web openings subjected to a uniform moment, using finite element analysis. Validated ideal finite element models were first used to study the effect of web holes on the elastic lateral distortional buckling behaviour of LSBs. An equivalent web thickness method was then proposed, using four different equations for the elastic buckling analyses of LSBs with web holes. It was found that two of these equations could be successfully used with approximate numerical models, based on solid web elements with an equivalent reduced thickness, to predict the elastic lateral distortional buckling moments.

Relevance: 40.00%

Abstract:

Entity-oriented retrieval aims to return a list of relevant entities rather than documents, providing exact answers to user queries. The nature of entity-oriented retrieval requires identifying the semantic intent of user queries, i.e., understanding the semantic role of query terms and determining the semantic categories that indicate the class of target entities. Existing methods are not able to exploit this semantic intent by capturing the semantic relationship between terms in a query and in a document that contains entity-related information. To improve the understanding of the semantic intent of user queries, we propose a concept-based retrieval method that not only automatically identifies the semantic intent of user queries, i.e., the Intent Type and Intent Modifier, but also introduces concepts represented by Wikipedia articles into user queries. We evaluate the proposed method on entity profile documents annotated with concepts from the Wikipedia category and list structures. Empirical analysis reveals that the proposed method outperforms several state-of-the-art approaches.
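The abstract does not describe how Intent Type, Intent Modifier and Wikipedia concepts are actually extracted, so the sketch below is only a toy illustration of the general idea: split a query into an assumed entity-class term and modifier terms, and attach Wikipedia concepts by title lookup. The lookup table, the head-term heuristic and the function name are all invented for illustration.

# Hypothetical sketch of concept-based query annotation: split a query into an
# assumed Intent Type (entity-class term) and Intent Modifier (remaining terms),
# then attach Wikipedia concepts by exact title lookup. The concept index and
# the first-term heuristic are illustrative assumptions, not the paper's method.

WIKI_TITLES = {"airports": "Airport", "germany": "Germany"}  # toy concept index

def annotate_query(query):
    terms = query.lower().split()
    intent_type = terms[0]                # crude heuristic: first term names the entity class
    intent_modifier = terms[1:]
    concepts = [WIKI_TITLES[t] for t in terms if t in WIKI_TITLES]
    return {"type": intent_type, "modifier": intent_modifier, "concepts": concepts}

print(annotate_query("airports in Germany"))
# {'type': 'airports', 'modifier': ['in', 'germany'], 'concepts': ['Airport', 'Germany']}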

Relevance: 30.00%

Abstract:

Search engines have forever changed the way people access and discover knowledge, allowing information about almost any subject to be quickly and easily retrieved within seconds. As increasingly more material becomes available electronically, the influence of search engines on our lives will continue to grow. This presents the problem of how to find what information is contained in each search engine, what bias a search engine may have, and how to select the best search engine for a particular information need. This research introduces a new method, search engine content analysis, to solve this problem. Search engine content analysis is a new development of the traditional information retrieval field of collection selection, which deals with general information repositories. Current research in collection selection relies on full access to the collections or on estimates of their size, and collection descriptions are often represented as term occurrence statistics. An automatic ontology learning method is developed for search engine content analysis, which trains an ontology with world knowledge of hundreds of different subjects in a multilevel taxonomy. This ontology is then mined to find important classification rules, and these rules are used to perform an extensive analysis of the content of the largest general-purpose Internet search engines in use today. Instead of representing collections as sets of terms, as commonly occurs in collection selection, they are represented as sets of subjects, leading to a more robust representation of information and reducing the effects of synonymy. The ontology-based method was compared with ReDDE (the Relevant Document Distribution Estimation method for resource selection), the current state-of-the-art collection selection method, which relies on collection size estimation; the comparison used the standard R-value metric, with encouraging results. The method was also used to analyse the content of the most popular search engines in use today, including Google and Yahoo, as well as several specialist search engines such as PubMed and the U.S. Department of Agriculture search engine. In conclusion, this research shows that the ontology-based method mitigates the need for collection size estimation.
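To make the subject-based collection representation concrete, here is a minimal sketch under stated assumptions: each engine is described by a distribution over ontology subjects (assumed to have been estimated beforehand by classifying sampled results against the learned ontology), and engines are ranked for a query by the subject mass they hold on the query's subjects. The engine names, subject labels and scores are invented for illustration; this is not the thesis's actual scoring function.

# Hypothetical sketch of subject-based collection selection: each search engine
# is described by a distribution over ontology subjects, and engines are ranked
# for a query by the probability mass they hold on the query's subjects.

ENGINE_SUBJECTS = {
    "engine_a": {"medicine": 0.6, "biology": 0.3, "sport": 0.1},
    "engine_b": {"agriculture": 0.5, "biology": 0.2, "news": 0.3},
}

def rank_engines(query_subjects, engine_subjects=ENGINE_SUBJECTS):
    scores = {
        engine: sum(dist.get(s, 0.0) for s in query_subjects)
        for engine, dist in engine_subjects.items()
    }
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

print(rank_engines({"medicine", "biology"}))
# [('engine_a', 0.9), ('engine_b', 0.2)]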

Relevance: 30.00%

Abstract:

In this paper, we use time series analysis to evaluate predictive scenarios using search engine transactional logs. Our goal is to develop models for the analysis of searchers' behaviors over time and to investigate whether time series analysis is a valid method for predicting relationships between searcher actions. Time series analysis is a method often used to understand the underlying characteristics of temporal data in order to make forecasts. In this study, we used a Web search engine transactional log and time series analysis to investigate users' actions. We conducted our analysis in two phases. In the initial phase, we employed a basic analysis and found that 10% of searchers clicked on sponsored links; however, from 22:00 to 24:00, searchers almost exclusively clicked on the organic links, with almost no clicks on sponsored links. In the second and more extensive phase, we used a one-step prediction time series analysis method along with a transfer function method. The period rarely affects navigational and transactional queries, while rates for transactional queries vary during different periods. Our results show that the average length of a searcher session is approximately 2.9 interactions and that this average is consistent across time periods. Most importantly, our findings show that searchers who submit the shortest queries (i.e., in number of terms) click on the highest ranked results. We discuss implications, including predictive value, and future research.
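The abstract does not give the model specification behind the one-step prediction, so the following is only one plausible instantiation: a one-step-ahead forecast of an hourly click-count series using a simple ARIMA model from statsmodels. The model order and the synthetic data are assumptions for illustration.

# Hypothetical sketch of one-step-ahead prediction on an hourly click-count
# series, using a simple ARIMA model. The model order and the synthetic data
# are illustrative; the paper's actual specification is not reproduced here.
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(0)
hourly_clicks = 100 + 10 * np.sin(np.arange(72) * 2 * np.pi / 24) + rng.normal(0, 3, 72)

model = ARIMA(hourly_clicks, order=(1, 0, 1))   # AR(1) + MA(1) around a constant mean
fitted = model.fit()
print(fitted.forecast(steps=1))                 # prediction for the next hour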

Relevance: 30.00%

Abstract:

The new cold-formed LiteSteel beam (LSB) sections have found increasing popularity in residential, industrial and commercial buildings due to their light weight and cost-effectiveness. They combine the beneficial characteristics of torsionally rigid rectangular flanges with economical fabrication processes. Currently there is significant interest in using LSB sections as flexural members in floor joist systems. When used as floor joists, the LSB sections require holes in the web to provide access for inspection and various services, but there are no design methods that provide accurate predictions of the moment capacities of LSBs with web holes. In this study, the buckling and ultimate strength behaviour of LSB flexural members with web holes was investigated using a detailed parametric study based on finite element analyses, with the aim of developing appropriate design rules and recommendations for the safe design of LSB floor joists. Moment capacity curves were obtained from finite element analyses that included all the significant behavioural effects affecting ultimate member capacity. The parametric study produced the required moment capacity curves of LSB sections for a range of web hole configurations and spans. A suitable design method for predicting the ultimate moment capacity of LSBs with web holes was then developed. This paper presents the details of this investigation and its results.

Relevance: 30.00%

Abstract:

The modal strain energy method, which depends on the vibration characteristics of the structure, has been reasonably successful in identifying and localising damage in structures. However, existing strain energy methods require the first few modes to be measured to provide meaningful damage detection. Use of individual modes with existing strain energy methods may produce false alarms or may fail to detect damage at or near the nodal points. This paper proposes a new modal strain energy based damage index which can detect and localise damage using any one of the measured modes, and illustrates its application to beam structures. It becomes evident that the proposed strain energy based damage index also has potential for damage quantification.
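For readers unfamiliar with strain energy based damage indices, the sketch below shows a generic curvature-based index of the classic Stubbs type (element strain energy proportional to the integral of squared mode-shape curvature), computed from a single mode. It is an illustration of the family of methods the abstract refers to, not the specific index proposed in the paper; the mode shapes and the stiffness-loss surrogate are synthetic.

# Hypothetical sketch of a curvature-based modal strain energy damage index for
# a beam, using a single mode shape (Stubbs-type formulation, for illustration).
import numpy as np

def strain_energy_damage_index(phi_undamaged, phi_damaged, dx):
    # second derivative (curvature) of the mode shape
    k_u = np.gradient(np.gradient(phi_undamaged, dx), dx)
    k_d = np.gradient(np.gradient(phi_damaged, dx), dx)
    e_u, e_d = k_u**2, k_d**2                 # pointwise "strain energy" contributions
    f_u = (e_u + e_u.sum()) / e_u.sum()       # fractional strain energy, undamaged
    f_d = (e_d + e_d.sum()) / e_d.sum()       # fractional strain energy, damaged
    return f_d / f_u                          # values well above 1 suggest damage

x = np.linspace(0, np.pi, 101)
phi_u = np.sin(x)                             # first mode of a simply supported beam
phi_d = phi_u.copy()
phi_d[45:55] *= 1.03                          # crude local stiffness-loss surrogate
index = strain_energy_damage_index(phi_u, phi_d, x[1] - x[0])
print(index.argmax())                         # peaks near the perturbed region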

Relevance: 30.00%

Abstract:

Grassland management affects soil organic carbon (SOC) storage and can be used to mitigate greenhouse gas emissions. However, for a country to assess emission reductions due to grassland management, there must be an inventory method for estimating the change in SOC storage. The Intergovernmental Panel on Climate Change (IPCC) has developed a simple carbon accounting approach for this purpose, and here we derive new grassland management factors that represent the effect of changing management on carbon storage for this method. Our literature search identified 49 studies dealing with effects of management practices that either degraded or improved conditions relative to nominally managed grasslands. On average, degradation reduced SOC storage to 95% +/- 0.06 and 97% +/- 0.05 of the carbon stored under nominal conditions in temperate and tropical regions, respectively. In contrast, improving grasslands with a single management activity enhanced SOC storage by 14% +/- 0.06 and 17% +/- 0.05 in temperate and tropical regions, respectively, and with additional improvement(s), storage increased by another 11% +/- 0.04. We applied the newly derived factor coefficients to analyze the C sequestration potential of managed grasslands in the U.S., and found that over a 20-year period changing management could sequester from 5 to 142 Tg C yr^-1, or 0.1 to 0.9 Mg C ha^-1 yr^-1, depending on the level of change. This analysis provides revised factor coefficients for the IPCC method that can be used to estimate impacts of management; it also provides a methodological framework for countries to derive factor coefficients specific to conditions in their region.
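As a worked-number sketch of how such factor coefficients feed into the IPCC-style accounting, the snippet below scales an assumed reference SOC stock by a management factor before and after improvement and spreads the stock change over the 20-year default period. The 0.95 (degraded) and 1.14 (single improvement) factors echo the temperate-region figures quoted above, but the reference stock is an invented example value and the calculation is a simplified illustration, not the full IPCC procedure.

# Hypothetical worked example in the spirit of the IPCC carbon accounting
# approach: a reference SOC stock is scaled by a grassland management factor,
# and the stock change is spread over a 20-year default period. The reference
# stock value is assumed for illustration.
soc_ref = 50.0        # reference SOC stock, Mg C per ha, assumed
f_old = 0.95          # management factor before improvement (degraded, temperate)
f_new = 1.14          # management factor after a single improvement (temperate)
period = 20.0         # default inventory period, years

annual_change = soc_ref * (f_new - f_old) / period   # Mg C per ha per year
print(annual_change)   # 0.475, within the 0.1-0.9 range reported above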

Relevance: 30.00%

Abstract:

Aim. This paper is a report of a review conducted to identify (a) best practice in information transfer from the emergency department for multi-trauma patients; (b) conduits and barriers to information transfer in trauma care and related settings; and (c) interventions that have an impact on information communication at handover and beyond. Background. Information transfer is integral to effective trauma care, and communication breakdown results in important challenges to this. However, evidence of adequacy of structures and processes to ensure transfer of patient information through the acute phase of trauma care is limited. Data sources. Papers were sourced from a search of 12 online databases and by scanning references from relevant papers, covering 1990–2009. Review methods. The review was conducted according to the University of York's Centre for Reviews and Dissemination guidelines. Studies were included if they concerned issues that influenced information transfer for patients in healthcare settings. Results. Forty-five research papers, four literature reviews and one policy statement were found to be relevant to parts of the topic, but not all of it. The main issues emerging concerned the impact of communication breakdown in some form, and included communication issues within trauma team processes, lack of structure and clarity during handovers (including missing, irrelevant and inaccurate information), distractions, and poorly documented care. Conclusion. Many factors influence information transfer but are poorly identified in relation to trauma care. The measurement of information transfer, which is integral to patient handover, has not been the focus of research to date. Nonetheless, documented patient information is considered evidence of care and a resource that affects continuing care.

Relevance: 30.00%

Abstract:

The traditional searching method for model-order selection in linear regression is a nested full-parameter-set searching procedure over the desired orders, which we call full-model order selection. On the other hand, a method for model selection searches for the best sub-model within each order. In this paper, we propose using the model-selection searching method for model-order selection, which we call partial-model order selection. We show by simulations that the proposed searching method gives better accuracy than the traditional one, especially for low signal-to-noise ratios, over a wide range of model-order selection criteria (both information-theoretic and bootstrap-based). We also show that for some models the performance of the bootstrap-based criterion improves significantly by using the proposed partial-model selection searching method.

Index Terms: model order estimation, model selection, information theoretic criteria, bootstrap

1. INTRODUCTION

Several model-order selection criteria can be applied to find the optimal order. Some of the more commonly used information theoretic-based procedures include Akaike's information criterion (AIC) [1], corrected Akaike (AICc) [2], minimum description length (MDL) [3], normalized maximum likelihood (NML) [4], the Hannan-Quinn criterion (HQC) [5], conditional model-order estimation (CME) [6], and the efficient detection criterion (EDC) [7]. From a practical point of view, it is difficult to decide which model-order selection criterion to use. Many of them perform reasonably well when the signal-to-noise ratio (SNR) is high. The discrepancies in their performance, however, become more evident when the SNR is low. In those situations, the performance of a given technique is determined not only by the model structure (say, a polynomial trend versus a Fourier series) but, more importantly, by the relative values of the parameters within the model. This makes the comparison between model-order selection algorithms difficult, as within the same model with a given order one could find an example for which one of the methods performs favourably or fails [6, 8]. Our aim is to improve the performance of the model-order selection criteria in cases where the SNR is low by considering a model-selection searching procedure that takes into account not only the full-model order search but also a partial-model order search within the given model order. Understandably, the improvement in the performance of the model-order estimation comes at the expense of additional computational complexity.
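The partial-model idea can be illustrated concretely: for each candidate order k, score every k-subset of the regressors and keep the best criterion value, rather than scoring only the nested full model of order k. The sketch below uses the standard Gaussian least-squares form of AIC as the criterion and synthetic data; the function names (aic_ls, partial_model_order) and the exhaustive subset search are illustrative assumptions, and the paper's exact criteria and search are not reproduced here.

# Hypothetical sketch of "partial-model" order selection for linear regression.
import numpy as np
from itertools import combinations

def aic_ls(y, X):
    n = len(y)
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    rss = np.sum((y - X @ beta) ** 2)
    return n * np.log(rss / n) + 2 * X.shape[1]       # Gaussian least-squares AIC

def partial_model_order(y, X_full, max_order):
    best = {}
    for k in range(1, max_order + 1):
        scores = [aic_ls(y, X_full[:, list(cols)])
                  for cols in combinations(range(X_full.shape[1]), k)]
        best[k] = min(scores)            # best sub-model of order k
    return min(best, key=best.get)       # order with the lowest best-subset AIC

rng = np.random.default_rng(1)
n = 200
X = rng.normal(size=(n, 5))
y = 2.0 * X[:, 0] - 1.5 * X[:, 3] + rng.normal(0, 1.0, n)   # data generated with order 2
print(partial_model_order(y, X, max_order=5))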

Relevance: 30.00%

Abstract:

Many traffic situations require drivers to cross or merge into a stream having higher priority. Gap acceptance theory enables us to model such processes to analyse traffic operation. This discussion demonstrates that a numerical search, fine-tuned by statistical analysis, can be used to determine the most likely critical gap for a sample of drivers, based on their largest rejected gap and accepted gap. The method shares some common features with the Maximum Likelihood Estimation technique (Troutbeck 1992) but lends itself well to contemporary analysis tools such as spreadsheets and is particularly analytically transparent. The method is considered not to bias the estimate of the critical gap because of very small or very large rejected gaps. However, it requires a sample large enough that there is reasonable representation of largest rejected gap/accepted gap pairs within a fairly narrow highest-likelihood search band.
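To show what a numerical likelihood search over (largest rejected gap, accepted gap) pairs can look like, the sketch below grid-searches the parameters of an assumed log-normal critical-gap distribution and reports the implied mean critical gap. This follows the general spirit of the likelihood-based approach referred to in the abstract; the distributional assumption, the grid limits and the data are invented, and the actual spreadsheet procedure is not reproduced.

# Hypothetical sketch of a numerical search for the critical gap from
# (largest rejected gap, accepted gap) pairs, assuming log-normal critical gaps.
import numpy as np
from scipy.stats import norm

pairs = [(3.1, 5.2), (2.4, 4.0), (4.0, 6.5), (1.8, 3.9), (3.5, 4.8)]   # seconds, invented
r = np.log([p[0] for p in pairs])
a = np.log([p[1] for p in pairs])

best = None
for mu in np.linspace(0.8, 2.0, 121):          # grid over log-scale mean
    for sigma in np.linspace(0.05, 0.8, 76):   # grid over log-scale spread
        # likelihood that each driver's critical gap lies between the two observed gaps
        ll = np.sum(np.log(norm.cdf(a, mu, sigma) - norm.cdf(r, mu, sigma) + 1e-12))
        if best is None or ll > best[0]:
            best = (ll, mu, sigma)

_, mu_hat, sigma_hat = best
print(np.exp(mu_hat + 0.5 * sigma_hat**2))     # estimated mean critical gap, seconds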

Relevance: 30.00%

Abstract:

In this study, the delivery and portal imaging of one square-field and one conformal radiotherapy treatment was simulated using the Monte Carlo codes BEAMnrc and DOSXYZnrc. The treatment fields were delivered to a humanoid phantom from different angles by a 6 MV photon beam linear accelerator, with an amorphous-silicon electronic portal imaging device (a-Si EPID) used to provide images of the phantom generated by each field. The virtual phantom preparation code CTCombine was used to combine a computed-tomography-derived model of the irradiated phantom with a simple, rectilinear model of the a-Si EPID, at each beam angle used in the treatment. Comparison of the resulting experimental and simulated a-Si EPID images showed good agreement, within γ(3%, 3 mm), indicating that this method may be useful in providing accurate Monte Carlo predictions of clinical a-Si EPID images, for use in the verification of complex radiotherapy treatments.
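For readers unfamiliar with the γ(3%, 3 mm) criterion, the sketch below is a generic brute-force global gamma comparison of two images resampled to a common grid; it is an illustration of the agreement metric only, not the evaluation software used in the study, and the image data are synthetic.

# Hypothetical sketch of a global gamma comparison (3%, 3 mm) between a
# measured and a simulated EPID image on the same grid.
import numpy as np

def gamma_map(reference, evaluated, pixel_mm, dose_tol=0.03, dist_mm=3.0):
    ny, nx = reference.shape
    dose_norm = dose_tol * reference.max()          # global dose normalisation
    search = int(np.ceil(dist_mm / pixel_mm)) + 1   # search radius in pixels
    gamma = np.empty_like(reference, dtype=float)
    for j in range(ny):
        for i in range(nx):
            j0, j1 = max(0, j - search), min(ny, j + search + 1)
            i0, i1 = max(0, i - search), min(nx, i + search + 1)
            jj, ii = np.mgrid[j0:j1, i0:i1]
            dist2 = ((jj - j) ** 2 + (ii - i) ** 2) * pixel_mm ** 2
            dose2 = (evaluated[j0:j1, i0:i1] - reference[j, i]) ** 2
            gamma[j, i] = np.sqrt(np.min(dist2 / dist_mm ** 2 + dose2 / dose_norm ** 2))
    return gamma

rng = np.random.default_rng(2)
measured = rng.random((32, 32))
simulated = measured + rng.normal(0, 0.005, (32, 32))
g = gamma_map(measured, simulated, pixel_mm=0.4)
print((g <= 1.0).mean())                            # gamma pass rate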

Relevance: 30.00%

Abstract:

This paper discusses human factors issues of low-cost railway level crossings in Australia. Several issues are discussed, including safety at passive railway level crossings, human factors considerations associated with the unavailability of a warning device, and a conceptual model for how safety could be compromised at railway level crossings following prolonged or frequent unavailability. The research plans to quantify safety risk to motorists at level crossings using a Human Reliability Assessment (HRA) method, supported by data collected using an advanced driving simulator. This method aims to identify human error within the tasks and task units identified as part of the task analysis process. It is anticipated that by modelling driver behaviour the current study will be able to quantify meaningful task variability, including temporal parameters, both between and within participants. The performance of complex tasks such as driving through a level crossing is fundamentally context-bound. Therefore this study also aims to quantify the performance-shaping factors that contribute to vehicle-train collisions by highlighting changes in the task units and in driver physiology. Finally, we will also consider a number of variables germane to ensuring the external validity of our results; without their inclusion, such an analysis could seriously underestimate risk in the probabilistic risk assessment.

Relevance: 30.00%

Abstract:

The Web has become a worldwide repository of information which individuals, companies, and organizations utilize to solve or address various information problems. Many of these Web users employ automated agents to gather this information for them. Some assume that this approach represents a more sophisticated method of searching. However, there is little research investigating how Web agents search for online information. In this research, we first provide a classification for information agents based on stages of information gathering, gathering approaches, and agent architecture. We then examine an implementation of one of the resulting classifications in detail, investigating how agents search for information on Web search engines, including the session, query, term, duration and frequency of interactions. For this temporal study, we analyzed three data sets of queries and page views from agents interacting with the Excite and AltaVista search engines from 1997 to 2002, examining approximately 900,000 queries submitted by over 3,000 agents. Findings include: (1) agent sessions are extremely interactive, with sometimes hundreds of interactions per second; (2) agent queries are comparable to those of human searchers, with little use of query operators; (3) Web agents search for a relatively limited variety of information, with only 18% of the terms used being unique; and (4) the duration of agent-Web search engine interaction typically spans several hours. We discuss the implications for Web information agents and search engines.

Relevance: 30.00%

Abstract:

Detecting query reformulations within a session by a Web searcher is an important area of research for designing more helpful searching systems and targeting content to particular users. Methods explored by other researchers include both qualitative approaches (i.e., the use of human judges to manually analyze query patterns, usually on small samples) and nondeterministic algorithms, typically using large amounts of training data to predict query modification during sessions. In this article, we explore three alternative methods for detection of session boundaries. All three methods are computationally straightforward and therefore easily implemented for detection of session changes. We examine 2,465,145 interactions from 534,507 users of Dogpile.com on May 6, 2005. We compare session analysis using (a) Internet Protocol address and cookie; (b) Internet Protocol address, cookie, and a temporal limit on intrasession interactions; and (c) Internet Protocol address, cookie, and query reformulation patterns. Overall, our analysis shows that defining sessions by query reformulation along with Internet Protocol address and cookie provides the best measure, resulting in an 82% increase in the count of sessions. Regardless of the method used, the mean session length was fewer than three queries, and the mean session duration was less than 30 min. Searchers most often modified their queries by changing query terms (nearly 23% of all query modifications) rather than by adding or deleting terms. The implications are that, for measuring searching traffic, unique sessions may be a better indicator than the common metric of unique visitors. This research also sheds light on the more complex aspects of Web searching involving query modifications and may lead to advances in searching tools.
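As a concrete illustration of method (b) above, the sketch below groups interactions by (IP address, cookie) and opens a new session whenever the gap between consecutive interactions exceeds a temporal limit. The 30-minute cutoff, the function name and the sample log rows are assumptions for illustration; the study's actual cutoff is not stated in the abstract.

# Hypothetical sketch of session boundary detection by IP address, cookie and
# a temporal limit on intrasession interactions.
from datetime import datetime, timedelta
from collections import defaultdict

CUTOFF = timedelta(minutes=30)   # assumed temporal limit

def sessionize(log_rows):
    """log_rows: iterable of (ip, cookie, timestamp, query), timestamp-sorted per user."""
    sessions = defaultdict(list)     # (ip, cookie) -> list of sessions (lists of queries)
    last_seen = {}
    for ip, cookie, ts, query in log_rows:
        key = (ip, cookie)
        if key not in last_seen or ts - last_seen[key] > CUTOFF:
            sessions[key].append([])           # open a new session
        sessions[key][-1].append(query)
        last_seen[key] = ts
    return sessions

rows = [
    ("1.2.3.4", "abc", datetime(2005, 5, 6, 10, 0), "beam search"),
    ("1.2.3.4", "abc", datetime(2005, 5, 6, 10, 5), "beam search method"),
    ("1.2.3.4", "abc", datetime(2005, 5, 6, 11, 30), "litesteel beam"),
]
print(sessionize(rows))   # two sessions for this user: 10:00-10:05 and 11:30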