113 results for JURISPRUDENTIAL SUB-RULE
Abstract:
The issue of whether Real Estate Investment Trusts (REITs) should pursue a focused or diversified investment strategy remains an ongoing debate within both the academic and industry communities. This article considers the relationship between REITs focused on different property sectors in a Generalized Autoregressive Conditional Heteroscedasticity-Dynamic Conditional Correlation (GARCH-DCC) framework. The daily conditional correlations reveal that since 1990 there has been a marked upward trend in the coefficients between US REIT sub-sectors. The findings imply that REITs are behaving in a far more homogeneous manner than in the past. Furthermore, the argument that REITs should remain focused so that investors can make the diversification decision themselves is weakened.
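For context, the DCC recursion of Engle (2002) that underlies this kind of study can be stated as follows; this is the standard textbook formulation, not an equation quoted from the article:

    \[ Q_t = (1-\alpha-\beta)\,\bar{Q} + \alpha\,\epsilon_{t-1}\epsilon_{t-1}^{\top} + \beta\,Q_{t-1}, \qquad R_t = \operatorname{diag}(Q_t)^{-1/2}\, Q_t\, \operatorname{diag}(Q_t)^{-1/2}, \]

where \(\epsilon_t\) collects the standardized GARCH residuals of the sub-sector return series, \(\bar{Q}\) is their unconditional correlation matrix, and the off-diagonal elements of \(R_t\) are the daily conditional correlations whose upward trend the article documents.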
Abstract:
Rensch's rule, which states that the magnitude of sexual size dimorphism tends to increase with increasing body size, has evolved independently in three lineages of large herbivorous mammals: bovids (antelopes), cervids (deer), and macropodids (kangaroos). This pattern can be explained by a model that combines allometry, life-history theory, and energetics. The key features are that female group size increases with increasing body size and that males have evolved under sexual selection to grow large enough to control these groups of females. The model predicts relationships among body size and female group size, male and female age at first breeding, death and growth rates, and the energy males allocate to producing body mass and weapons. Model predictions are well supported by data for these megaherbivores. The model suggests hypotheses for why some other sexually dimorphic taxa, such as primates and pinnipeds (seals and sea lions), do or do not conform to Rensch's rule.
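As a point of reference, Rensch's rule is conventionally tested in the allometry literature (this formulation is general background, not taken from the article) as a log-log regression of male on female body size,

    \[ \log(\text{male size}) = \alpha + \beta\,\log(\text{female size}), \qquad \beta > 1, \]

where a slope \(\beta\) greater than one means the male-to-female size ratio, and hence dimorphism, increases with overall body size.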
Abstract:
In a world where massive amounts of data are recorded on a large scale, we need data mining technologies to gain knowledge from the data in a reasonable time. The Top Down Induction of Decision Trees (TDIDT) algorithm is a very widely used technology to predict the classification of newly recorded data. However, alternative technologies have been derived that often produce better rules but do not scale well on large datasets. Such an alternative to TDIDT is the PrismTCS algorithm. PrismTCS performs particularly well on noisy data but does not scale well on large datasets. In this paper we introduce Prism and investigate its scaling behaviour. We describe how we improved the scalability of the serial version of Prism and investigate its limitations. We then describe our work to overcome these limitations by developing a framework to parallelise algorithms of the Prism family and similar algorithms. We also present the scale-up results of a first prototype implementation.
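To make the contrast with TDIDT concrete, the core of Cendrowska-style Prism is a separate-and-conquer loop: specialise a rule one attribute-value test at a time until it covers only the target class, then remove the covered instances and repeat. The sketch below is a minimal illustration of that idea (our own simplification in Python; the names, the dict-based data format and the edge-case handling are assumptions, not the paper's implementation):

    def induce_rules_for_class(instances, target, class_key="class"):
        """Separate-and-conquer: induce modular rules covering class `target`.

        `instances` is a list of dicts mapping attribute names to categorical
        values, with the class label stored under `class_key`.
        """
        rules = []
        remaining = list(instances)
        while any(inst[class_key] == target for inst in remaining):
            rule = {}          # conjunction of attribute -> value tests
            subset = remaining
            # Specialise until the rule covers only target-class instances.
            while any(inst[class_key] != target for inst in subset):
                best, best_prob = None, -1.0
                for attr in subset[0]:
                    if attr == class_key or attr in rule:
                        continue
                    for value in {inst[attr] for inst in subset}:
                        covered = [inst for inst in subset if inst[attr] == value]
                        prob = sum(inst[class_key] == target
                                   for inst in covered) / len(covered)
                        if prob > best_prob:
                            best, best_prob = (attr, value), prob
                if best is None:   # no test left to add; accept the rule as is
                    break
                rule[best[0]] = best[1]
                subset = [inst for inst in subset if inst[best[0]] == best[1]]
            rules.append((rule, target))
            # Separate: drop what the rule covers, then conquer the remainder.
            remaining = [inst for inst in remaining
                         if not all(inst.get(a) == v for a, v in rule.items())]
        return rules

The repeated attribute-value probability scan over the current subset is what makes serial Prism expensive on large datasets, which is the cost the paper targets.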
Abstract:
In a world where data is captured on a large scale, the major challenge for data mining algorithms is to be able to scale up to large datasets. There are two main approaches to inducing classification rules: one is the divide and conquer approach, also known as the top down induction of decision trees; the other is called the separate and conquer approach. A considerable amount of work has been done on scaling up the divide and conquer approach. However, very little work has been conducted on scaling up the separate and conquer approach. In this work we describe a parallel framework that allows the parallelisation of a certain family of separate and conquer algorithms, the Prism family. Parallelisation helps the Prism family of algorithms harness additional computing resources in a network of computers in order to make the induction of classification rules scale better on large datasets. Our framework also incorporates a pre-pruning facility for parallel Prism algorithms.
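The paper's framework distributes Prism's workload over a network of machines; as a much-simplified, shared-memory analogue (our own illustration, not the authors' design), the step that dominates Prism's cost, counting class frequencies per attribute-value pair, can be split across worker processes and the partial counts merged:

    from collections import Counter
    from multiprocessing import Pool

    def partial_counts(partition):
        """Count (attribute, value, class) triples in one data partition."""
        counts = Counter()
        for inst in partition:
            cls = inst["class"]
            for attr, value in inst.items():
                if attr != "class":
                    counts[(attr, value, cls)] += 1
        return counts

    def parallel_counts(instances, n_workers=4):
        """Farm the counting out to workers; merging Counters is cheap."""
        chunks = [instances[i::n_workers] for i in range(n_workers)]
        with Pool(n_workers) as pool:
            partials = pool.map(partial_counts, chunks)
        merged = Counter()
        for c in partials:
            merged.update(c)
        return merged

From such counts, the conditional probabilities that drive rule specialisation can be computed without shipping the raw data to one machine, which is the essential reason separate-and-conquer induction parallelises well.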
Abstract:
Top Down Induction of Decision Trees (TDIDT) is the most commonly used method of constructing a model from a dataset in the form of classification rules to classify previously unseen data. Alternative algorithms have been developed, such as the Prism algorithm. Prism constructs modular rules that are qualitatively better than the rules induced by TDIDT. However, along with the increasing size of databases, many existing rule learning algorithms have proved to be computationally expensive on large datasets. To tackle the problem of scalability, parallel classification rule induction algorithms have been introduced. As TDIDT is the most popular classifier, even though there are strongly competitive alternative algorithms, most parallel approaches to inducing classification rules are based on TDIDT. In this paper we describe work on a distributed classifier that induces classification rules in a parallel manner based on Prism.
Abstract:
Induction of classification rules is one of the most important technologies in data mining. Most of the work in this field has concentrated on the Top Down Induction of Decision Trees (TDIDT) approach. However, alternative approaches have been developed, such as the Prism algorithm for inducing modular rules. Prism often produces qualitatively better rules than TDIDT but suffers from higher computational requirements. We investigate approaches that have been developed to minimize the computational requirements of TDIDT, in order to find analogous approaches that could reduce those of Prism.
Abstract:
The rapid increase in the size and number of databases demands data mining approaches that are scalable to large amounts of data. This has led to the exploration of parallel computing technologies in order to perform data mining tasks concurrently using several processors. Parallelization seems to be a natural and cost-effective way to scale up data mining technologies. One of the most important of these data mining technologies is the classification of newly recorded data. This paper surveys advances in parallelization in the field of classification rule induction.
Abstract:
Advances in hardware and software in the past decade have made it possible to capture, record and process fast data streams at a large scale. The research area of data stream mining has emerged from these advances in order to cope with the real-time analysis of potentially large and changing data streams. Examples of data streams include Google searches, credit card transactions, telemetric data and data from continuous chemical production processes. In some cases the data can be processed in batches by traditional data mining approaches. However, some applications require the data to be analysed in real time as soon as it is captured, for example if the data stream is infinite, fast changing, or simply too large to be stored. One of the most important data mining techniques on data streams is classification. This involves training the classifier on the data stream in real time and adapting it to concept drifts. Most data stream classifiers are based on decision trees. However, it is well known in the data mining community that there is no single optimal algorithm: an algorithm may work well on one or several datasets but badly on others. This paper introduces eRules, a new rule-based adaptive classifier for data streams, based on an evolving set of rules. eRules induces a set of rules that is constantly evaluated and adapted to changes in the data stream by adding new rules and removing old ones. It differs from the more popular decision tree based classifiers in that it tends to leave data instances unclassified rather than forcing a classification that could be wrong. The ongoing development of eRules aims to improve its accuracy further through dynamic parameter setting, which will also address the problem of changing feature domain values.
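The abstract describes the classifier's behaviour precisely enough to sketch its control loop: predict only when a rule fires, otherwise leave the instance unclassified, retire rules whose accuracy decays, and periodically learn new rules from the uncovered instances. The skeleton below is our reading of that description (the class and parameter names, thresholds, and the injected batch learner are all placeholders, not the authors' code):

    class AdaptiveRuleClassifier:
        """eRules-style loop: classify when a rule fires, abstain otherwise,
        and keep the rule set fresh under concept drift."""

        def __init__(self, learner, min_accuracy=0.7, relearn_every=500):
            self.learner = learner   # batch rule inducer, e.g. a Prism variant
            self.rules = []          # entries: [condition_fn, label, hits, correct]
            self.unclassified = []   # instances no current rule covered
            self.min_accuracy = min_accuracy
            self.relearn_every = relearn_every
            self.seen = 0

        def process(self, instance, true_label=None):
            self.seen += 1
            prediction = None
            for rule in self.rules:
                cond, label = rule[0], rule[1]
                if cond(instance):
                    prediction = label
                    if true_label is not None:   # update running accuracy
                        rule[2] += 1
                        rule[3] += int(label == true_label)
                    break
            if prediction is None:               # abstain rather than guess
                self.unclassified.append((instance, true_label))
            # Retire rules whose accuracy has decayed (a drift signal).
            self.rules = [r for r in self.rules
                          if r[2] == 0 or r[3] / r[2] >= self.min_accuracy]
            # Periodically learn fresh rules from the uncovered instances.
            if self.seen % self.relearn_every == 0 and self.unclassified:
                self.rules += [[cond, label, 0, 0]
                               for cond, label in self.learner(self.unclassified)]
                self.unclassified.clear()
            return prediction

The abstention behaviour falls out naturally here: an instance that matches no rule simply returns None instead of a forced label.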
Abstract:
Pulsed terahertz imaging is being developed as a technique to image obscured mural paintings. Due to significant advances in terahertz technology, portable systems are now capable of operating in unregulated environments and this has prompted their use on archaeological excavations. August 2011 saw the first use of pulsed terahertz imaging at the archaeological site of Çatalhöyük, Turkey, where mural paintings dating from the Neolithic period are continuously being uncovered by archaeologists. In these particular paintings the paint is applied onto an uneven surface, and then covered by an equally uneven surface. Traditional terahertz data analysis has proven unsuccessful at sub-surface imaging of these paintings due to the effect of these uneven surfaces. For the first time, an image processing technique is presented, based around Gaussian beam-mode coupling, which enables the visualization of the obscured painting.
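In outline, and as general background rather than the article's specific algorithm, Gaussian beam-mode analysis expands the measured field over a basis of Gaussian modes \(\psi_{mn}\), with coupling coefficients

    \[ c_{mn} = \iint E(x,y)\,\psi_{mn}^{*}(x,y)\,\mathrm{d}x\,\mathrm{d}y; \]

an uneven surface redistributes power among these coefficients in a characteristic way, which is presumably what allows the surface contribution to be modelled and the obscured painting recovered.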
Abstract:
Many institutions across sub-Saharan Africa (SSA), and many funding agencies that support them, are currently engaged in initiatives targeted towards adapting rainfed agriculture to climate change. This does, however, present some very real and complex research and policy challenges. Given the generally low impact to date of agricultural research across SSA on improving the welfare of rainfed farmers under current climatic conditions, a comprehensive strategy is required if the considerably more complex challenge of adapting agriculture to future climate change is to bear fruit. In articulating such a strategy, it is useful to consider the criteria by which current successful initiatives should be judged.
Abstract:
The objective of this work was to evaluate the feasibility of simulating maize yield in a subtropical region of southern Brazil using the general large area model (Glam). A 16-year time series of daily weather data was used. The model was adjusted and tested as an alternative for simulating maize yield at small and large spatial scales. Simulated and observed grain yields were highly correlated (r above 0.8; p<0.01) at large scales (greater than 100,000 km²), with variable and mostly lower correlations (r from 0.65 to 0.87; p<0.1) at small spatial scales (lower than 10,000 km²). Large area models can contribute to monitoring or forecasting regional patterns of variability in maize production in the region, providing a basis for agricultural decision making, and Glam-Maize is one of the alternatives.
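The reported agreement is presumably a Pearson correlation between the simulated and observed yield series; for concreteness, a minimal check of that statistic looks like this (the yield values are illustrative, not the study's data):

    import numpy as np

    # Hypothetical simulated vs. observed maize yields (t/ha), illustrative only.
    simulated = np.array([5.1, 4.3, 6.0, 5.5, 3.9, 6.2])
    observed = np.array([4.8, 4.1, 6.3, 5.2, 4.0, 5.9])

    r = np.corrcoef(simulated, observed)[0, 1]   # Pearson correlation coefficient
    print(f"r = {r:.2f}")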
Abstract:
Building assessment methods have become a popular research field since the early 1990s. An international tool that allows the assessment of buildings in all regions, taking into account differences in climates, topographies and cultures, does not yet exist. This paper aims to demonstrate the importance of criteria and sub-criteria in developing a new potential building assessment method for Saudi Arabia. Recently, awareness of sustainability has been increasing in developing countries due to high energy consumption, pollution and a high carbon footprint. There is no debate that assessment criteria play an important role in determining a tool's orientation. However, various aspects influence the criteria and sub-criteria of assessment tools, such as environmental, economic, social and cultural factors, to mention but a few. The author investigates the most popular and globally used schemes, BREEAM, LEED, Green Star, CASBEE and Estidama, in order to identify the effectiveness of the different aspects of the assessment criteria and the impact of these criteria on the assessment results; this will provide a solid foundation for developing an effective sustainable assessment method for buildings in Saudi Arabia. Initial results of the investigation suggest that each country needs to develop its own assessment method in order to achieve desired results, while focusing upon its indigenous environmental, economic, social and cultural conditions.
Keywords: assessment methods, BREEAM, LEED, Green Star, CASBEE, Estidama, sustainability, sustainable buildings, environment, Saudi Arabia.