631 results for Hybrid methods


Relevance:

20.00%

Publisher:

Abstract:

Designed for independent living, retirement villages provide either detached or semi-detached residential dwellings with car parking and small private yards. Retirement village developments usually include a mix of independent living units (ILUs) and serviced apartments (SAs), with community facilities providing a shared congregational area for village activities and socialising. Retirement village assets differ from traditional residential assets in that they operate in accordance with statutory legislation; in Australia, each State and Territory has its own Retirement Village Act and Regulations. In essence, the village operator provides the land and buildings to the residents, who pay an amount on entry for the right of occupation. On departure from the unit, an agreed proportion of either the original purchase price or the sale price is paid to the outgoing resident. The market value of the operator's interest in the retirement village is therefore based upon the estimated future income from Deferred Management Fees (DMF) and the capital gain upon roll-over receivable by the operator in accordance with the respective residency agreements. Given the lumpiness of these payments, there is general acceptance that the most appropriate approach to valuation is Discounted Cash Flow (DCF) analysis. There is, however, inconsistency in how valuers across Australia undertake their DCF analyses, leading to differences in reported values and consequent confusion among users of valuation services. To give guidance to valuers and enhance the confidence of users of valuation services, this paper investigates the five major elements of discounted cash flow methodology, namely cash flows, escalation factors, holding period, terminal value and discount rate. Whilst there is dissatisfaction with the financial structuring of the DMF in residency agreements, as long as there are future financial returns receivable by the village owner/operator, DCF will remain the most appropriate valuation methodology for resident-funded retirement villages.
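To make the interaction of those five elements concrete, here is a minimal sketch of a village-level DCF calculation. All figures, parameter names and the single-cash-flow simplification are hypothetical illustrations, not values or a model from the paper.

```python
# Minimal DCF sketch for a resident-funded retirement village.
# All inputs below are hypothetical, chosen only to show how the five
# elements (cash flows, escalation, holding period, terminal value,
# discount rate) combine into a present value.

def dcf_value(base_cash_flow, escalation, holding_years, terminal_value, discount_rate):
    """Present value of escalated annual receipts plus a terminal value."""
    pv = 0.0
    for t in range(1, holding_years + 1):
        escalated = base_cash_flow * (1 + escalation) ** t   # grow the DMF/roll-over receipts
        pv += escalated / (1 + discount_rate) ** t           # discount back to today
    pv += terminal_value / (1 + discount_rate) ** holding_years
    return pv

# Example: $500k base receipts, 3% escalation, 10-year holding period,
# $4m terminal value, 12% discount rate.
print(round(dcf_value(500_000, 0.03, 10, 4_000_000, 0.12)))
```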

Relevance:

20.00%

Publisher:

Abstract:

The purpose of this review is to integrate and summarize specific measurement topics (instrument and metric choice, validity, reliability, how many and what types of days, reactivity, and data treatment) relevant to the study of youth physical activity. Research-quality pedometers are necessary to aid interpretation of steps-per-day data collected in a range of young populations under a variety of circumstances. Steps per day is the most appropriate metric choice, but steps per minute can be used to interpret time-in-intensity in specifically delimited time periods (e.g., physical education class). Reported intraclass correlations (ICCs) have ranged from .65 over 2 days (although higher 2-day values have also been reported) to .87 over 8 days (although higher values have been reported over fewer days). Reported ICCs are lower on weekend days (.59) versus weekdays (.75), and lower over vacation days (.69) versus school days (.74). There is no objective evidence of reactivity at this time. Data treatment includes (a) identifying and addressing missing values, (b) identifying outliers and reducing the data appropriately if necessary, and (c) transforming the data as required in preparation for inferential analysis. As more pedometry studies in young populations are published, these preliminary methodological recommendations should be modified and refined.
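As an illustration of data-treatment steps (a) through (c), here is a minimal sketch in Python. The step counts, imputation rule, plausibility bounds and log transform are hypothetical choices for demonstration, not recommendations from the review.

```python
import numpy as np
import pandas as pd

# Hypothetical steps-per-day data for one participant over 8 monitoring days;
# NaN marks a day on which the pedometer was not worn.
steps = pd.Series([10500, 12200, np.nan, 9800, 31000, 11200, 8700, 10100])

# (a) Address missing values, here by imputing the participant's own mean.
steps = steps.fillna(steps.mean())

# (b) Identify outliers and reduce the data, here by truncating values to an
# illustrative plausible range of 1,000 to 30,000 steps per day.
steps = steps.clip(lower=1000, upper=30000)

# (c) Transform as required for inferential analysis, e.g. a log transform
# to reduce positive skew.
log_steps = np.log(steps)
print(log_steps.round(2))
```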

Relevance:

20.00%

Publisher:

Abstract:

Background: The 2003 Bureau of Labor Statistics American Time Use Survey (ATUS) contains 438 distinct primary activity variables that can be analyzed with regard to how Americans spend their time. The Compendium of Physical Activities is used to code physical activities derived from various surveys, logs, diaries, etc., to facilitate comparison of coded intensity levels across studies.

Methods: This paper describes the methods, challenges, and rationale for linking Compendium estimates of physical activity intensity (METs, metabolic equivalents) with all activities reported in the 2003 ATUS.

Results: The assigned ATUS intensity levels are not intended to compute the energy costs of physical activity in individuals. Instead, they are intended to identify time spent in activities broadly classified by type and intensity. This function will complement public health surveillance systems and aid in policy and health-promotion activities. For example, at least one future project arising from this process is the descriptive epidemiology of time spent in common physical activity intensity categories.

Conclusions: Metabolic coding of the ATUS by linking it with the Compendium of Physical Activities can make important contributions to our understanding of Americans' time spent in health-related physical activity.
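As a toy illustration of the linking idea, the sketch below maps diary activities to MET values and tallies time by broad intensity category. The activity names, MET values and intensity cut-points are hypothetical stand-ins, not the ATUS coding lexicon or actual Compendium entries.

```python
# Hypothetical MET lookup standing in for Compendium-linked activity codes.
met_lookup = {
    "walking": 3.5,
    "gardening": 4.0,
    "watching_tv": 1.0,
}

# One respondent's diary: (activity, minutes spent).
diary = [("walking", 30), ("watching_tv", 120), ("gardening", 45)]

def intensity(met):
    """Classify a MET value into a broad intensity category (illustrative cut-points)."""
    if met < 3.0:
        return "sedentary/light"
    elif met < 6.0:
        return "moderate"
    return "vigorous"

# Tally time by intensity category rather than computing individual energy cost.
minutes_by_intensity = {}
for activity, minutes in diary:
    cat = intensity(met_lookup[activity])
    minutes_by_intensity[cat] = minutes_by_intensity.get(cat, 0) + minutes

print(minutes_by_intensity)  # e.g. {'moderate': 75, 'sedentary/light': 120}
```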

Relevance:

20.00%

Publisher:

Abstract:

LiteSteel Beam (LSB) is a new cold-formed steel beam produced by OneSteel Australian Tube Mills (OATM). The beam is effectively a channel section with two rectangular hollow flanges and a slender web, and is manufactured using patented dual electric resistance welding and automated roll-forming technologies. OATM is promoting the use of LSBs as flexural members in residential construction. Using LSBs in back-to-back built-up sections is likely to improve their moment capacity. However, research on the flexural behaviour of back-to-back built-up LSBs showed that the detrimental effects of lateral distortional buckling observed in single LSB members appear to remain in back-to-back built-up members, whose ultimate moment capacity is likewise reduced by lateral distortional buckling failure. An investigation was therefore conducted with the aim of developing suitable strength improvement methods to mitigate lateral distortional buckling effects and hence improve the flexural strength of back-to-back LSB members. This paper presents the details of this investigation, the results, and recommendations for the most suitable and cost-effective method, which significantly improves the moment capacity of back-to-back LSB members.

Relevance:

20.00%

Publisher:

Abstract:

There has been much conjecture of late as to whether the patentable subject matter standard contains a physicality requirement. The issue came to a head when the Federal Circuit introduced the machine-or-transformation test in In re Bilski and declared it to be the sole test for determining subject matter eligibility. Many commentators criticized the test, arguing that it is inconsistent with Supreme Court precedent and with the need for the patent system to respond appropriately to all new and useful innovation in whatever form it arises. Those criticisms were vindicated when, on appeal, the Supreme Court in Bilski v. Kappos dispensed with any suggestion that the patentable subject matter test involves a physicality requirement. This article addresses the issue from a normative perspective: it asks whether the patentable subject matter test should contain a physicality requirement. The conclusion reached is that it should not, because such a limitation is not an appropriate means of encouraging much of the valuable innovation we are likely to witness during the Information Age. It is contended that patent eligibility extends beyond traditionally recognized mechanical, chemical and industrial manufacturing processes to include non-machine-implemented and non-physical methods that have no connection with a physical device and do not cause a physical transformation of matter. Concerns that this reflects a trend of overreaching commoditization or propertization, in which the boundaries of patent law have been expanded too far, are unfounded, since the strictures of novelty, nonobviousness and sufficiency of description will exclude undeserving subject matter from patentability. The argument made is that introducing a physicality requirement would have unintended adverse effects in various fields of technology, particularly emerging technologies that are likely to have a profound social effect in the future.

Relevance:

20.00%

Publisher:

Abstract:

Longitudinal panel studies of large, random samples of business start-ups captured at the pre-operational stage allow researchers to address core issues for entrepreneurship research, namely, the processes of creation of new business ventures as well as their antecedents and outcomes. Here, we perform a methods-orientated review of all 83 journal articles that have used this type of data set, our purpose being to assist users of current data sets as well as designers of new projects in making the best use of this innovative research approach. Our review reveals a number of methods issues that are largely particular to this type of research. We conclude that amidst exemplary contributions, much of the reviewed research has not adequately managed these methods challenges, nor has it made use of the full potential of this new research approach. Specifically, we identify and suggest remedies for context-specific and interrelated methods challenges relating to sample definition, choice of level of analysis, operationalization and conceptualization, use of longitudinal data and dealing with various types of problematic heterogeneity. In addition, we note that future research can make further strides towards full utilization of the advantages of the research approach through better matching (from either direction) between theories and the phenomena captured in the data, and by addressing some under-explored research questions for which the approach may be particularly fruitful.

Relevance:

20.00%

Publisher:

Abstract:

A trend in the design and implementation of modern industrial automation systems is to integrate computing, communication and control into a unified framework at different levels of machine/factory operations and information processing. These distributed control systems are referred to as networked control systems (NCSs). They are composed of sensors, actuators and controllers interconnected over communication networks. As most communication networks are not designed for NCS applications, the communication requirements of NCSs may not be satisfied. For example, traditional control systems require data to be accurate, timely and lossless; because of random transmission delays and packet losses, the performance of a control system may be badly degraded and the system rendered unstable. The main challenge of NCS design is to maintain and improve stable control performance of an NCS, which requires both communication and control methodologies to be designed. In recent decades, Ethernet and 802.11 networks have been introduced into control networks, and have even replaced traditional fieldbus products in some real-time control applications, because of their high bandwidth and good interoperability. As Ethernet and 802.11 networks are not designed for distributed control applications, two aspects of NCS research need to be addressed to make these networks suitable for control systems in industrial environments. From the networking perspective, communication protocols need to be designed to satisfy NCS communication requirements such as real-time communication and high-precision clock consistency. From the control perspective, methods to compensate for network-induced delays and packet losses are important for NCS design. To make Ethernet-based and 802.11 networks suitable for distributed control applications, this thesis develops a high-precision relative clock synchronisation protocol and an analytical model for analysing the real-time performance of 802.11 networks, and designs a new predictive compensation method. Firstly, a hybrid NCS simulation environment based on the NS-2 simulator is designed and implemented. Secondly, a high-precision relative clock synchronisation protocol is designed and implemented. Thirdly, transmission delays in 802.11 networks for soft real-time control applications are modelled using a Markov chain in which real-time Quality-of-Service parameters are analysed under a periodic traffic pattern; the Markov chain model accurately captures the tradeoff between real-time and throughput performance. Furthermore, a cross-layer optimisation scheme, featuring application-layer flow rate adaptation, is designed to achieve the tradeoff between certain real-time and throughput performance characteristics in a typical NCS scenario with a wireless local area network. Fourthly, as a co-design approach for both the network and the controller, a new predictive compensation method for variable delay and packet loss in NCSs is designed, which tackles simultaneous end-to-end delays and packet losses during packet transmissions from sensors to actuators. The effectiveness of the proposed predictive compensation approach is demonstrated using our hybrid NCS simulation environment.
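To convey the intuition behind predictive compensation for delay and loss, here is a minimal sketch: when a sensor packet is lost or delayed past its deadline, the controller rolls a plant model forward from the last known state instead of acting on stale data. The scalar plant, gain and packet trace are hypothetical illustrations, not the compensation scheme designed in the thesis.

```python
# Hypothetical scalar plant model: x[k+1] = A*x[k] + B*u[k]
A, B = 0.9, 0.1
K = 2.0                # illustrative state-feedback gain

def predict_state(last_state, last_input):
    """Roll the plant model forward one step for a missing measurement."""
    return A * last_state + B * last_input

x_hat, u = 0.0, 0.0
# None marks a sensor packet lost or delayed past its deadline.
measurements = [1.0, None, None, 0.62, None, 0.48]
for k, packet in enumerate(measurements):
    if packet is not None:
        x_hat = packet                     # fresh measurement arrived
    else:
        x_hat = predict_state(x_hat, u)    # compensate for the lost packet
    u = -K * x_hat                         # control uses measured or predicted state
    print(f"k={k}  x_hat={x_hat:.3f}  u={u:.3f}")
```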

Relevance:

20.00%

Publisher:

Abstract:

Traditionally, transport disadvantage has been identified using accessibility analysis, although the effectiveness of the accessibility planning approach in improving access to goods and services is not known. This paper undertakes a comparative assessment of measures of mobility, accessibility, and participation used to identify transport disadvantage, using the concept of activity spaces. Seven days of activity-travel diary data were collected for 89 individuals from two case study areas located in rural Northern Ireland. A spatial analysis was conducted to select the case study areas using criteria derived from the literature; the criteria relate to the levels of area accessibility and area mobility, which are known to influence the nature of transport disadvantage. Using the activity-travel diary data, individuals' weekly as well as day-to-day variations in activity-travel patterns were visualised. A model was developed using the ArcGIS ModelBuilder tool and run to derive scores for individual levels of mobility, accessibility, and participation in activities from the geovisualisation. Using these scores, a multiple regression analysis was conducted to identify patterns of transport disadvantage. The study found positive associations between mobility and accessibility, between mobility and participation, and between accessibility and participation in activities. However, area accessibility and area mobility were found to have little impact on individual mobility, accessibility, and participation in activities. Income vis-à-vis car ownership was found to have a significant impact on individual levels of mobility and accessibility, whereas participation in activities was found to be a function of individuals' income levels and occupational status.
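As a toy illustration of the final analysis step, the sketch below fits an ordinary-least-squares regression of participation on income and occupational status. The scores are hypothetical stand-ins for the geovisualisation-derived measures, not data from the study.

```python
import numpy as np

# Hypothetical scores for a handful of individuals.
income        = np.array([1, 2, 2, 3, 3, 4, 4, 5])    # ordinal income band
occupation    = np.array([0, 0, 1, 0, 1, 1, 1, 1])    # 0 = not working, 1 = working
participation = np.array([3, 4, 6, 5, 7, 8, 8, 10])   # activities per week

# Ordinary least squares: participation ~ intercept + income + occupation.
X = np.column_stack([np.ones_like(income), income, occupation])
coef, *_ = np.linalg.lstsq(X, participation, rcond=None)
print(dict(zip(["intercept", "income", "occupation"], coef.round(2))))
```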

Relevance:

20.00%

Publisher:

Abstract:

In a recent journal article, Luke Jaaniste and I identified an emergent model of exegesis. From a content analysis of submitted exegeses within a local archive, we identified an approach that is quite different from the traditional thesis, but is also distinct from previously identified forms of exegesis, which Milech and Schilo have described as a ‘context model’ (which assumes the voice of academic objectivity and provides an historical or theoretical context for the creative practice) and a ‘commentary model’ (which takes the form of a first-person reflection on the challenges, insights and achievements of the practice). The model we identified combines these dichotomous forms and assumes a dual orientation: looking outwards to the established field of research, exemplars and theories, and inwards to the methodologies, processes and outcomes of the practice. We went on to argue that this ‘connective’ exegesis offers clear benefits to the researcher in connecting the practice to an established field while allowing the researcher to demonstrate how the methods have led to outcomes that advance the field in some way. And, while it helps the candidate to articulate objective claims for research innovation, it enables them to retain a voiced, personal relationship with their practice. However, it also poses considerable complexities and challenges in the writing. It requires a reconciliation of multi-perspectival subject positions: the disinterested perspective and academic objectivity of an observer/ethnographer/analyst/theorist at times, and the invested perspective of the practitioner/producer at others. The author must also contend with a range of writing styles, speech genres and voices: from the formal, polemical voice of the theorist to the personal, questioning and sometimes emotive voice of reflexivity. Moreover, the connective exegesis requires the researcher to synthesize these various perspectives, subject positions, writing styles and voices into a unified and coherent text. In this paper I consider strategies for writing a hybrid, connective exegesis. I first ground the discussion of polyvocality and alternate textual structures in recent discussions in philosophy and critical theory, and point to examples of emergent approaches to texts and practices in related fields. I then return to the collection of archived exegeses to investigate the strategies that postgraduate candidates have adopted to resolve the problems that arise from a polyvocal, connective exegesis.

Relevance:

20.00%

Publisher:

Abstract:

Information overload has become a serious issue for web users, and personalisation can provide effective solutions to this problem. Recommender systems are one popular personalisation tool to help users deal with this issue. As the basis of personalisation, the accuracy and efficiency of web user profiling greatly affects the performance of recommender systems and other personalisation systems. In Web 2.0, emerging forms of user information provide new possible solutions for profiling users. Folksonomy, or tag information, is a typical kind of Web 2.0 information: it implies users' topic interests and opinions, and has become another important source of user information for profiling users and making recommendations. However, since tags are arbitrary words given by users, folksonomy contains a lot of noise, such as tag synonyms, semantic ambiguities and personal tags. Such noise makes it difficult to profile users accurately or to make quality recommendations. This thesis investigates the distinctive features and multiple relationships of folksonomy and explores novel approaches to solving the tag quality problem and profiling users accurately. Harvesting the wisdom of both crowds and experts, three new user profiling approaches are proposed: a folksonomy-based approach, a taxonomy-based approach, and a hybrid approach based on both folksonomy and taxonomy. The proposed user profiling approaches are applied to recommender systems to improve their performance. Based on the generated user profiles, user- and item-based collaborative filtering approaches, combined with content filtering methods, are proposed for making recommendations. The proposed user profiling and recommendation approaches have been evaluated through extensive experiments. The effectiveness evaluation experiments were conducted on two real-world datasets collected from the Amazon.com and CiteULike websites, and the results demonstrate that the proposed approaches outperform related state-of-the-art approaches. In addition, this thesis proposes a parallel, scalable user profiling implementation based on cloud computing techniques such as Hadoop, MapReduce and Cascading; the scalability evaluation experiments were conducted on a large-scale dataset collected from the Del.icio.us website. This thesis contributes to the effective use of the wisdom of crowds and experts to help users overcome information overload by providing more accurate, effective and efficient user profiling and recommendation approaches. It also contributes to better use of the taxonomy information given by experts and the folksonomy information contributed by users in Web 2.0.
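As a toy illustration of how a tag-based profile can drive collaborative filtering, here is a minimal sketch. The profiles, weights and items are hypothetical, and the thesis's actual folksonomy, taxonomy and hybrid approaches are considerably more involved.

```python
import math

# Hypothetical folksonomy-based profiles: user -> {tag: weight}.
profiles = {
    "u1": {"python": 0.9, "web": 0.4},
    "u2": {"python": 0.8, "web": 0.5, "ml": 0.3},
    "u3": {"cooking": 1.0},
}
liked = {"u2": {"itemA", "itemB"}, "u3": {"itemC"}}

def cosine(p, q):
    """Cosine similarity between two sparse tag-weight profiles."""
    dot = sum(p[t] * q.get(t, 0.0) for t in p)
    norm = (math.sqrt(sum(v * v for v in p.values()))
            * math.sqrt(sum(v * v for v in q.values())))
    return dot / norm if norm else 0.0

# User-based collaborative filtering: recommend items liked by the user
# whose tag profile is most similar to the target's.
target = "u1"
neighbours = sorted((cosine(profiles[target], profiles[u]), u)
                    for u in profiles if u != target)
best = neighbours[-1][1]
print("recommend to", target, ":", liked.get(best, set()))
```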

Relevance:

20.00%

Publisher:

Abstract:

Social tags in Web 2.0 are becoming another important information source for describing the content of items as well as profiling users' topic preferences. However, as arbitrary words given by users, tags contain a lot of noise, such as tag synonyms, semantic ambiguity, and a large number of personal tags used by only one user, which makes it challenging to use tags effectively for item recommendations. To solve these problems, this paper proposes to use a set of related tags, along with their weights, to represent the semantic meaning of each tag for each user individually. Hybrid recommendation generation approaches based on the weighted tags are proposed. We conducted experiments using a real-world dataset obtained from Amazon.com. The experimental results show that the proposed approaches outperform other state-of-the-art approaches.
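Here is a minimal sketch of the core idea of representing a tag's per-user meaning by a set of weighted related tags, assuming co-occurrence-based weights; the records and normalisation are hypothetical illustrations rather than the paper's exact method.

```python
from collections import Counter

# Hypothetical tagging records for one user: item -> set of tags assigned.
user_tagging = {
    "book1": {"ml", "python", "ai"},
    "book2": {"ml", "statistics"},
    "book3": {"python", "web"},
}

def weighted_meaning(tag, records):
    """Represent `tag` for this user by the tags co-occurring with it,
    weighted by normalised co-occurrence counts."""
    co = Counter()
    for tags in records.values():
        if tag in tags:
            co.update(t for t in tags if t != tag)
    total = sum(co.values())
    return {t: round(c / total, 2) for t, c in co.items()}

print(weighted_meaning("ml", user_tagging))
# e.g. {'python': 0.33, 'ai': 0.33, 'statistics': 0.33}
```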

Relevance:

20.00%

Publisher:

Abstract:

In this feasibility study, an organic plastic scintillator is calibrated against ionisation chamber measurements and then embedded in a polymer gel dosimeter to obtain a quasi-4D experimental measurement of a radiation field. This hybrid dosimeter was irradiated with a linear accelerator, with temporal measurements of the dose rate acquired by the scintillator and spatial measurements acquired with the gel dosimeter. The detectors employed in this work are radiologically equivalent, and we show that neither detector perturbs the intensity of the radiation field of the other. By employing these detectors in concert, spatial and temporal variations in the radiation intensity can now be detected, and gel dosimeters can be calibrated for absolute dose from a single irradiation.
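To illustrate the calibration idea in the final sentence, here is a minimal numerical sketch: integrating the scintillator's dose-rate trace gives the absolute dose at one point, which can then scale the gel's relative 3D distribution. All numbers, sampling rates and array shapes are hypothetical.

```python
import numpy as np

# Hypothetical scintillator dose-rate trace (Gy/s) sampled at 10 Hz
# during a single irradiation.
dt = 0.1
dose_rate = np.array([0.0, 0.02, 0.05, 0.05, 0.05, 0.05, 0.02, 0.0])

# Absolute dose at the scintillator's location: integrate rate over time.
absolute_dose = np.trapz(dose_rate, dx=dt)

# Hypothetical relative 3D gel readout, normalised so the voxel containing
# the scintillator reads 1.0; scaling by the absolute dose calibrates the
# whole volume.
relative_gel = np.random.uniform(0.2, 1.0, size=(4, 4, 4))
relative_gel[2, 2, 2] = 1.0
calibrated_gel = relative_gel * absolute_dose

print(f"absolute dose at scintillator: {absolute_dose:.4f} Gy")
print(f"calibrated max voxel dose:     {calibrated_gel.max():.4f} Gy")
```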

Relevance:

20.00%

Publisher:

Abstract:

In the analysis of medical images for computer-aided diagnosis and therapy, segmentation is often required as a preliminary step. Medical image segmentation is a complex and challenging task due to the complex nature of the images. The brain has a particularly complicated structure, and its precise segmentation is very important for detecting tumors, edema and necrotic tissue in order to prescribe appropriate therapy. Magnetic Resonance Imaging (MRI) is an important diagnostic imaging technique for the early detection of abnormal changes in tissues and organs. It possesses good contrast resolution for different tissues and is thus preferred over Computerized Tomography for brain studies, which is why the majority of research in medical image segmentation concerns MR images. At the core of this research, a set of MR images was segmented using standard image segmentation techniques to isolate a brain tumor from the other regions of the brain. The resulting images from the different segmentation techniques were then compared with each other and analyzed by professional radiologists to determine which technique is the most accurate. Experimental results show that Otsu's thresholding method is the most suitable for segmenting a brain tumor from a Magnetic Resonance image.
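For reference, here is a minimal sketch of Otsu thresholding applied to a synthetic two-class image using scikit-image; Otsu's method picks the grey-level threshold that maximises between-class variance. The intensity distributions below are hypothetical stand-ins for MR data, not real patient images.

```python
import numpy as np
from skimage.filters import threshold_otsu

# Synthetic stand-in for an MR slice: a darker "healthy tissue" class and a
# brighter "tumour" class drawn from two normal distributions.
image = np.concatenate([
    np.random.normal(60, 10, 5000),
    np.random.normal(180, 15, 1000),
]).reshape(100, 60)

t = threshold_otsu(image)    # threshold maximising between-class variance
mask = image > t             # binary segmentation of the bright region
print(f"Otsu threshold: {t:.1f}, segmented pixels: {mask.sum()}")
```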