959 results for MARC format
Abstract:
The Bangkok Metropolitan Region (BMR) is the centre of various major activities in Thailand, including politics, industry, agriculture, and commerce. Consequently, the BMR has the largest and densest population in Thailand, and the demand for housing in the BMR is also the largest, especially in subdivision developments. For these reasons, subdivision development in the BMR has increased substantially over the past 20 years, generating large numbers of subdivision developments (AREA, 2009; Kridakorn Na Ayutthaya & Tochaiwat, 2010). However, this dramatic growth has caused several problems, including unsustainable development, especially for subdivision neighbourhoods in the BMR. Rating tools exist that encourage sustainable neighbourhood design in subdivision development, but they still have practical problems: they do not cover the scale of the development entirely, and they concentrate more on the social and environmental conservation aspects, which have not been fully accepted by developers (Boonprakub, 2011; Tongcumpou & Harvey, 1994). These factors strongly confirm the need for an appropriate rating tool for sustainable subdivision neighbourhood design in the BMR. To improve the level of acceptance from all stakeholders in the subdivision development industry, the new rating tool should be built on an approach that unites the social, environmental, and economic dimensions, such as the eco-efficiency principle. Eco-efficiency is a sustainability indicator introduced by the World Business Council for Sustainable Development (WBCSD) in 1992, defined as the ratio of the product or service value to its environmental impact (Lehni & Pepper, 2000; Sorvari et al., 2009). The eco-efficiency indicator addresses business concerns while simultaneously accounting for social and environmental impacts. This study aims to develop a new rating tool named "Rating for Sustainable Subdivision Neighbourhood Design (RSSND)". The RSSND methodology is developed through a combination of literature reviews, field surveys, eco-efficiency model development, a trial-and-error technique, and a tool validation process. All required data were collected by field surveys from July to November 2010. The eco-efficiency model is a combination of three mathematical models attributable to the neighbourhood subdivision design: the neighbourhood property price (NPP) model, the neighbourhood development cost (NDC) model, and the neighbourhood occupancy cost (NOC) model. The NPP model is formulated using the hedonic price model approach, while the NDC and NOC models are formulated using multiple regression analysis. The trial-and-error technique is adopted to simplify the complex mathematical eco-efficiency model into a user-friendly rating tool format. The credibility of the RSSND has been validated using eight rated and non-rated subdivisions. The tool is expected to meet the requirements of all stakeholders: supporting the social activities of residents, maintaining the environmental condition of the development and surrounding areas, and meeting the economic requirements of developers.
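The abstract defines eco-efficiency as the ratio of product or service value to environmental impact and builds it from three regression-based models (NPP, NDC, NOC). The sketch below is a purely hypothetical illustration of that structure: the design attributes, coefficients, and the way the three model outputs are combined into a ratio are assumptions, not the RSSND formulation.

```python
# Illustrative sketch only: linear stand-ins for the NPP, NDC and NOC models and
# one possible WBCSD-style value-to-impact ratio. All numbers are made up.
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical neighbourhood design attributes (e.g. green-space ratio, road
# hierarchy, plot density) for a sample of surveyed subdivisions.
X = rng.uniform(0.0, 1.0, size=(40, 3))

npp = X @ np.array([120.0, 40.0, -30.0]) + 200.0   # neighbourhood property price (value)
ndc = X @ np.array([60.0, 25.0, 10.0]) + 80.0      # neighbourhood development cost
noc = X @ np.array([15.0, 5.0, 20.0]) + 30.0       # neighbourhood occupancy cost

# One possible ratio of value to impact-related cost; the actual RSSND model may
# combine the three terms differently.
eco_efficiency = npp / (ndc + noc)
print(np.round(eco_efficiency[:5], 2))
```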
Abstract:
The 2-hour game jam was performed as part of the State Library of Queensland 'Garage Gamer' series of events, summer 2013, at the SLQ exhibition. An aspect of the exhibition was the series of 'Level Up' game nights. We hosted the first of these, under the auspices of brIGDA, Game On. It was a party, but the focal point of the event was a live-streamed 2-hour game jam. Game jams have become popular amongst the game development and design community in recent years, particularly with the growth of the Global Game Jam, a yearly event which brings thousands of game makers together across different sites in different countries. Other established jams take place online, for example the Ludum Dare challenge, which has been running since 2002. Other challenges follow the same model in more intimate circumstances, and it is now common to find institutions and groups holding their own small local game-making jams. There are variations around the format, some jams being more competitive than others, but a common aspect is the creation of an intense creative crucible centred around teamwork and 'accelerated game development'. Works (games) produced during these intense events often display more experimental qualities than those undertaken as commercial projects. In part this is because the typical jam starts with a conceptual design brief, perhaps a single word, or, in the case of the specific game jam described in this paper, three words. Teams have to envision the challenge keyword/s as a game design using whatever skills and technologies they can, and produce a finished working game in the time given. Game jams thus provide design researchers with extraordinary fodder, and recent years have also seen a number of projects which seek to illuminate the design process as seen in these events. For example, Gaydos, Harris and Martinez discuss the opportunity of the jam to expose students to principles of design process and design spaces (2011). Rouse muses on the game jam 'as radical practice' and a 'corrective to game creation as it is normally practiced'. His observations about his own experience in a jam emphasise the same artistic endeavour foregrounded earlier, where the experience is about creation that is divorced from the instrumental motivations of commercial game design (Rouse 2011) and where the focus is on process over product. Other participants remark on the social milieu of the event as a critical factor and the collaborative opportunity as a rich site to engage participants in design processes (Shin et al., 2012). Shin et al. are particularly interested in the notion of the site of the process and the ramifications of participants being in the same location. They applaud the more localised event where there is an emphasis on local participation and collaboration. For other commentators, it is specifically the social experience in the place of the jam that is the most important aspect (see Keogh 2011): not the material site but rather the physical, embodied experience of 'being there' and being part of the event. Participants talk about game jams they have attended in a manner similar to the observations made by Dourish, where the experience is layered on top of the physical space of the event (Dourish 2006). It is as if the event has taken on qualities of place, where we find echoes of Tuan's description of a particular site having an aura of history that makes it a very different place, redolent and evocative (Tuan 1977).
The 2-hour game jam held during the SLQ Garage Gamer program was all about the social experience.
Abstract:
Crashes that occur on motorways contribute to a significant proportion (40-50%) of non-recurrent motorway congestion. Hence, reducing the frequency of crashes assists in addressing congestion issues (Meyer, 2008). Analysing traffic conditions and discovering risky traffic trends and patterns are essential foundations of crash likelihood estimation studies and still require more attention and investigation. In this paper we show, through data mining techniques, that there is a relationship between pre-crash traffic flow patterns and crash occurrence on motorways, compare these patterns with normal traffic trends, and argue that this knowledge has the potential to improve the accuracy of existing crash likelihood estimation models and to open the path for new development approaches. The data for the analysis were extracted from records collected between 2007 and 2009 on the Shibuya and Shinjuku lines of the Tokyo Metropolitan Expressway in Japan. The dataset includes a total of 824 rear-end and sideswipe crashes that have been matched with their corresponding traffic flow data using an incident detection algorithm. Traffic trends (traffic speed time series) revealed that crashes can be clustered with regard to the dominant traffic patterns prior to the crash occurrence. The K-Means clustering algorithm was applied to determine dominant pre-crash traffic patterns. In the first phase of this research, traffic regimes were identified by analysing crashes and normal traffic situations using half an hour of speed data from locations upstream of the crashes. The second phase then investigated different combinations of speed risk indicators to distinguish crashes from normal traffic situations more precisely. Five major trends were found in the first phase for both high-risk and normal conditions, and the identified traffic regimes differed in their speed trends. Moreover, the second phase shows that the spatiotemporal difference of speed is a better risk indicator among the different combinations of speed-related risk indicators. Based on these findings, crash likelihood estimation models can be fine-tuned to increase the accuracy of estimations and minimise false alarms.
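As a rough illustration of the clustering step described above, the sketch below applies K-Means to synthetic half-hour upstream speed series. The window length, the five-cluster choice, and the generated data are assumptions for demonstration only, not the Tokyo Metropolitan Expressway dataset.

```python
# Hedged sketch: clustering pre-crash speed time series with K-Means to expose
# dominant traffic regimes. Data are synthetic; k=5 follows the abstract's finding
# of five major trends.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(1)

# Each row: 30 minutes of upstream speed readings (km/h) preceding one crash.
free_flow = 90 + rng.normal(0, 5, size=(100, 30))
congested = 35 + rng.normal(0, 8, size=(100, 30))
breakdown = np.linspace(90, 30, 30) + rng.normal(0, 6, size=(100, 30))
speeds = np.vstack([free_flow, congested, breakdown])

kmeans = KMeans(n_clusters=5, n_init=10, random_state=0).fit(speeds)

# Cluster centroids approximate the dominant pre-crash speed trends.
for i, centroid in enumerate(kmeans.cluster_centers_):
    print(f"regime {i}: mean speed {centroid.mean():.1f} km/h, "
          f"drop {centroid[0] - centroid[-1]:.1f} km/h over 30 min")
```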
Abstract:
Israeli Organised Crime (IOC) gained prominence in the 1990s for its involvement in the manufacturing and wholesale distribution of MDMA through traditional trafficking networks across Europe. Equipped with astute business acumen and an entrepreneurial spirit, IOC dominated MDMA trafficking in Europe for more than a decade and remains a major participant in this drug market. The paper analyses the entrepreneurial activities of IOC within the context of the MDMA market in Europe between 1990 and 2005, using the Crime Business Analysis Matrix (CBAM) proffered by Dean et al. (2010). The study is in two parts. Part A provides a review of the literature as it pertains to IOC and its involvement in the European drug market, while Part B provides a qualitative analysis of the criminal business practices and entrepreneurialism of IOC within this context.
Abstract:
This paper presents a model for the generation of a MAC tag using a stream cipher. The input message is used indirectly to control segments of the keystream that form the MAC tag. Several recent proposals can be considered as instances of this general model, as they all perform message accumulation in this way. However, they use slightly different processes in the message preparation and finalisation phases. We examine the security of this model for different options and against different types of attack, and conclude that the indirect injection model can be used to generate MAC tags securely for certain combinations of options. Careful consideration is required at the design stage to avoid combinations of options that result in susceptibility to forgery attacks. Additionally, some implementations may be vulnerable to side-channel attacks if used in Authenticated Encryption (AE) algorithms. We give design recommendations to provide resistance to these attacks for proposals following this model.
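As a toy illustration of the general model (not any of the specific proposals examined), the sketch below lets message bits indirectly control which keystream segments are accumulated into the tag. The keystream generator is a deliberately insecure stand-in for a stream cipher, and the accumulation and finalisation choices are hypothetical.

```python
# Toy illustration of the "indirect injection" model: message bits do not enter
# the tag directly, but decide which keystream segments are accumulated into it.
def toy_keystream(key: int, nbits: int):
    """Insecure LCG-based bit generator, standing in for a real stream cipher."""
    state = key & 0xFFFFFFFF
    for _ in range(nbits):
        state = (1103515245 * state + 12345) & 0xFFFFFFFF
        yield (state >> 16) & 1

def indirect_injection_mac(key: int, message_bits: list[int], tag_bits: int = 32) -> int:
    ks = toy_keystream(key, (len(message_bits) + 1) * tag_bits)
    acc = 0
    # Message accumulation: each message bit selects whether the next keystream
    # segment is XORed into the accumulator.
    for bit in message_bits:
        segment = 0
        for _ in range(tag_bits):
            segment = (segment << 1) | next(ks)
        if bit:
            acc ^= segment
    # Hypothetical finalisation: mask the accumulator with one more segment.
    final = 0
    for _ in range(tag_bits):
        final = (final << 1) | next(ks)
    return acc ^ final

msg = [1, 0, 1, 1, 0, 0, 1, 0]
print(hex(indirect_injection_mac(0xCAFEBABE, msg)))
```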
Abstract:
Non-linear feedback shift register (NLFSR) ciphers are cryptographic tools of choice for industry, especially for mobile communications. Their attractive feature is high efficiency when implemented in hardware or software. However, the main problem of NLFSR ciphers is that their security is still not well investigated. This paper makes progress in the study of the security of NLFSR ciphers. In particular, we show a distinguishing attack on linearly filtered NLFSR (LF-NLFSR) ciphers. We extend the attack to a linear combination of LF-NLFSRs. We also investigate the security of a modified version of the Grain stream cipher and show its vulnerability to both key recovery and distinguishing attacks.
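For readers unfamiliar with the construction, the following minimal sketch shows the shape of a linearly filtered NLFSR: a non-linear state update combined with a linear (XOR) output filter. The register length, feedback function and filter taps are invented for illustration and do not correspond to any cipher analysed in the paper.

```python
# Minimal sketch of an LF-NLFSR: non-linear state update, linear output filter.
def lf_nlfsr(state_bits, n_output_bits):
    s = list(state_bits)                 # e.g. a 16-bit internal state
    out = []
    for _ in range(n_output_bits):
        # Linear filter: output bit is an XOR of a few state positions.
        out.append(s[0] ^ s[3] ^ s[9])
        # Non-linear feedback: AND terms make the state update non-linear.
        fb = s[0] ^ s[5] ^ (s[2] & s[11]) ^ (s[7] & s[13])
        s = s[1:] + [fb]                 # shift and insert the feedback bit
    return out

print(lf_nlfsr([1, 0, 1, 1, 0, 0, 1, 0, 1, 1, 1, 0, 0, 1, 0, 1], 20))
```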
Abstract:
Trivium is a bit-based stream cipher in the final portfolio of the eSTREAM project. In this paper, we apply the algebraic attack approach of Berbain et al. to Trivium-like ciphers and perform new analyses on them. We demonstrate a new algebraic attack on Bivium-A. This attack requires less time and memory than previous techniques to recover Bivium-A's initial state. Although our attacks on Bivium-B, Trivium and Trivium-N are worse than exhaustive key search, the systems of equations which are constructed are smaller and less complex compared to previous algebraic analyses. We also answer an open question posed by Berbain et al. on the feasibility of applying their technique to Trivium-like ciphers. Factors which can affect the complexity of our attack on Trivium-like ciphers are discussed in detail. Analyses of Bivium-B and Trivium-N are omitted from this manuscript; the full paper is available on the IACR ePrint Archive.
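For context, the sketch below implements the publicly specified Trivium keystream generation (three shift registers of 93, 84 and 111 bits); the key/IV loading and the warm-up rounds are omitted, and the algebraic attack itself is not reproduced here. The Bivium variants are reduced versions that use only two of the three registers.

```python
# Trivium keystream generation as in the eSTREAM specification (0-based indices).
def trivium_keystream(state, nbits):
    """state: list of 288 bits after key/IV initialisation (omitted here)."""
    s = list(state)
    for _ in range(nbits):
        t1 = s[65] ^ s[92]                 # spec: s66 + s93
        t2 = s[161] ^ s[176]               # spec: s162 + s177
        t3 = s[242] ^ s[287]               # spec: s243 + s288
        yield t1 ^ t2 ^ t3                 # keystream bit
        t1 ^= (s[90] & s[91]) ^ s[170]     # + s91*s92 + s171
        t2 ^= (s[174] & s[175]) ^ s[263]   # + s175*s176 + s264
        t3 ^= (s[285] & s[286]) ^ s[68]    # + s286*s287 + s69
        s[0:93] = [t3] + s[0:92]           # register A (93 bits)
        s[93:177] = [t1] + s[93:176]       # register B (84 bits)
        s[177:288] = [t2] + s[177:287]     # register C (111 bits)

# Placeholder state: a real run needs the key/IV loading and warm-up rounds.
demo_state = [0] * 288
demo_state[0] = demo_state[100] = demo_state[200] = 1
print(list(trivium_keystream(demo_state, 16)))
```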
Abstract:
The goals of this project were to determine the education and training needs of health consumers and the relevant health workforce, and to identify and map the available education and training activities and resources. The methods used to collect the data included online surveys and one-on-one interviews with relevant patients and their carers. The project manager actively sought to engage with key wound management leaders and advanced clinicians to gain their support and views on the priority education and training issues. The response to all data collection methods was pleasing, with almost five hundred responses to the general wound workforce online survey. The data supported the need for more wound management education and training and identified particular topics of need, such as utilising wound investigations and understanding wound products, pharmaceuticals and devices. The occupational groups with the highest need appear to be those working in primary health care, such as practice nurses and GPs, and those working in residential aged care facilities. The education and training stocktake identified a wide range of activities currently available, the majority being provided in a face-to-face format. The next stage of the project will be to form clear and achievable priority action areas based on the available data. An online directory of wound management education and training activities and resources will be developed, and further development will be undertaken on a knowledge and skills framework for the wound management workforce. Additionally, transfer-of-learning factors in the general practice environment will be assessed, and strategies will be developed to improve pre-entry or undergraduate wound management training within relevant higher education programs.
Abstract:
Background: The implementation of the Australian Consumer Law in 2011 highlighted the need for better use of injury data to improve the effectiveness and responsiveness of product safety (PS) initiatives. In the PS system, resources are allocated to different priority issues using risk assessment tools. The rapid exchange of information (RAPEX) tool for prioritising hazards, developed by the European Commission, is currently being adopted in Australia. Injury data are required as a basic input to the RAPEX tool in the risk assessment process. One of the challenges in utilising injury data in the PS system is the complexity of translating detailed clinically coded data into broad categories such as those used in the RAPEX tool.

Aims: This study aims to translate hospital burns data into a simplified format by mapping the International Statistical Classification of Diseases and Related Health Problems, Tenth Revision, Australian Modification (ICD-10-AM) burn codes into RAPEX severity rankings, and to use these rankings to identify priority areas in childhood product-related burns data.

Methods: ICD-10-AM burn codes were mapped into four levels of severity using the RAPEX guide table, assigning rankings from 1 to 4 in order of increasing severity. RAPEX rankings were determined by the thickness and surface area of the burn (BSA), with information extracted from the fourth character of T20-T30 codes for burn thickness, and the fourth and fifth characters of T31 codes for the BSA. Following the mapping process, secondary data analysis of 2008-2010 Queensland Hospital Admitted Patient Data Collection (QHAPDC) paediatric data was conducted to identify priority areas in product-related burns.

Results: The application of RAPEX rankings to QHAPDC burn data showed that approximately 70% of paediatric burns in Queensland hospitals were categorised under RAPEX levels 1 and 2, 25% under RAPEX levels 3 and 4, and the remaining 5% were unclassifiable. In the PS system, prioritisations are made for issues categorised under RAPEX levels 3 and 4. Analysis of external cause codes within these levels showed that flammable materials (for children aged 10-15 years) and hot substances (for children aged under 2 years) were the most frequently identified products.

Discussion and conclusions: The mapping of ICD-10-AM burn codes into RAPEX rankings showed a favourable degree of compatibility between the two classification systems, suggesting that ICD-10-AM coded burn data can be simplified to more effectively support PS initiatives. Additionally, the secondary data analysis showed that only 25% of all admitted burn cases in Queensland were severe enough to trigger a PS response.
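A sketch of the kind of mapping described in the Methods is shown below: burn thickness is read from the fourth character of T20-T30 codes and burn surface area from the fourth and fifth characters of T31 codes, then translated to a RAPEX rank of 1-4. The thresholds and rank assignments are hypothetical, since the RAPEX guide table itself is not reproduced in the abstract.

```python
# Hedged sketch of an ICD-10-AM burn code to RAPEX rank mapping. The cut-offs and
# rank assignments below are invented placeholders, not the study's actual rules.
def rapex_rank(icd10am_code):
    """Return a hypothetical RAPEX severity rank (1-4) for an ICD-10-AM burn code."""
    code = icd10am_code.replace(".", "").upper()
    category, extra = code[:3], code[3:]
    if category == "T31" and extra:
        bsa_decile = int(extra[0])                              # 4th char: total BSA decile
        full_thickness = int(extra[1]) if len(extra) > 1 else 0  # 5th char (ICD-10-AM)
        if full_thickness >= 2 or bsa_decile >= 3:              # hypothetical cut-offs
            return 4
        return 3 if bsa_decile >= 1 else 2
    if category in {f"T{n}" for n in range(20, 31)} and extra:
        thickness = int(extra[0])                               # 4th char: burn depth
        return {0: 1, 1: 1, 2: 2, 3: 4}.get(thickness, 1)       # hypothetical mapping
    return None                                                 # code not handled here

print(rapex_rank("T24.3"), rapex_rank("T31.21"), rapex_rank("T31.0"))
```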
Abstract:
Purpose: This randomized, multicenter trial compared first-line trastuzumab plus docetaxel versus docetaxel alone in patients with human epidermal growth factor receptor 2 (HER2)-positive metastatic breast cancer (MBC).

Patients and Methods: Patients were randomly assigned to six cycles of docetaxel 100 mg/m² every 3 weeks, with or without trastuzumab 4 mg/kg loading dose followed by 2 mg/kg weekly until disease progression.

Results: A total of 186 patients received at least one dose of the study drug. Trastuzumab plus docetaxel was significantly superior to docetaxel alone in terms of overall response rate (61% v 34%; P = .0002), overall survival (median, 31.2 v 22.7 months; P = .0325), time to disease progression (median, 11.7 v 6.1 months; P = .0001), time to treatment failure (median, 9.8 v 5.3 months; P = .0001), and duration of response (median, 11.7 v 5.7 months; P = .009). There was little difference in the number and severity of adverse events between the arms. Grade 3 to 4 neutropenia was seen more commonly with the combination (32%) than with docetaxel alone (22%), and there was a slightly higher incidence of febrile neutropenia in the combination arm (23% v 17%). One patient in the combination arm experienced symptomatic heart failure (1%). Another patient experienced symptomatic heart failure 5 months after discontinuation of trastuzumab because of disease progression, while being treated with an investigational anthracycline for 4 months.

Conclusion: Trastuzumab combined with docetaxel is superior to docetaxel alone as first-line treatment of patients with HER2-positive MBC in terms of overall survival, response rate, response duration, time to progression, and time to treatment failure, with little additional toxicity. © 2005 by American Society of Clinical Oncology.
Abstract:
Digital human modeling (DHM) systems have undergone significant development in recent years. They have achieved steadily growing importance in the fields of ergonomic workplace design, product development, product usability, ergonomic research, ergonomic education, audiovisual marketing and the entertainment industry. They help to design ergonomic products as well as healthy and safe socio-technical work systems. In the domain of scientific DHM systems, however, no industry-specific standard interfaces are defined which could facilitate the exchange of 3D solid body data, anthropometric data or motion data. The focus of this article is to provide an overview of requirements for a reliable data exchange between different DHM systems in order to identify suitable file formats. Examples from the literature are discussed in detail. Methods: As a first step, a literature review is conducted on existing studies and file formats for exchanging data between different DHM systems. The identified file formats can be structured into different categories: static 3D solid body data exchange, anthropometric data exchange, motion data exchange and comprehensive data exchange. Each file format is discussed, and advantages as well as disadvantages for the DHM context are pointed out. Case studies are furthermore presented which show first approaches to exchanging data between DHM systems. Lessons learnt are briefly summarized. Results: A selection of suitable file formats for data exchange between DHM systems is determined from the literature review.
Abstract:
The application of Bluetooth (BT) technology to transportation has enabled researchers to make accurate travel time observations on freeway and arterial roads. Bluetooth traffic data are generally incomplete, for they only relate to those vehicles that are equipped with Bluetooth devices and that are detected by the Bluetooth sensors of the road network. The fraction of detected vehicles versus the total number of transiting vehicles is often referred to as the Bluetooth Penetration Rate (BTPR). The aim of this study is to precisely define the spatio-temporal relationship between the quantities that become available through the partial, noisy BT observations and the hidden variables that describe the actual dynamics of vehicular traffic. To do so, we propose to incorporate a multi-class traffic model into a Sequential Monte Carlo estimation algorithm. Our framework has been applied to empirical travel time investigations in the Brisbane metropolitan region.
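The sketch below shows a generic bootstrap particle filter (one form of Sequential Monte Carlo) fusing noisy, intermittent travel-time observations with a simple dynamics model. The random-walk transition and Gaussian observation models are placeholders rather than the multi-class traffic model used in the study.

```python
# Generic bootstrap particle filter sketch: predict with a placeholder dynamics
# model, weight by the likelihood of a BT travel-time observation, then resample.
import numpy as np

rng = np.random.default_rng(42)
n_particles = 1000

def transition(states):
    """Placeholder dynamics: travel time evolves as a bounded random walk (seconds)."""
    return np.clip(states + rng.normal(0.0, 5.0, size=states.shape), 30.0, 600.0)

def likelihood(observation, states, sigma=15.0):
    """Gaussian observation model: BT-measured travel time around the true value."""
    return np.exp(-0.5 * ((observation - states) / sigma) ** 2)

particles = rng.uniform(60.0, 300.0, size=n_particles)   # initial travel-time guesses
bt_observations = [150.0, 160.0, None, 185.0]            # None = no BT match this interval

for obs in bt_observations:
    particles = transition(particles)                    # predict
    if obs is not None:                                  # update only when a BT
        weights = likelihood(obs, particles)             # observation is available
        weights /= weights.sum()
        idx = rng.choice(n_particles, size=n_particles, p=weights)
        particles = particles[idx]                       # resample
    print(f"estimated travel time: {particles.mean():.1f} s")
```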
Abstract:
The study of the relationship between macroscopic traffic parameters, such as flow, speed and travel time, is essential to understanding the behaviour of freeway and arterial roads. However, the temporal dynamics of these parameters are difficult to model, especially for arterial roads, where the process of traffic change is driven by a variety of variables. The introduction of Bluetooth technology into the transportation area has proven exceptionally useful for monitoring vehicular traffic, as it allows reliable estimation of travel times and traffic demands. In this work, we propose an approach based on Bayesian networks for analyzing and predicting the complex dynamics of flow or volume, based on travel time observations from Bluetooth sensors. The spatio-temporal relationship between volume and travel time is captured through a first-order transition model and a univariate Gaussian sensor model. The two models are trained and tested on travel time and volume data from an arterial link, collected over a period of six days. To reduce the computational costs of the inference tasks, volume is converted into a discrete variable; the discretization process is carried out through a Self-Organizing Map. Preliminary results show that a simple Bayesian network can effectively estimate and predict the complex temporal dynamics of arterial volumes from the travel time data. Not only is the model well suited to producing posterior distributions over single past, current and future states, but it also allows computing estimates of joint distributions over sequences of states. Furthermore, the Bayesian network can achieve excellent prediction even when the stream of travel time observations is partially incomplete.
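The filtering step of such a model can be sketched as follows: discrete volume states (as would come out of the SOM discretization), a first-order transition matrix, and a univariate Gaussian sensor model linking each volume state to observed travel time. All numbers are invented for illustration and are not the trained parameters from the study.

```python
# Minimal forward-filtering sketch for a discrete-volume, Gaussian-observation model.
import numpy as np

states = ["low", "medium", "high"]                 # discretized volume levels
T = np.array([[0.80, 0.15, 0.05],                  # P(next volume | current volume)
              [0.10, 0.75, 0.15],
              [0.05, 0.20, 0.75]])
mu = np.array([90.0, 140.0, 220.0])                # mean travel time (s) per state
sigma = np.array([15.0, 25.0, 40.0])               # travel-time std dev per state

def gaussian_pdf(x, mean, std):
    return np.exp(-0.5 * ((x - mean) / std) ** 2) / (std * np.sqrt(2 * np.pi))

belief = np.array([1 / 3, 1 / 3, 1 / 3])           # uniform prior over volume states
for travel_time in [95.0, 130.0, None, 230.0]:     # None = missing BT observation
    belief = T.T @ belief                          # predict: one-step transition
    if travel_time is not None:
        belief *= gaussian_pdf(travel_time, mu, sigma)   # update with sensor model
        belief /= belief.sum()
    print({s: round(p, 3) for s, p in zip(states, belief)})
```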
Abstract:
A test of the useful field of view was introduced more than two decades ago and was designed to reflect the visual difficulties that older adults experience with everyday tasks. Importantly, the useful field of view is one of the most extensively researched and promising predictor tests for a range of driving outcome measures, including driving ability and crash risk, as well as other everyday tasks. Currently available commercial versions of the test can be administered using personal computers and measure the speed of visual processing for rapid detection and localisation of targets under conditions of divided visual attention and in the presence and absence of visual clutter. The test is believed to assess higher-order cognitive abilities, but performance also relies on visual sensory function, since targets must be visible in order to be attended to. The format of the useful field of view test has been modified over the years; the original version estimated the spatial extent of the useful field of view, while the latest versions measure visual processing speed. While deficits in the useful field of view are associated with functional impairments in everyday activities in older adults, there is also emerging evidence from several research groups that improvements in visual processing speed can be achieved through training. These improvements have been shown to reduce crash risk and have a positive impact on health and functional well-being, with the potential to increase the mobility, and hence independence, of older adults.