Abstract:
The traditional residential development process uses pre-sales to manage risk and lock in demand so that development financiers can be kept happy. However, obtaining the requisite level of pre-sales is an expensive business, a cost that is ultimately borne by the new home buyer. With housing affordability at the front of everyone’s mind, we ask: is there a better way? How can housing be supplied more innovatively? A research collaboration between QUT, the Swinburne Social Research Unit and the Office of the Victorian Government has been investigating this very issue.
The dual nature of information systems in enabling a new wave of hardware ventures: Towards a theory
Abstract:
Hardware ventures are emerging entrepreneurial firms that create new market offerings based on development of digital devices. These ventures are important elements in the global economy but have not yet received much attention in the literature. Our interest in examining hardware ventures is specifically in the role that information system (IS) resources play in enabling them. We ask how the role of IS resources for hardware ventures can be conceptualized and develop a framework for assessment. Our framework builds on the distinction of operand and operant resources and distinguishes between two key lifecycle stages of hardware ventures: start-up and growth. We show how this framework can be used to discuss the role, nature, and use of IS for hardware ventures and outline empirical research strategies that flow from it. Our work contributes to broadening and enriching the IS field by drawing attention to its role in significant and novel phenomena.
Abstract:
Although kimberlite pipes/bodies are usually the remains of volcanic vents, in-vent deposits, and subvolcanic intrusions, the terminology used for kimberlite rocks has largely developed independently of that used in mainstream volcanology. Existing kimberlite terminology is not descriptive and includes terms that are rarely used, are used differently, or are not used at all in mainstream volcanology. In addition, kimberlite bodies are altered to varying degrees, making application of genetic terminology difficult because original components and depositional textures are commonly masked by alteration. This paper recommends an approach to the terminology for kimberlite rocks that is consistent with usage for other volcanic successions. In modern terrains the eruption and emplacement origins of deposits can often be readily deduced, but this is often not the case for old, variably altered and deformed rock successions. A staged approach is required, whereby descriptive terminology is developed first, followed by application of genetic terminology once all features, including the effects of alteration on original texture and depositional features, together with contact relationships and setting, have been evaluated. Because many volcanic successions consist of both primary volcanic deposits and volcanic sediments, terminology must account for both possibilities.
Abstract:
Public buildings and large infrastructure are typically monitored by tens or hundreds of cameras, all capturing different physical spaces and observing different types of interactions and behaviours. However, to date, in large part due to limited data availability, crowd monitoring and operational surveillance research has focused on single-camera scenarios that are not representative of real-world applications. In this paper we present a new, publicly available database for large-scale crowd surveillance. Footage from 12 cameras for a full work day covering the main floor of a busy university campus building, including an internal and external foyer, elevator foyers, and the main external approach, is provided, alongside annotations for crowd counting (single- or multi-camera) and pedestrian flow analysis for 10 and 6 sites respectively. We describe how this large dataset can be used to perform distributed monitoring of building utilisation, and demonstrate its potential for understanding and learning the relationship between different areas of a building.
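As a sketch of the kind of distributed building-utilisation monitoring the dataset supports, per-camera crowd counts can be binned in time and aggregated into a building-level occupancy estimate. The function name and data layout below are illustrative assumptions, not part of the released database, and overlapping camera views would need de-duplication in practice.

```python
from collections import defaultdict

def building_utilisation(frame_counts):
    """Aggregate per-camera crowd counts into a per-interval
    building-level occupancy estimate.

    frame_counts: time-ordered iterable of (timestamp_minutes, camera_id, count).
    Returns {interval_start: total_count} over 15-minute bins, keeping the
    latest count seen per camera within each bin and summing across cameras.
    """
    bins = defaultdict(dict)
    for t, cam, n in frame_counts:
        interval = (t // 15) * 15
        bins[interval][cam] = n  # later observations overwrite earlier ones
    return {iv: sum(per_cam.values()) for iv, per_cam in sorted(bins.items())}

# hypothetical counts from two camera sites over half an hour
obs = [(0, "foyer", 12), (5, "elevator", 3), (14, "foyer", 15),
       (16, "foyer", 20), (20, "elevator", 5)]
print(building_utilisation(obs))  # → {0: 18, 15: 25}
```

The same per-bin totals could then be compared across sites to learn how occupancy in one area of the building predicts occupancy in another.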
Abstract:
The phase transition of single-layer molybdenum disulfide (MoS2) from the semiconducting 2H phase to the metallic 1T and then 1T′ phases, and the effect of this transition on the hydrogen evolution reaction (HER), are investigated in this work by density functional theory. Experimentally, 2H-MoS2 has been widely used as an excellent HER electrode and is easily charged. Here we find that negative charge has a significant impact on the structural phase transition in a MoS2 monolayer. Relative to the 2H structure, the thermodynamic stability of 1T-MoS2 increases with the negative charge state, and the kinetic energy barrier for the 2H-to-1T transition decreases from 1.59 to 0.27 eV when 4 e– are injected per MoS2 unit. Additionally, the 1T phase is found to transform spontaneously into the distorted structure (1T′ phase). In terms of activity toward the hydrogen evolution reaction, the 1T′-MoS2 structure is comparable to 2H-MoS2; when charge-transfer kinetics are taken into account, the catalytic activity of 1T′-MoS2 is superior to that of 2H-MoS2. Our findings suggest a possible new route to phase transitions in MoS2 and enrich understanding of its catalytic properties for the HER.
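A rough sense of what a barrier drop from 1.59 to 0.27 eV means kinetically can be had from a simple Arrhenius estimate. This is a back-of-envelope sketch, not part of the paper's DFT results, and the assumption of an unchanged attempt frequency is mine.

```python
import math

K_B_EV = 8.617333262e-5  # Boltzmann constant in eV/K

def arrhenius_speedup(e_high, e_low, temp_k=300.0):
    """Ratio of transition rates when the barrier drops from e_high
    to e_low (both in eV), assuming the same attempt frequency."""
    return math.exp((e_high - e_low) / (K_B_EV * temp_k))

# barriers from the abstract: 1.59 eV (neutral) vs 0.27 eV (4 e- per MoS2 unit)
speedup = arrhenius_speedup(1.59, 0.27)
print(f"{speedup:.1e}")  # ~1e22: the charged pathway is vastly faster at room temperature
```

An enhancement of this magnitude is why charge injection makes the 2H-to-1T transition feasible on laboratory timescales even though the neutral barrier is prohibitive.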
Abstract:
What type of probability theory best describes the way humans make judgments under uncertainty and decisions under conflict? Although rational models of cognition have become prominent and have achieved much success, they adhere to the laws of classical probability theory despite the fact that human reasoning does not always conform to these laws. For this reason we have seen the recent emergence of models based on an alternative probabilistic framework drawn from quantum theory. These quantum models show promise in addressing cognitive phenomena that have proven recalcitrant to modeling by means of classical probability theory. This review compares and contrasts probabilistic models based on Bayesian or classical versus quantum principles, and highlights the advantages and disadvantages of each approach.
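The kind of departure from classical probability that motivates these quantum models can be shown with a minimal two-dimensional example: when two incompatible yes/no questions are represented as projectors onto different rays, the probability of answering "yes" to both depends on question order, which classical (Bayesian) conjunction forbids. The state and angles below are illustrative choices, not taken from the review.

```python
import numpy as np

def projector(theta):
    """Rank-1 projector onto the 'yes' ray at angle theta."""
    v = np.array([np.cos(theta), np.sin(theta)])
    return np.outer(v, v)

def ask(state, proj):
    """Probability of a 'yes' answer and the collapsed, renormalised state."""
    out = proj @ state
    p = float(out @ out)
    return p, (out / np.sqrt(p) if p > 0 else out)

psi = np.array([1.0, 0.0])      # initial belief state (illustrative)
P_a = projector(np.pi / 12)     # question A's 'yes' ray
P_b = projector(np.pi / 4)      # question B's 'yes' ray, incompatible with A

p_a, psi_a = ask(psi, P_a)      # ask A first, then B
p_ab = p_a * ask(psi_a, P_b)[0]
p_b, psi_b = ask(psi, P_b)      # ask B first, then A
p_ba = p_b * ask(psi_b, P_a)[0]

# classically Pr(A and B) is order-free; here the order changes the answer
print(round(p_ab, 3), round(p_ba, 3))  # → 0.7 0.375
```

The non-commutativity of the two projectors is what produces the order effect, one of the recalcitrant phenomena (question-order effects in surveys) that quantum models address.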
Abstract:
In 2009, the National Research Council of the National Academies released a report on A New Biology for the 21st Century. The council preferred the term ‘New Biology’ to capture the convergence and integration of the various disciplines of biology. The National Research Council stressed: ‘The essence of the New Biology, as defined by the committee, is integration—re-integration of the many sub-disciplines of biology, and the integration into biology of physicists, chemists, computer scientists, engineers, and mathematicians to create a research community with the capacity to tackle a broad range of scientific and societal problems.’ They define the ‘New Biology’ as ‘integrating life science research with physical science, engineering, computational science, and mathematics’. The National Research Council reflected: 'Biology is at a point of inflection. Years of research have generated detailed information about the components of the complex systems that characterize life––genes, cells, organisms, ecosystems––and this knowledge has begun to fuse into greater understanding of how all those components work together as systems. Powerful tools are allowing biologists to probe complex systems in ever greater detail, from molecular events in individual cells to global biogeochemical cycles. Integration within biology and increasingly fruitful collaboration with physical, earth, and computational scientists, mathematicians, and engineers are making it possible to predict and control the activities of biological systems in ever greater detail.' The National Research Council contended that the New Biology could address a number of pressing challenges. First, it stressed that the New Biology could ‘generate food plants to adapt and grow sustainably in changing environments’. Second, the New Biology could ‘understand and sustain ecosystem function and biodiversity in the face of rapid change’. Third, the New Biology could ‘expand sustainable alternatives to fossil fuels’. 
Moreover, it was hoped that the New Biology could lead to a better understanding of individual health: ‘The New Biology can accelerate fundamental understanding of the systems that underlie health and the development of the tools and technologies that will in turn lead to more efficient approaches to developing therapeutics and enabling individualized, predictive medicine.’ Biological research has certainly been changing direction in response to changing societal problems. Over the last decade, increasing awareness of the impacts of climate change and dwindling supplies of fossil fuels can be seen to have generated investment in fields such as biofuels, climate-ready crops and storage of agricultural genetic resources. In considering biotechnology’s role in the twenty-first century, biological futurist Carlson’s firm Biodesic states: ‘The problems the world faces today – ecosystem responses to global warming, geriatric care in the developed world or infectious diseases in the developing world, the efficient production of more goods using less energy and fewer raw materials – all depend on understanding and then applying biology as a technology.’ This collection considers the roles of intellectual property law in regulating emerging technologies in the biological sciences. Stephen Hilgartner comments that patent law plays a significant part in social negotiations about the shape of emerging technological systems or artefacts: 'Emerging technology – especially in such hotbeds of change as the life sciences, information technology, biomedicine, and nanotechnology – became a site of contention where competing groups pursued incompatible normative visions. Indeed, as people recognized that questions about the shape of technological systems were nothing less than questions about the future shape of societies, science and technology achieved central significance in contemporary democracies. 
In this context, states face ongoing difficulties trying to mediate these tensions and establish mechanisms for addressing problems of representation and participation in the sociopolitical process that shapes emerging technology.' The introduction to the collection will provide a thumbnail, comparative overview of recent developments in intellectual property and biotechnology – as a foundation to the collection. Section I of this introduction considers recent developments in United States patent law, policy and practice with respect to biotechnology – in particular, highlighting the Myriad Genetics dispute and the decision of the Supreme Court of the United States in Bilski v. Kappos. Section II considers the cross-currents in Canadian jurisprudence in intellectual property and biotechnology. Section III surveys developments in the European Union – and the interpretation of the European Biotechnology Directive. Section IV focuses upon Australia and New Zealand, and considers the policy responses to the controversy of Genetic Technologies Limited’s patents in respect of non-coding DNA and genomic mapping. Section V outlines the parts of the collection and the contents of the chapters.
Abstract:
Mode indicator functions (MIFs) are used in modal testing and analysis as a means of identifying modes of vibration, often as a precursor to modal parameter estimation. Various methods have been developed since the MIF was introduced four decades ago. These methods are quite useful in assisting the analyst to identify genuine modes and, in the case of the complex mode indicator function, have even been developed into modal parameter estimation techniques. Although the various MIFs are able to indicate the existence of a mode, they do not provide the analyst with any descriptive information about it. This paper uses the simple summation type of MIF to develop five averaged and normalised MIFs that provide the analyst with enough information to identify whether a mode is longitudinal, vertical, lateral or torsional. The first three functions, termed directional MIFs, have been noted in the literature in one form or another; however, this paper adds a new twist by introducing two further MIFs, termed torsional MIFs, that can be used to identify torsional modes and, moreover, can assist in determining whether a mode is of a pure torsion or sway type (i.e., having a rigid cross-section) or a distorted twisting type. The directional and torsional MIFs are tested on a simulation, based on a finite element model, of an experimental modal test using an impact hammer. Results indicate that the directional and torsional MIFs are indeed useful in assisting the analyst to identify whether a mode is longitudinal, vertical, lateral, sway, or torsional.
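A minimal sketch of the summation-type directional MIF idea described above: group the frequency response functions (FRFs) by measurement direction, sum the magnitudes per group, and normalise by the all-points summation MIF so that a value near 1.0 at a resonance flags the dominant direction of that mode. The function name and the exact normalisation are illustrative assumptions, not the paper's definitions.

```python
import numpy as np

def directional_mifs(frf, directions):
    """Averaged, normalised summation-type MIFs per measurement direction.

    frf        : complex array, shape (n_points, n_freqs), of FRFs
    directions : length-n_points labels, e.g. 'x' (longitudinal),
                 'y' (lateral), 'z' (vertical)
    Returns {direction: real array over frequency}: the per-direction sum
    of |FRF| divided by the all-points summation MIF.
    """
    mag = np.abs(frf)
    total = mag.sum(axis=0)
    return {d: mag[[i for i, lab in enumerate(directions) if lab == d]].sum(axis=0) / total
            for d in sorted(set(directions))}

# synthetic check: two frequency bins, the second dominated by vertical points
frf = np.array([[1.0, 0.1],   # 'x' point
                [0.1, 1.0],   # 'z' point
                [0.1, 1.0]],  # 'z' point
               dtype=complex)
mifs = directional_mifs(frf, ["x", "z", "z"])
print(round(float(mifs["z"][1]), 2))  # → 0.95, i.e. a vertical-dominated mode
```

A torsional MIF would follow the same pattern but weight point pairs on opposite sides of the cross-section with opposite signs, so that twisting motion reinforces and rigid translation cancels.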
Abstract:
Field emission (FE) electron gun sources provide new capabilities for high lateral resolution EPMA. The determination of analytical lateral resolution is not as straightforward as that for electron microscopy imaging. Results from two sets of experiments to determine the actual lateral resolution for accurate EPMA are presented for Kα X-ray lines of Si and Al and Lα of Fe at 5 and 7 keV in a silicate glass. These results are compared to theoretical predictions and Monte Carlo simulations of analytical lateral resolution. The experiments suggest little is gained in lateral resolution by dropping from 7 to 5 keV in EPMA of this silicate glass.
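The kind of theoretical prediction these measurements are compared against can be sketched with the Kanaya-Okayama electron range formula, a standard first-order estimate of interaction-volume size in EPMA. The glass composition values below (mean atomic weight, mean atomic number, density) are assumed SiO2-like figures, not taken from the paper.

```python
def kanaya_okayama_range(e_kev, mean_a, mean_z, density):
    """Kanaya-Okayama electron range in micrometres:
    R = 0.0276 * A * E^1.67 / (Z^0.889 * rho),
    with E in keV, A in g/mol, rho in g/cm^3.
    """
    return 0.0276 * mean_a * e_kev**1.67 / (mean_z**0.889 * density)

# assumed SiO2-like glass: mean A ~20 g/mol, mean Z ~10, rho ~2.2 g/cm^3
r5 = kanaya_okayama_range(5.0, 20.0, 10.0, 2.2)
r7 = kanaya_okayama_range(7.0, 20.0, 10.0, 2.2)
print(f"5 keV: R ~ {r5:.2f} um, 7 keV: R ~ {r7:.2f} um")  # ~0.48 um vs ~0.84 um
```

The formula predicts a markedly smaller interaction volume at 5 keV, which is exactly the sort of analytic expectation the reported experiments put to the test; the abstract's finding that little lateral resolution is actually gained shows why such estimates need experimental verification.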
Abstract:
The provision of effective training of supervisors and operators is essential if sugar factories are to operate profitably and in an environmentally sustainable and safe manner. The benefits of having supervisor and operator staff with a high level of operational skills are reduced stoppages, increased recovery, improved sugar quality, reduced damage to equipment, and reduced OH&S and environmental impacts. Training of new operators and supervisors in factories has traditionally relied on on-the-job training of new or inexperienced staff by experienced supervisors and operators, supplemented by courses conducted by contractors such as Sugar Research Institute (SRI). However, there is clearly a need for staff to be able to undertake training at any time, drawing on the content of online courses as required. An improved methodology for the training of factory supervisors and operators has been developed by QUT on behalf of a syndicate of mills. The new methodology provides ‘at factory’ learning via self-paced modules. Importantly, the training resources for each module are designed to support the training programs within sugar factories, thereby establishing a benchmark for training across the sugar industry. The modules include notes, training guides and session plans, guidelines for walkthrough tours of the stations, learning activities, and resources such as videos, animations, job aids and competency assessments. The materials are available on the web for registered users in Australian mills, and many activities are best undertaken online. Apart from a few interactive online resources, the materials for each module can also be downloaded. The acronym SOTrain (Supervisor and Operator Training) has been applied to the new training program.
Abstract:
Cognitive scientists were not quick to embrace the functional neuroimaging technologies that emerged during the late 20th century. In this new century, cognitive scientists continue to question, not unreasonably, the relevance of functional neuroimaging investigations that fail to address questions of interest to cognitive science. However, some ultra-cognitive scientists assert that these experiments can never be of relevance to the study of cognition. Their reasoning reflects an adherence to a functionalist philosophy that arbitrarily and purposefully distinguishes mental information-processing systems from brain or brain-like operations. This article addresses whether data from properly conducted functional neuroimaging studies can inform and subsequently constrain the assumptions of theoretical cognitive models. The article commences with a focus upon the functionalist philosophy espoused by the ultra-cognitive scientists, contrasting it with the materialist philosophy that motivates both cognitive neuroimaging investigations and connectionist modelling of cognitive systems. Connectionism and cognitive neuroimaging share many features, including an emphasis on unified cognitive and neural models of systems that combine localist and distributed representations. The utility of designing cognitive neuroimaging studies to test (primarily) connectionist models of cognitive phenomena is illustrated using data from functional magnetic resonance imaging (fMRI) investigations of language production and episodic memory.
Abstract:
The size and arrangement of stromal collagen fibrils (CFs) influence the optical properties of the cornea and hence its function. How the collagen is spatially arranged in relation to fibril diameter remains an open question. In the present study, we introduce a new parameter, edge-fibrillar distance (EFD), to measure how two collagen fibrils are spaced with respect to their closest edges, and characterise their spatial distribution through the normalized standard deviation of EFD (NSDEFD), assessed through the application of two commercially available multipurpose solutions (MPS): ReNu and Hippia. The corneal buttons were soaked separately in ReNu and Hippia MPS for five hours, fixed overnight in 2.5% glutaraldehyde containing cuprolinic blue, and processed for transmission electron microscopy. The electron micrographs were processed using a user-coded ImageJ plugin. Statistical analysis was performed to compare the image-processed equivalent diameter (ED), inter-fibrillar distance (IFD), and EFD of the CFs of treated versus normal corneas. The ReNu-soaked cornea showed a partly degenerated epithelium with loose hemidesmosomes and Bowman’s collagen. In contrast, the epithelium of the cornea soaked in Hippia was degenerated or lost but showed closely packed Bowman’s collagen. Soaking the corneas in both MPS caused a statistically significant decrease in anterior collagen fibril ED and significant changes in IFD and EFD relative to untreated corneas (p < 0.05 for all comparisons). The introduction of the EFD measurement directly provided a sense of the gap between the peripheries of the collagen bundles and of their spatial distribution; in combination with ED, it showed how the corneal collagen bundles are spaced in relation to their diameters. The spatial distribution parameter NSDEFD indicated that the fibrils of the ReNu-treated cornea were the most uniformly distributed spatially, followed by those of the normal and then the Hippia-treated corneas. 
The EFD measurement, where a relatively low standard deviation and NSDEFD are characteristic of a uniform CF distribution, can serve as an additional parameter for evaluating collagen organization and assessing the effects of various treatments on corneal health and transparency.
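A minimal sketch of how an edge-fibrillar distance could be computed from segmented fibril centres and diameters: subtract the two radii from each centre-to-centre distance and keep, for each fibril, the gap to its nearest neighbour; NSDEFD is taken here as the standard deviation of EFD normalised by its mean. The function name and the nearest-neighbour convention are my assumptions, not the authors' ImageJ plugin.

```python
import math
import statistics

def efd_stats(centres, diameters):
    """Nearest-neighbour edge-fibrillar distances and their normalised spread.

    For each fibril, EFD = min over other fibrils of
    (centre-to-centre distance - the two radii), i.e. the closest
    edge-to-edge gap. Returns (efds, nsdefd), where
    nsdefd = stdev(EFD) / mean(EFD); lower means a more uniform
    spatial distribution.
    """
    efds = []
    for i, (ci, di) in enumerate(zip(centres, diameters)):
        gaps = [math.dist(ci, cj) - (di + dj) / 2
                for j, (cj, dj) in enumerate(zip(centres, diameters)) if j != i]
        efds.append(min(gaps))
    return efds, statistics.stdev(efds) / statistics.mean(efds)

# a perfectly regular row of 30 nm fibrils at 50 nm centre spacing
efds, nsdefd = efd_stats([(0, 0), (50, 0), (100, 0)], [30, 30, 30])
print(efds, nsdefd)  # → [20.0, 20.0, 20.0] 0.0
```

A perfectly regular lattice gives NSDEFD of 0; jittering the centres or mixing diameters raises it, which is how the parameter separates uniform from irregular packing.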
Abstract:
Montserrat now provides one of the most complete datasets for understanding the character and tempo of hazardous events at volcanic islands. Much of the erupted material ends up offshore, and this offshore record may be easier to date owing to intervening hemipelagic sediments between event beds. The offshore dataset includes the first scientific drilling of volcanic island landslides, during IODP Expedition 340, together with an unusually comprehensive set of shallow sediment cores and 2-D and 3-D seismic surveys. Most recently, in 2013, Remotely Operated Vehicle (ROV) dives mapped and sampled the surface of the main landslide deposits. This contribution aims to provide an overview of key insights from ongoing work on IODP Expedition 340 sites offshore Montserrat. Key objectives are to understand the composition (and hence source) and emplacement mechanism (and hence tsunami generation) of major landslides, together with their frequency and timing relative to volcanic eruption cycles. The most recent major collapse event is Deposit 1, which involved ~1.8 km³ of material and produced a blocky deposit at ~12-14 ka. Deposit 1 appears to have involved not only the volcanic edifice but also a substantial component of a fringing bioclastic shelf, together with material locally incorporated from the underlying seafloor. This information allows us to test how first-order landslide morphology (e.g. blocky or elongate lobes) is related to first-order landslide composition. Preliminary analysis suggests that Deposit 1 occurred shortly before a second major landslide on the SW of the island (Deposit 5). It may have initiated English's Crater, but was not associated with a major change in magma composition. An associated turbidite stack suggests it was emplaced in multiple stages, separated by at least a few hours, thus reducing the tsunami magnitude. 
The ROV dives show that mega-blocks in detail comprise smaller-scale breccias, which can travel significant distances without complete disintegration. Landslide Deposit 2 was emplaced at ~130 ka and is more voluminous (~8.4 km³). It had a much more profound influence on the magmatic system, as it was linked to a major explosive mafic eruption and the formation of a new volcanic centre (South Soufriere Hills) on the island. Site U1395 confirms a hypothesis, based on the site-survey seismic data, that Deposit 2 includes a substantial component of pre-existing seafloor sediment. Surprisingly, however, this pre-existing seafloor sediment in the lower part of Deposit 2 at Site U1395 is completely undeformed and flat-lying, suggesting that Site U1395 penetrated a flat-lying block. Work to date material from the upper parts of Sites U1396, U1395 and U1394 will also be summarised. This work is establishing a chronostratigraphy of major events over the last 1 Ma, with particularly detailed constraints for the last ~250 ka. It is helping us to understand whether major landslides are related to cycles of volcanic eruptions.