341 results for Compile
Abstract:
In this volume we compile and comment on a collection of some of the most important works on nascent entrepreneurship that have appeared in the last two decades. We do not go further back than that because up until 1992 hardly any systematic research on the pre-operational stage of business creation was undertaken. In that year, the terms 'nascent entrepreneur' and 'nascent venture' appeared for the first time in the research literature (Reynolds and Miller, Chapter 1, 1992; Reynolds and White, 1992). This signals the emergence of a new research paradigm designed to study business creation processes empirically at very early stages, before an operational firm has come into existence. The most central feature of this type of research is that it identifies a statistically representative sample of nascent entrepreneurs (NEs), people engaged in ongoing but not yet operational business start-ups, via screening interviews with a very large random sample of adults. The overarching research questions pursued in this emerging research tradition are the following: 1. What proportion of individuals (in various population subgroups) are trying to start a new business at any given time? 2. What led them to engage in the creation of a new business? 3. What characteristics and behaviors associated with the founder(s), the venture, the environment and the process are associated with persistence, progress and success in trying to start a new business?
Abstract:
Bioceramics play an important role in repairing and regenerating bone defects. Annually, more than 500,000 bone graft procedures are performed in the United States and approximately 2.2 million are conducted worldwide. The estimated cost of these procedures approaches $2.5 billion per year. Around 60% of the bone graft substitutes available on the market involve bioceramics. It is reported that the world market for bioceramics grows by 9% per year. For this reason, research on bioceramics has been one of the most active areas during the past several years. Considering the significant importance of bioceramics, our goal was to compile this book to review the latest research advances in the field of bioceramics. The text also summarizes our work during the past 10 years in an effort to share innovative concepts, designs of bioceramics, and methods for material synthesis and drug delivery. We anticipate that this text will provide some useful information and guidance in the bioceramics field for biomedical engineering researchers and material scientists. Information on novel mesoporous bioactive glasses and silicate-based ceramics for bone regeneration and drug delivery is presented. Mesoporous bioactive glasses have shown multifunctional characteristics of bone regeneration and drug delivery due to their special mesopore structures, whereas silicate-based bioceramics, as typical third-generation biomaterials, possess significant osteostimulation properties. Silica nanospheres with a core-shell structure and specific properties for controllable drug delivery have been carefully reviewed; a variety of advanced synthetic strategies have been developed to construct functional mesoporous silica nanoparticles with a core-shell structure, including hollow, magnetic, luminescent, and other multifunctional core-shell mesoporous silica nanoparticles.
In addition, multifunctional drug delivery systems based on these nanoparticles have been designed and optimized to deliver drugs into targeted organs or cells, with release controlled by a variety of internal and external triggers. The novel 3D-printing technique to prepare advanced bioceramic scaffolds for bone tissue engineering applications has been highlighted, including the preparation, mechanical strength, and biological properties of 3D-printed porous scaffolds of calcium phosphate cement and silicate bioceramics. Three-dimensional printing techniques offer improved large-pore structure and mechanical strength. In addition, biomimetic preparation and controllable crystal growth as well as biomineralization of bioceramics are summarized, showing the latest research progress in this area. Finally, inorganic and organic composite materials are reviewed for bone regeneration and gene delivery. Bioactive inorganic and organic composite materials offer unique biological, electrical, and mechanical properties for designing excellent bone regeneration or gene delivery systems. It is our sincere hope that this book will update the reader on the research progress of bioceramics and their applications in bone repair and regeneration. It will be the best reward to all the contributors of this book if their efforts herein in some way help readers in any part of their study, research, and career development.
Abstract:
The conflicts in Iraq and Afghanistan have been epitomized by the insurgents’ use of the improvised explosive device against vehicle-borne security forces. These weapons, capable of causing multiple severely injured casualties in a single incident, pose the most prevalent single threat to Coalition troops operating in the region. Improvements in personal protection and medical care have resulted in increasing numbers of casualties surviving with complex lower limb injuries, often leading to long-term disability. Thus, there exists an urgent requirement to investigate and mitigate the mechanisms of extremity injury caused by these devices. This will necessitate an ontological approach, linking molecular, cellular and tissue interaction to physiological dysfunction. This can only be achieved via a collaborative approach between clinicians, natural scientists and engineers, combining physical and numerical modelling tools with clinical data from the battlefield. In this article, we compile existing knowledge on the effects of explosions on skeletal injury, review and critique relevant experimental and computational research related to lower limb injury and damage, and propose research foci required to drive the development of future mitigation technologies.
Abstract:
It is rare to find an anthology that realizes the possibilities of the form. We tend to regard our edited collections as lesser siblings, and forget their special value. But at times, a subject seems to require an edited collection much more than it does a classic monograph. So it is with the subject showcased here, which concerns the global circulation, performance and consumption of heavy metal. This is a relatively new and emerging body of work, hitherto scattered disparately across broader popular music studies, but quickly gaining status as a “studies” with the establishment of a global conference, a journal, and the publication of this anthology, all in recent years. Metal Rules the Globe took the editors a decade to compile. That they have thought deeply about how they want the collection to speak shows through in the book’s thoughtful arrangement and design, and in the way in which they draw on the contributions herein to develop for the field a research agenda that will take it forward...
Abstract:
This chapter investigates the capacity of a well-supported holistic ePortfolio program, the QUT Student ePortfolio Program (QSeP), to support critical reflection for pedagogic innovation in higher education, by exploring practice examples. The chapter looks across faculty and discipline areas to illustrate a range of ePortfolio learning case studies, which have led to pedagogical innovation across a whole institution, to enhance student learning and support academic teaching. The ePortfolio strategies discussed support innovation in learning and teaching where academics use the ePortfolio approach in different ways to develop connectedness (productive pedagogies) within learning. Students are supported to develop awareness of the connections between formal and informal learning opportunities and between their learning and personal and professional goals. Students are guided to understand what they have learned and how they have learned in terms of generic employability skills or graduate attributes and also in relation to professional standards and competencies and personal goals. In essence, the ePortfolio-supported pedagogy creates capstone events enabling students to develop a professional identity and an understanding of ongoing professional development.
The examples are drawn from distinct discipline areas and illustrate the capacity of ePortfolio to underpin pedagogic innovation across discipline areas:
• Bachelor of Information Technology—the ePortfolio approach supports students to explore the IT industry as a means of clarifying personal expectations and goals, thereby enhancing student potential in the course
• Bachelor of Nursing and Master of Nursing Science—students develop a professional ePortfolio to show development of the nursing competencies
• Master of Information Technology—Library and Information students compile a Professional Portfolio for assessment in the Professional Practice subject
• Bachelor of Laws—Virtual Law Placement (VLP) is a unit of study that challenges students to critically reflect on their performance and development during the work placement
Each case study illustrates the academic teaching goal and student ePortfolio task in context. Issues, challenges and support strategies are identified. Comments from the students and their lecturers give an indication of the effectiveness of the ePortfolio approach to meet learning and teaching goals.
Abstract:
The methodology of designing normative terminological products has been described in several guides and international standards. However, this methodology is not always applicable to designing translation-oriented terminological products which differ greatly from normative ones in terms of volume, function, and primary target group. This dissertation has three main goals. The first is to revise and enrich the stock of concepts and terms required in the process of designing an LSP dictionary for translators. The second is to detect, classify, and describe the factors which determine the characteristics of an LSP dictionary for translators and affect the process of its compilation. The third goal is to provide recommendations on different aspects of dictionary design. The study is based on an analysis of dictionaries, dictionary reviews, literature on translation-oriented lexicography, material from several dictionary projects, and the results of questionnaires. Thorough analysis of the concept of a dictionary helped us to compile a list of designable characteristics of a dictionary. These characteristics include target group, function, links to other resources, data carrier, list of lemmata, information about the lemmata, composition of other parts of the dictionary, compression of the data, structure of the data, and access structure. The factors which determine the characteristics of a dictionary have been divided into those derived from the needs of the intended users and those reflecting the restrictions of the real world (e.g. characteristics of the data carrier and organizational factors) and attitudes (e.g. traditions and scientific paradigms). The designer of a dictionary is recommended to take the intended users' needs as the starting point and aim at finding the best compromise between the conflicting factors. 
When designing an LSP dictionary, much depends on the level of knowledge of the intended users about the domain in question as well as their general linguistic competence, LSP competence, and lexicographic competence. This dissertation discusses the needs of LSP translators and the role of the dictionary in the process of translation of an LSP text. It also emphasizes the importance of planning lexicographic products and activities, and addresses many practical aspects of dictionary design.
Abstract:
The strategic objectives of Turf Australia (formerly the Turf Producers Association (TPA)) relating to water use in turf are to:
• source and collate information to support the case for adequate access to water for the turf production and maintenance sectors; and
• compile the information generated into a convincing communication package that can be readily used by the industry in its advocacy programs (to government, regulators, media, etc.).
More specifically, the turfgrass industry needs unbiased scientific evidence of the value of healthy grass in our environment. It needs to promote the use of adequate water even during drought periods to maintain quality turfgrass, which provides many benefits to the broader community, including cooling the environment, saving energy and encouraging healthy lifestyles. The many environmental, social and health benefits of living turfgrass have been the subject of numerous investigations beyond the scope of this review. However, further research is needed to fully understand the economic returns achievable by the judicious use of water for the maintenance of healthy turfgrass. Consumer education, backed by scientific evidence, will highlight the “false economy” in allowing turfgrass to wither and die during conditions which require high-level water restrictions. This report presents a review of the literature pertaining to research in the field of turf water use. The purpose of the review was to better understand the scope and nature of existing research results on turf water relations so that knowledge gaps could be identified in achieving the above strategic objectives of the TPA. Research to date has been found to be insufficient to compile a convincing communication package as described. However, identified knowledge gaps can now be addressed through targeted research. Information derived from targeted research will provide valuable material for education of the end user of turfgrass.
Recommendations have been developed based on the results of this desktop review. It was determined that future research in the field of turf irrigation needs to focus on a number of key factors which directly or indirectly affect the relationship between turfgrass and water use. These factors are:
• Climate
• Cultivar
• Quality
• Site use requirements
• Establishment and management
The overarching recommendation is to develop a strategic plan for turfgrass water relations research based around the five determinants of turf water use listed above. This plan should ensure research under these five categories is integrated into a holistic approach by which the consumer can be guided in species and/or cultivar choices as well as best management practices with respect to turfgrass water relations. Worsening drought cycles and the limited supply of water for irrigation were the key factors driving every research project reviewed in this report. Subsidence of the most recent (or current) drought conditions in Australia should not be viewed by the turf industry as a reason to withdraw support or funding for research in this area. Drought conditions, limited domestic water availability and urban water restrictions will return in Australia, albeit in 5, 10 or 20 years' time, and the turf industry has an opportunity to prepare for that time.
Abstract:
Objectives: Decision support tools (DSTs) for invasive species management have had limited success in producing convincing results and meeting users' expectations. The problems could be linked to the functional form of the model which represents the dynamic relationship between the invasive species and crop yield loss in the DSTs. The objectives of this study were: a) to compile and review the models tested in field experiments and applied to DSTs; and b) to carry out an empirical evaluation of some popular models and alternatives. Design and methods: This study surveyed the literature and documented strengths and weaknesses of the functional forms of yield loss models. Some widely used models (linear, relative yield and hyperbolic models) and two potentially useful models (the double-scaled and density-scaled models) were evaluated for a wide range of weed densities, maximum potential yield loss and maximum yield loss per weed. Results: Popular functional forms include hyperbolic, sigmoid, linear, quadratic and inverse models. Many basic models were modified to account for the effect of important factors (weather, tillage and growth stage of crop at weed emergence) influencing weed–crop interaction and to improve prediction accuracy. This limited their applicability for use in DSTs, as they became less generalized in nature and often were applicable to a much narrower range of conditions than would be encountered in the use of DSTs. These factors' effects could be better accounted for by using other techniques. Among the models empirically assessed, the linear model is a very simple model which appears to work well at sparse weed densities, but it produces unrealistic behaviour at high densities. The relative-yield model exhibits expected behaviour at high densities and high levels of maximum yield loss per weed but probably underestimates yield loss at low to intermediate densities.
The hyperbolic model demonstrated reasonable behaviour at lower weed densities, but produced biologically unreasonable behaviour at low rates of loss per weed and high yield loss at the maximum weed density. The density-scaled model is not sensitive to the yield loss at maximum weed density in terms of the number of weeds that will produce a certain proportion of that maximum yield loss. The double-scaled model appeared to produce more robust estimates of the impact of weeds under a wide range of conditions. Conclusions: Previously tested functional forms exhibit problems for use in DSTs for crop yield loss modelling. Of the models evaluated, the double-scaled model exhibits desirable qualitative behaviour under most circumstances.
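The contrast between the linear and hyperbolic functional forms can be sketched in a few lines. The parameter values below are illustrative only, and the hyperbolic form shown is the standard rectangular hyperbola (loss rising at rate i per weed at sparse densities and saturating at an asymptote a), which may differ in detail from the exact variants evaluated in the study.

```python
# Sketch of two common weed-crop yield loss functional forms.
# Parameter values are illustrative, not fitted values from the study.

def linear_loss(density, i):
    """Linear model: percent yield loss grows without bound,
    which is unrealistic at high weed densities."""
    return i * density

def hyperbolic_loss(density, i, a):
    """Rectangular hyperbola: loss rises at rate i per weed at sparse
    densities and saturates at the asymptote a at high densities."""
    return i * density / (1 + i * density / a)

i = 2.0   # percent yield loss per weed at low density (assumed)
a = 60.0  # maximum percent yield loss (assumed)

for d in (1, 10, 100, 1000):
    print(d, round(linear_loss(d, i), 1), round(hyperbolic_loss(d, i, a), 1))
```

At a density of 1 the two models nearly agree, but by a density of 1000 the linear model predicts an impossible 2000% loss while the hyperbolic model stays below its 60% asymptote, which is the qualitative difference the review discusses.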
Abstract:
This thesis studies empirically whether measurement errors in aggregate production statistics affect sentiment and future output. Initial announcements of aggregate production are subject to measurement error, because many of the data required to compile the statistics are produced with a lag. This measurement error can be gauged as the difference between the latest revised statistic and its initial announcement. Assuming aggregate production statistics help forecast future aggregate production, these measurement errors are expected to affect macroeconomic forecasts. Assuming agents’ macroeconomic forecasts affect their production choices, these measurement errors should affect future output through sentiment. This thesis is primarily empirical, so the theoretical basis, strategic complementarity, is discussed quite briefly. The underlying model, however, is one in which higher aggregate production increases each agent’s incentive to produce. In this circumstance a statistical announcement which suggests aggregate production is high would increase each agent’s incentive to produce, thus resulting in higher aggregate production. In this way the existence of strategic complementarity provides the theoretical basis for output fluctuations caused by measurement mistakes in aggregate production statistics. Previous empirical studies suggest that measurement errors in gross national product affect future aggregate production in the United States. Additionally, it has been demonstrated that measurement errors in the Index of Leading Indicators affect forecasts by professional economists as well as future industrial production in the United States. This thesis aims to verify the applicability of these findings to other countries, as well as to study the link between measurement errors in gross domestic product, sentiment and future output.
Professional forecasts and consumer sentiment in the United States and Finland, as well as producer sentiment in Finland, are used as the measures of sentiment. Using statistical techniques, it is found that measurement errors in gross domestic product affect forecasts and producer sentiment. The effect on consumer sentiment is ambiguous. The relationship between measurement errors and future output is explored using data from Finland, the United States, the United Kingdom, New Zealand and Sweden. It is found that measurement errors have affected aggregate production or investment in Finland, the United States, the United Kingdom and Sweden. Specifically, it was found that overly optimistic statistical announcements are associated with higher output and vice versa.
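The measurement-error construction used throughout the thesis is simple to illustrate; the growth figures below are invented for the example and are not actual GDP data.

```python
# Measurement error = latest revised statistic minus its initial announcement.
# With this sign convention, an overly optimistic initial announcement
# (revised value came in below the announcement) yields a negative error.
initial_announcements = [2.1, 1.8, 0.5, 3.0]  # illustrative GDP growth, %
revised_values        = [1.7, 2.0, 0.9, 2.6]  # illustrative final revisions, %

errors = [round(rev - init, 2)
          for init, rev in zip(initial_announcements, revised_values)]
print(errors)  # negative entries mark over-optimistic announcements
```

The empirical work then asks whether series of such errors help explain subsequent forecasts, sentiment indices and output.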
Abstract:
Large-grain synchronous dataflow graphs or multi-rate graphs have the distinct feature that the nodes of the dataflow graph fire at different rates. Such multi-rate large-grain dataflow graphs have been widely regarded as a powerful programming model for DSP applications. In this paper we propose a method to minimize buffer storage requirement in constructing rate-optimal compile-time (MBRO) schedules for multi-rate dataflow graphs. We demonstrate that the constraints to minimize buffer storage while executing at the optimal computation rate (i.e. the maximum possible computation rate without storage constraints) can be formulated as a unified linear programming problem in our framework. A novel feature of our method is that in constructing the rate-optimal schedule, it directly minimizes the memory requirement by choosing the schedule time of nodes appropriately. Lastly, a new circular-arc interval graph coloring algorithm has been proposed to further reduce the memory requirement by allowing buffer sharing among the arcs of the multi-rate dataflow graph. We have constructed an experimental testbed which implements our MBRO scheduling algorithm as well as (i) the widely used periodic admissible parallel schedules (also known as block schedules) proposed by Lee and Messerschmitt (IEEE Transactions on Computers, vol. 36, no. 1, 1987, pp. 24-35), (ii) the optimal scheduling buffer allocation (OSBA) algorithm of Ning and Gao (Conference Record of the Twentieth Annual ACM SIGPLAN-SIGACT Symposium on Principles of Programming Languages, Charleston, SC, Jan. 10-13, 1993, pp. 29-42), and (iii) the multi-rate software pipelining (MRSP) algorithm (Govindarajan and Gao, in Proceedings of the 1993 International Conference on Application Specific Array Processors, Venice, Italy, Oct. 25-27, 1993, pp. 77-88). Schedules generated for a number of random dataflow graphs and for a set of DSP application programs using the different scheduling methods are compared. 
The experimental results have demonstrated a significant improvement (10-20%) in buffer requirements for the MBRO schedules compared to the schedules generated by the other three methods, without sacrificing the computation rate. The MBRO method also gives a 20% average improvement in computation rate compared to Lee's Block scheduling method.
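The buffer-sharing idea behind the coloring step can be sketched with a greedy interval coloring: buffers whose lifetimes do not overlap may occupy the same memory region. This simplification uses linear intervals, whereas the paper's algorithm handles circular-arc graphs, and the lifetimes below are illustrative.

```python
# Greedy coloring of buffer lifetime intervals: two buffers may share a
# memory region ("color") if their lifetimes do not overlap.
# Simplified to linear intervals; the MBRO paper handles circular arcs.

def color_intervals(intervals):
    """intervals: list of (start, end) buffer lifetimes.
    Returns one color (shared buffer id) per interval."""
    order = sorted(range(len(intervals)), key=lambda k: intervals[k][0])
    colors = [None] * len(intervals)
    active = []  # (end, color) pairs of buffers still live
    next_color = 0
    for k in order:
        start, end = intervals[k]
        active = [(e, c) for e, c in active if e > start]  # expire dead buffers
        used = {c for _, c in active}
        reusable = [c for c in range(next_color) if c not in used]
        if reusable:
            c = reusable[0]
        else:
            c = next_color
            next_color += 1
        colors[k] = c
        active.append((end, c))
    return colors

# Three buffer lifetimes; the first and third do not overlap and share a slot.
print(color_intervals([(0, 4), (2, 6), (5, 8)]))  # → [0, 1, 0]
```

The number of distinct colors is the number of physical buffers actually needed, which is how sharing reduces the memory requirement below the sum of individual buffer sizes.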
Abstract:
Energy consumption has become a major constraint in providing increased functionality for devices with small form factors. Dynamic voltage and frequency scaling has been identified as an effective approach for reducing the energy consumption of embedded systems. Earlier works on dynamic voltage scaling focused mainly on performing voltage scaling when the CPU is waiting for the memory subsystem, or concentrated chiefly on loop nests and/or subroutine calls having a sufficient number of dynamic instructions. This paper concentrates on coarser program regions and, for the first time, uses program phase behavior for performing dynamic voltage scaling. Program phases are annotated at compile time with mode switch instructions. Further, we relate the dynamic voltage scaling problem to the Multiple Choice Knapsack Problem, and use well-known heuristics to solve it efficiently. Also, we develop a simple integer linear program formulation for this problem. Experimental evaluation on a set of media applications reveals that our heuristic method obtains a 38% reduction in energy consumption on average with a performance degradation of 1%, and up to a 45% reduction in energy with a performance degradation of 5%. Further, the energy consumed by the heuristic solution is within 1% of the optimal solution obtained from the ILP approach.
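The mapping to the Multiple Choice Knapsack Problem can be sketched as follows: each program phase is a choice class, each voltage/frequency setting is an item with a time cost and an energy value, and the performance budget is the knapsack capacity. The phases and numbers below are invented for illustration, and the dynamic program shown is a textbook exact MCKP solver rather than the paper's heuristic or ILP formulation.

```python
# DVFS as a Multiple-Choice Knapsack sketch: each program phase must run at
# exactly one voltage/frequency setting; choose one setting per phase so that
# total energy is minimized while total runtime stays within a budget.

def min_energy(phases, time_budget):
    """phases: list of option lists, one per phase; each option is
    (time, energy) for one voltage/frequency setting.
    Returns minimum total energy, or None if nothing fits the budget."""
    INF = float("inf")
    dp = [INF] * (time_budget + 1)  # dp[t] = min energy using exactly time t
    dp[0] = 0.0
    for options in phases:
        new = [INF] * (time_budget + 1)
        for t in range(time_budget + 1):
            if dp[t] == INF:
                continue
            for opt_t, opt_e in options:  # pick exactly one option per phase
                if t + opt_t <= time_budget:
                    new[t + opt_t] = min(new[t + opt_t], dp[t] + opt_e)
        dp = new
    best = min(dp)
    return None if best == INF else best

# Two phases, each with a fast/high-energy and a slow/low-energy setting.
phases = [[(2, 10.0), (4, 6.0)],
          [(3, 8.0), (6, 5.0)]]
print(min_energy(phases, time_budget=8))  # → 14.0
```

Here the cheapest feasible assignment runs the first phase slowly and the second phase fast: the all-slow assignment would save more energy but exceeds the time budget, which is exactly the trade-off the knapsack formulation captures.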
Abstract:
Electricity appears to be the energy carrier of choice for modern economies, since growth in electricity has outpaced growth in the demand for fuels. To make accurate and efficient decisions in electricity distribution, a decision maker (DM) requires sector-wise and location-wise electricity consumption information in order to predict the requirement for electricity. In this regard, an interactive computer-based Decision Support System (DSS) has been developed to compile, analyse and present the data at disaggregated levels for regional energy planning. This helps in providing the precise information needed to make timely decisions related to transmission and distribution planning, leading to increased efficiency and productivity. This paper discusses the design and implementation of a DSS which facilitates analysis of the consumption of electricity at various hierarchical levels (division, taluk, subdivision, feeder) for selected periods. This DSS is validated with the data of transmission and distribution systems of Kolar district in Karnataka State, India.
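The kind of disaggregated roll-up such a DSS performs can be sketched as a simple aggregation over the hierarchy; the records, level names and figures below are invented for illustration and do not come from the Kolar data set.

```python
# Roll up feeder-level electricity consumption to any hierarchical level.
# Records are illustrative: (division, taluk, subdivision, feeder, kWh).
records = [
    ("Kolar", "Malur", "Sub-1", "F-01", 1200.0),
    ("Kolar", "Malur", "Sub-1", "F-02", 800.0),
    ("Kolar", "Bangarpet", "Sub-2", "F-03", 1500.0),
]

LEVELS = ("division", "taluk", "subdivision", "feeder")

def consumption_by(level, records):
    """Aggregate kWh at the requested hierarchical level."""
    depth = LEVELS.index(level) + 1  # how many key fields identify the level
    totals = {}
    for rec in records:
        key = rec[:depth]  # e.g. (division, taluk) for the taluk level
        totals[key] = totals.get(key, 0.0) + rec[4]
    return totals

print(consumption_by("taluk", records))
```

Summing at a coarser key simply merges more feeders into each total, which is the essence of presenting the same data at division, taluk, subdivision or feeder granularity for a selected period.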
Abstract:
Superscalar processors currently have the potential to fetch multiple basic blocks per cycle by employing one of several recently proposed instruction fetch mechanisms. However, this increased fetch bandwidth cannot be exploited unless pipeline stages further downstream correspondingly improve. In particular, register renaming a large number of instructions per cycle is difficult. A large instruction window, needed to receive multiple basic blocks per cycle, will slow down dependence resolution and instruction issue. This paper addresses these and related issues by proposing (i) partitioning of the instruction window into multiple blocks, each holding a dynamic code sequence; (ii) logical partitioning of the register file into a global file and several local files, the latter holding registers local to a dynamic code sequence; (iii) the dynamic recording and reuse of register renaming information for registers local to a dynamic code sequence. Performance studies show these mechanisms improve performance over traditional superscalar processors by factors ranging from 1.5 to a little over 3 for the SPEC Integer programs. Next, it is observed that several of the loops in the benchmarks display vector-like behavior during execution, even if the static loop bodies are likely too complex for compile-time vectorization. A dynamic loop vectorization mechanism that builds on top of the above mechanisms is briefly outlined. The mechanism vectorizes up to 60% of the dynamic instructions for some programs, albeit the average number of iterations per loop is quite small.
Abstract:
Polyhedral techniques for program transformation are now used in several proprietary and open source compilers. However, most of the research on polyhedral compilation has focused on imperative languages such as C, where the computation is specified in terms of statements with zero or more nested loops and other control structures around them. Graphical dataflow languages, where there is no notion of statements or a schedule specifying their relative execution order, have so far not been studied using a powerful transformation or optimization approach. The execution semantics and referential transparency of dataflow languages impose a different set of challenges. In this paper, we attempt to bridge this gap by presenting techniques that can be used to extract polyhedral representation from dataflow programs and to synthesize them from their equivalent polyhedral representation. We then describe PolyGLoT, a framework for automatic transformation of dataflow programs which we built using our techniques and other popular research tools such as Clan and Pluto. For the purpose of experimental evaluation, we used our tools to compile LabVIEW, one of the most widely used dataflow programming languages. Results show that dataflow programs transformed using our framework are able to outperform those compiled otherwise by up to a factor of seventeen, with a mean speed-up of 2.30x while running on an 8-core Intel system.
Abstract:
Task-parallel languages are increasingly popular. Many of them provide expressive mechanisms for intertask synchronization. For example, OpenMP 4.0 will integrate data-driven execution semantics derived from the StarSs research language. Compared to the more restrictive data-parallel and fork-join concurrency models, the advanced features being introduced into task-parallel models in turn enable improved scalability through load balancing, memory latency hiding, mitigation of the pressure on memory bandwidth, and, as a side effect, reduced power consumption. In this article, we develop a systematic approach to compile loop nests into concurrent, dynamically constructed graphs of dependent tasks. We propose a simple and effective heuristic that selects the most profitable parallelization idiom for every dependence type and communication pattern. This heuristic enables the extraction of interband parallelism (cross-barrier parallelism) in a number of numerical computations that range from linear algebra to structured grids and image processing. The proposed static analysis and code generation alleviates the burden of a full-blown dependence resolver to track the readiness of tasks at runtime. We evaluate our approach and algorithms in the PPCG compiler, targeting OpenStream, a representative dataflow task-parallel language with explicit intertask dependences and a lightweight runtime. Experimental results demonstrate the effectiveness of the approach.