869 results for Fishery management -- Queensland -- Mathematical models
Abstract:
At present, the university curricula of most countries do not include decision theory or mathematical models to aid decision making, either at the undergraduate level or in Master's and doctoral programmes. At the Technical School of Agronomic Engineering of the Technical University of Madrid (ETSIA-UPM), a need was identified to offer future engineers training in a subject that could help them make decisions in their profession. Throughout their lives they will face many decisions, some important and others not. At the personal level these include major choices such as a career, a job, or a partner; in the professional sphere, decision making is the principal role of managers, politicians and leaders, who are expected to make decisions and are paid to do so. It is therefore hard to accept that professionals called to exercise management responsibilities in companies should receive no training in such an important matter. Accordingly, in 2000 the University Board was asked to introduce into the curriculum an optional second-cycle subject of 4.5 credits entitled "Mathematical Methods for Making Decisions". A syllabus was drawn up, the teaching material prepared, and software such as Maple, Lingo and MathCad installed in several computer classrooms where the course would be taught. In the 2000-2001 academic year the subject was offered with great success: demand exceeded the forecast capacity and additional classrooms had to be prepared. The course was run by the Department of Applied Mathematics for Agronomic Engineering, as an extension of the credits devoted to Mathematics in the engineering degree.
Abstract:
In Queensland, stout whiting are fished by Danish seine and fish otter-trawl methods between Sandy Cape and the Queensland-New South Wales border. The fishery is currently identified by a T4 symbol and is operated by two primary quota holders. Since 1997, T4 management has been informed by annual stock assessments that determine a total allowable commercial catch (TACC) quota. The TACC is assessed before the start of each fishing year using statistical methodologies, including evaluation of trends in fish catch rates and catch-at-age frequencies against management reference points. The T4 stout whiting TACC for 2014 was adjusted down to 1150 t as a result of elevated estimates of fishing mortality, and remained unchanged in 2015 (the 2013 TACC was a 1350 t quota). Two T4 vessels fished for stout whiting in the 2015 fishing year, harvesting 663 t from Queensland waters. Annual T4 landings of stout whiting averaged about 713 t for the fishing years 2013–2015, with a maximum harvest in the last 10 fishing years of 1140 t and a maximum historical harvest of 2400 t in 1995. Stout whiting catch rates from both Queensland and New South Wales were analysed for all vessels, areas and fishing gears. The 2015 catch rate index was 0.85, down 15% compared to the 2010–2015 fishing year average (reference point = 1). The stout whiting fish length and otolith weight frequencies indicated larger and older fish in the 2014 calendar year. These data translated into improved estimates of fish survival of about 38% per year, near the reference point of about 41%. Together, the stout whiting catch rate and survival indicators show the fishery was sustainable. Earlier population modelling conducted for 2013 also suggested the stock was sustainable, but the biomass estimate was only marginally above that for maximum sustainable yield. Regardless, reasons for reduced catch rates should be examined further and interpreted with caution, particularly given that the TACC has been under-caught in many years. For setting the 2016 TACC, alternative analyses and reference points were compared to address data uncertainties and provide options for quota change. The results were dependent on the stock indicator and harvest procedure used. Uncertainty in all TACC estimates should be considered, as they were sensitive to the data inputs and assumptions. For the 2016 T4 fishing year, upper levels of harvest should be limited to 1000–1100 t following procedure equation 1, with target levels of harvest at 750–850 t for procedure equation 2. Use of these estimates to set the TACC will depend on management and industry intentions.
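The T4 procedure equations themselves are not reproduced in the abstract, so the sketch below only illustrates the general shape of a catch-rate-based harvest control rule of the kind described. Everything in it is an assumption for demonstration: the proportional scaling, the cap on year-on-year change and the parameter values are invented, and it is not procedure equation 1 or 2.

```python
# Hypothetical harvest control rule (illustration only; not the T4 procedure
# equations, which the abstract does not reproduce).

def adjust_tacc(previous_tacc_t: float,
                catch_rate_index: float,
                reference_point: float = 1.0,
                max_change: float = 0.15) -> float:
    """Scale last year's TACC by the catch-rate index relative to its
    reference point, capping the year-on-year change at +/- max_change."""
    raw_scaler = catch_rate_index / reference_point
    scaler = min(1.0 + max_change, max(1.0 - max_change, raw_scaler))
    return previous_tacc_t * scaler

# Figures from the abstract: 2015 TACC = 1150 t, 2015 catch rate index = 0.85.
print(round(adjust_tacc(1150.0, 0.85), 1))  # 977.5 t, within the 750-1100 t range discussed
```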
Abstract:
We developed orthogonal least-squares techniques for fitting crystalline lens shapes, and used the bootstrap method to determine the uncertainties associated with the estimated vertex radii of curvature and asphericities of five different models. Three existing models were investigated, including one that uses two separate conics for the anterior and posterior surfaces, and two whole-lens models based on a modulated hyperbolic cosine function and on a generalized conic function. Two new models were proposed: one that uses two interdependent conics, and a polynomial-based whole-lens model. The models were used to describe the in vitro shape for a data set of twenty human lenses with ages 7–82 years. The two-conic-surface model (7 mm zone diameter) and the interdependent-surfaces model had significantly lower merit functions than the other three models for the data set, indicating that they can most likely describe human lens shape over a wide age range better than the other models (although the two-conic-surface model is unable to describe the lens equatorial region). Considerable differences were found between some models regarding estimates of radii of curvature and surface asphericities. The hyperbolic cosine model and the new polynomial-based whole-lens model had the best precision in determining the radii of curvature and surface asphericities across the five considered models. Most models found a significant increase in anterior, but not posterior, radius of curvature with age. Most models found a wide scatter of asphericities, with the asphericities usually being positive and not significantly related to age. As the interdependent-surfaces model had a lower merit function than the three whole-lens models, there is further scope to develop an accurate model of the complete shape of human lenses of all ages. The results highlight the continued difficulty in selecting an appropriate model for the crystalline lens shape.
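A minimal Python sketch of the fit-then-bootstrap procedure is shown below. It is not the authors' code: the paper fits with orthogonal least squares on in vitro lens profiles, whereas this simplified version fits the standard conic sag equation by ordinary least squares to synthetic data, and the noise level, starting values and bounds are assumptions.

```python
# Simplified sketch of conic fitting with bootstrapped uncertainties
# (ordinary rather than orthogonal least squares; synthetic data).
import numpy as np
from scipy.optimize import curve_fit

def conic_sag(y, R, Q):
    """Sag z(y) of a conic surface with vertex radius R and asphericity Q."""
    return y**2 / (R * (1.0 + np.sqrt(1.0 - (1.0 + Q) * y**2 / R**2)))

rng = np.random.default_rng(0)
y = np.linspace(-3.5, 3.5, 71)     # 7 mm zone diameter, as for the two-conic model
z = conic_sag(y, R=10.0, Q=-0.5) + rng.normal(0.0, 0.01, y.size)  # noisy profile

boot = []
for _ in range(1000):              # resample the data points with replacement
    idx = rng.integers(0, y.size, y.size)
    p, _ = curve_fit(conic_sag, y[idx], z[idx],
                     p0=(9.0, 0.0), bounds=([8.0, -1.0], [15.0, 1.0]))
    boot.append(p)
R_sd, Q_sd = np.std(boot, axis=0)
print(f"bootstrap SD: R = {R_sd:.3f} mm, Q = {Q_sd:.3f}")
```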
Abstract:
Many large coal mining operations in Australia rely heavily on the rail network to transport coal from mines to coal terminals at ports for shipment. Over the last few years, due to fast-growing demand, the coal rail network has become one of the worst industrial bottlenecks in Australia. This provides great incentives for pursuing better optimisation and control strategies for the operation of the whole rail transportation system under network and terminal capacity constraints. This PhD research aims to achieve a significant efficiency improvement in a coal rail network through the development of standard modelling approaches and generic solution techniques. Generally, the train scheduling problem can be modelled as a Blocking Parallel-Machine Job-Shop Scheduling (BPMJSS) problem. In a BPMJSS model for train scheduling, trains and sections are synonymous with jobs and machines respectively, and an operation is regarded as the movement/traversal of a train across a section. To begin, an improved shifting bottleneck procedure algorithm combined with metaheuristics was developed to efficiently solve Parallel-Machine Job-Shop Scheduling (PMJSS) problems without blocking conditions. Due to the lack of buffer space, real-life train scheduling must consider blocking or hold-while-wait constraints, which means that a track section cannot release a train, and must hold it, until the next section on the route becomes available. As a consequence, the problem has been treated as BPMJSS with blocking conditions. To develop efficient solution techniques for BPMJSS, extensive studies of non-classical scheduling problems with various buffer conditions (i.e. blocking, no-wait, limited-buffer, unlimited-buffer and combined-buffer) were carried out. In this process, an alternative graph, as an extension of the classical disjunctive graph, was developed and specially designed for non-classical scheduling problems such as the blocking flow-shop scheduling (BFSS), no-wait flow-shop scheduling (NWFSS) and blocking job-shop scheduling (BJSS) problems. By exploiting the blocking characteristics captured by the alternative graph, a new algorithm, called the topological-sequence algorithm, was developed for solving non-classical scheduling problems. To demonstrate the superiority of the proposed algorithm, we compare it with two known algorithms from the literature (i.e. the Recursive Procedure and the Directed Graph algorithm). Moreover, we define a new type of non-classical scheduling problem, called combined-buffer flow-shop scheduling (CBFSS), which covers four extreme cases: the classical FSS with infinite buffer, the blocking FSS (BFSS) with no buffer, the no-wait FSS (NWFSS) and the limited-buffer FSS (LBFSS). After exploring the structural properties of CBFSS, we propose an innovative constructive algorithm, named the LK algorithm, to construct feasible CBFSS schedules. Detailed numerical illustrations for the various cases are presented and analysed. Because only the attributes in the data input need to be adjusted, the proposed LK algorithm is generic and enables the construction of feasible schedules for many types of non-classical scheduling problems with different buffer constraints.
Inspired by the shifting bottleneck procedure algorithm for PMJSS and the characteristic analysis based on the alternative graph for non-classical scheduling problems, a new constructive algorithm called the Feasibility Satisfaction Procedure (FSP) is proposed to obtain feasible BPMJSS solutions. A real-world train scheduling case is used to illustrate and compare the PMJSS and BPMJSS models. Some real-life applications, including considering the train length, upgrading track sections, accelerating a tardy train and changing the bottleneck sections, are discussed. Furthermore, the BPMJSS model is generalised to a No-Wait Blocking Parallel-Machine Job-Shop Scheduling (NWBPMJSS) problem for scheduling trains with priorities, in which prioritised trains such as express passenger trains are considered simultaneously with non-prioritised trains such as freight trains. In this case, no-wait conditions, which are more restrictive than blocking constraints, arise because prioritised trains should traverse continuously, without any interruption or unplanned pauses, owing to the high cost of waiting during travel. In comparison, non-prioritised trains are allowed to enter the next section immediately if possible, or to remain in a section until the next section on the route becomes available. Based on the FSP algorithm, a more generic algorithm called the SE algorithm is developed to solve a class of train scheduling problems under the different conditions found in train scheduling environments. To construct a feasible train schedule, the proposed SE algorithm comprises several individual modules: the feasibility-satisfaction procedure, the time-determination procedure, the tune-up procedure and the conflict-resolution procedure. To find a good train schedule, a two-stage hybrid heuristic algorithm called the SE-BIH algorithm is developed by combining the constructive heuristic (i.e. the SE algorithm) with a local-search heuristic (i.e. the Best-Insertion-Heuristic algorithm). To optimise the train schedule, a three-stage algorithm called the SE-BIH-TS algorithm is developed by combining the tabu search (TS) metaheuristic with the SE-BIH algorithm. Finally, a case study is performed for a complex real-world coal rail network under network and terminal capacity constraints. The computational results validate that the proposed methodology is very promising, as it can be applied as a fundamental tool for modelling and solving many real-world scheduling problems.
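To make the hold-while-wait constraint concrete, the toy sketch below dispatches two trains over shared track sections under the blocking rule described above. It is illustrative only, not the thesis's FSP, SE or topological-sequence algorithms; the routes, run times and greedy priority order are invented.

```python
# Toy illustration of the blocking (hold-while-wait) rule: a train keeps
# occupying its current section until the next section on its route is free.
from collections import defaultdict

routes = {"T1": ["A", "B", "C"], "T2": ["B", "C", "D"]}   # train -> section route
run_time = {"A": 3.0, "B": 5.0, "C": 4.0, "D": 2.0}        # traversal times

section_free = defaultdict(float)   # earliest time each section is unoccupied
schedule = {}

for train, route in routes.items():               # fixed priority: T1 before T2
    t = 0.0
    for i, sec in enumerate(route):
        entry = max(t, section_free[sec])         # wait until the section is free
        exit_earliest = entry + run_time[sec]
        if i + 1 < len(route):
            # Blocking: the train holds this section until the next one is
            # available, so its exit here equals its entry into route[i+1].
            exit_time = max(exit_earliest, section_free[route[i + 1]])
        else:
            exit_time = exit_earliest
        schedule[(train, sec)] = (entry, exit_time)
        section_free[sec] = exit_time             # released only on exit
        t = exit_time

for key, times in sorted(schedule.items()):
    print(key, times)
```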
Abstract:
One of the prominent topics in Business Service Management is business models for (new) services. Business models are useful for service management and engineering as they provide a broader and more holistic perspective on services. They are particularly relevant for service innovation, as this requires paying attention to the business models that make new services viable, and business model innovation can drive the innovation of new and established services. Before we can look at business models for services, we first need to understand what business models are. This is not straightforward, as business models are still not well understood and knowledge about them is fragmented across different disciplines, such as information systems, strategy, innovation, and entrepreneurship. This whitepaper, ‘Understanding business models,’ introduces readers to business models. It contributes to enhancing the understanding of business models, in particular their conceptualisation, by discussing and integrating business model definitions, frameworks and archetypes from different disciplines. After reading this whitepaper, the reader will have a well-developed understanding of what business models are and how the concept is sometimes interpreted and used in different ways. It will help readers assess their own understanding of business models and that of others. This will contribute to a better and more beneficial use of business models, an increase in shared understanding, and making it easier to work with business model techniques and tools.
Abstract:
The focus of the present research was to investigate how Local Governments in Queensland were progressing with the adoption of delineated disaster management (DM) policies and supporting guidelines. The study consulted Local Government representatives and hence the results reflect their views on these issues. Is adoption occurring? To what degree? Are policies and guidelines being effectively implemented so that the objective of a safer, more resilient community is being achieved? If not, what are the current barriers to achieving this, and can recommendations be made to overcome these barriers? These questions defined the basis on which the present study was designed and the survey tools developed. While it was recognised that the Local Government Association of Queensland (LGAQ) and Emergency Management Queensland (EMQ) may have differing views on some reported issues, it was beyond the scope of the present study to canvass those views. The study resolved to document and analyse these questions under the broad themes of:
• building community capacity (notably via community awareness);
• council operationalisation of DM; and
• regional partnerships (in mitigation/adaptation).
Data were collected via a survey tool comprising two components:
• an online questionnaire survey distributed via the LGAQ Disaster Management Alliance (hereafter referred to as the “Alliance”) to the DM sections of all Queensland Local Government Councils; and
• a series of focus groups with selected Queensland Councils.
Abstract:
Purpose – The purpose of this paper is to provide a review of the theory and models underlying project management (PM) research degrees that encourage reflective learning.
Design/methodology/approach – Review of the literature and reflection on the practice of being actively involved in conducting and supervising academic research and disseminating academic output. The paper argues the case for the potential usefulness of reflective academic research to PM practitioners. It also highlights theoretical drivers of, and barriers to, reflective academic research by PM practitioners.
Findings – A reflective learning approach to research can drive practical results, though it requires a great deal of commitment and support from both academic and industry partners.
Practical implications – This paper suggests how PM practitioners can engage in academic research that has practical outcomes and how they can be more effective at disseminating these research outcomes.
Originality/value – Advanced academic degrees, in particular those completed by PM practitioners, can provide a valuable source of innovative ideas and approaches that should be absorbed more quickly into the PM profession’s sources of knowledge. The value of this paper is to critically review this area and to help reduce the time industry needs to adopt useful reflective academic research.
Abstract:
Mathematical models of mosquito-borne pathogen transmission originated in the early twentieth century to provide insights into how to most effectively combat malaria. The foundations of the Ross–Macdonald theory were established by 1970. Since then, there has been a growing interest in reducing the public health burden of mosquito-borne pathogens and an expanding use of models to guide their control. To assess how theory has changed to confront evolving public health challenges, we compiled a bibliography of 325 publications from 1970 through 2010 that included at least one mathematical model of mosquito-borne pathogen transmission and then used a 79-part questionnaire to classify each of 388 associated models according to its biological assumptions. As a composite measure to interpret the multidimensional results of our survey, we assigned a numerical value to each model that measured its similarity to 15 core assumptions of the Ross–Macdonald model. Although the analysis illustrated a growing acknowledgement of geographical, ecological and epidemiological complexities in modelling transmission, most models during the past 40 years closely resemble the Ross–Macdonald model. Modern theory would benefit from an expansion around the concepts of heterogeneous mosquito biting, poorly mixed mosquito-host encounters, spatial heterogeneity and temporal variation in the transmission process.
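For context, the Ross–Macdonald framework referred to throughout is usually summarised by its basic reproduction number. The formulation below is the standard textbook form with conventional notation; it is not an equation quoted from the survey itself.

```latex
% Standard textbook form of the Ross-Macdonald basic reproduction number
% (conventional notation; not quoted from the surveyed publications).
\[
  R_0 \;=\; \frac{m\,a^2\,b\,c\;e^{-g n}}{r\,g}
\]
% m: mosquitoes per human host, a: human-biting rate, b and c: mosquito-to-human
% and human-to-mosquito transmission efficiencies, g: mosquito mortality rate,
% n: extrinsic incubation period, r: human recovery rate.
```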
Abstract:
We construct a two-scale mathematical model for modern, high-rate LiFePO4 cathodes. We attempt to validate against experimental data using two forms of the phase-field model developed recently to represent the concentration of Li+ in nano-sized LiFePO4 crystals. We also compare this with the shrinking-core based model we developed previously. Validating against high-rate experimental data, in which electronic and electrolytic resistances have been reduced, is an excellent test of the validity of the crystal-scale model used to represent the phase change that may occur in LiFePO4 material. We obtain poor fits with the shrinking-core based model, even with fitting based on “effective” parameter values. Surprisingly, using the more sophisticated phase-field models on the crystal scale results in poorer fits, though a significant parameter regime could not be investigated due to numerical difficulties. Separate to the fits obtained, using phase-field based models embedded in a two-scale cathodic model results in “many-particle” effects consistent with those reported recently.
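For readers unfamiliar with the phase-field approach mentioned above, a representative crystal-scale formulation from the LiFePO4 literature is sketched below. It is illustrative only; the abstract does not specify which phase-field variants were fitted.

```latex
% Representative regular-solution phase-field free energy and chemical
% potential for Li intercalation in LiFePO4 (illustrative form only).
\[
  f(c) \;=\; \Omega\,c(1-c) \;+\; k_B T\,\bigl[c\ln c + (1-c)\ln(1-c)\bigr]
        \;+\; \tfrac{\kappa}{2}\,\lvert\nabla c\rvert^{2},
  \qquad
  \mu \;=\; \Omega(1-2c) \;+\; k_B T \ln\!\frac{c}{1-c} \;-\; \kappa\,\nabla^{2} c
\]
% c: local Li site fraction, Omega: regular-solution interaction parameter,
% kappa: gradient-energy coefficient; phase separation into Li-rich and
% Li-poor regions occurs when Omega > 2 k_B T.
```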
Abstract:
"This collection of papers offers a broad synopsis of state-of-the-art mathematical methods used in modeling the interaction between tumors and the immune system. These papers were presented at the four-day workshop on Mathematical Models of Tumor-Immune System Dynamics held in Sydney, Australia from January 7th to January 10th, 2013. The workshop brought together applied mathematicians, biologists, and clinicians actively working in the field of cancer immunology to share their current research and to increase awareness of the innovative mathematical tools that are applicable to the growing field of cancer immunology. Recent progress in cancer immunology and advances in immunotherapy suggest that the immune system plays a fundamental role in host defense against tumors and could be utilized to prevent or cure cancer. Although theoretical and experimental studies of tumor-immune system dynamics have a long history, there are still many unanswered questions about the mechanisms that govern the interaction between the immune system and a growing tumor. The multidimensional nature of these complex interactions requires a cross-disciplinary approach to capture more realistic dynamics of the essential biology. The papers presented in this volume explore these issues and the results will be of interest to graduate students and researchers in a variety of fields within mathematical and biological sciences."--Publisher website
Abstract:
Japan's fishery harvest peaked in the late 1980s. To limit the race for fish, each fisherman could be provided with specific catch limits in the form of individual transferable quotas (ITQs). The market for ITQs would also help remove the most inefficient fishers. In this article we estimate the potential cost reduction associated with catch limits, and find that about 300 billion yen or about 3 billion dollars could be saved through the allocation and trading of individual-specific catch shares.
Abstract:
Designed for undergraduate and postgraduate students, academic researchers and industrial practitioners, this book provides comprehensive case studies on the numerical computing of industrial processes and step-by-step procedures for conducting industrial computing. It assumes minimal knowledge of numerical computing and computer programming, making it easy to read, understand and follow. Topics discussed include the fundamentals of industrial computing, finite difference methods, the Wavelet-Collocation Method, the Wavelet-Galerkin Method, High Resolution Methods, and comparative studies of the various methods. These are discussed using examples of carefully selected models from real processes of industrial significance. The step-by-step procedures in all these case studies can be easily applied to other industrial processes without the need for major changes, and thus provide readers with useful frameworks for the application of engineering computing to fundamental research problems and practical development scenarios.
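As a taste of the simplest topic the blurb lists, finite difference methods, here is a minimal explicit scheme for the 1-D heat equation. The example is not taken from the book; the grid sizes, diffusivity and initial condition are assumptions chosen for illustration.

```python
# Explicit (FTCS) finite-difference scheme for the 1-D heat equation
# u_t = alpha * u_xx, with u = 0 held at both boundaries.
import numpy as np

alpha, L, nx, nt = 1.0, 1.0, 51, 500
dx = L / (nx - 1)
dt = 0.4 * dx**2 / alpha        # respects the stability limit dt <= dx**2 / (2*alpha)

x = np.linspace(0.0, L, nx)
u = np.sin(np.pi * x)           # initial temperature profile

for _ in range(nt):
    # u_i^{n+1} = u_i^n + r*(u_{i+1}^n - 2*u_i^n + u_{i-1}^n), r = alpha*dt/dx**2
    u[1:-1] += alpha * dt / dx**2 * (u[2:] - 2.0 * u[1:-1] + u[:-2])

print(u.max())  # ~ exp(-np.pi**2 * alpha * nt * dt) ~ 0.45 for these settings
```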
Abstract:
The Office of Urban Management recognises that the values which characterise the SEQ region as 'subtropical' are important determinants of form in urban and regional planning. Subtropical values are those qualities on which our regional identity depends. A built environment which responds positively to these values is a critical ingredient for achieving a desirable future for the region. The Centre for Subtropical Design has undertaken this study to identify the particular set of values which characterises SEQ, and to translate these values into design principles that will maintain and reinforce the value set. The principles not only apply to the overall balance between the natural environment and the built environment, but can also be applied by local government authorities to guide local planning schemes and help shape specific built form outcomes.