988 results for Predictive Monitoring


Relevance: 100.00%

Abstract:

This paper addresses the problem of predicting the outcome of an ongoing case of a business process based on event logs. In this setting, the outcome of a case may refer, for example, to the achievement of a performance objective or the fulfillment of a compliance rule upon completion of the case. Given a log consisting of traces of completed cases, a trace of an ongoing case, and two or more possible outcomes (e.g., a positive and a negative outcome), the paper addresses the problem of determining the most likely outcome for the case in question. Previous approaches to this problem are largely based on simple symbolic sequence classification: they extract features from traces seen as sequences of event labels and use these features to construct a classifier for runtime prediction. In doing so, these approaches ignore the data payload associated with each event. This paper approaches the problem from a different angle by treating traces as complex symbolic sequences, that is, sequences of events each carrying a data payload. In this context, the paper outlines different feature encodings of complex symbolic sequences and compares their predictive accuracy on real-life business process event logs.
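The complex-symbolic-sequence idea can be sketched with a simple index-based encoding, where each event position contributes its label and its payload attributes to a fixed-length feature vector. The trace structure, attribute names, and prefix length below are illustrative, not taken from the paper's datasets.

```python
# Index-based encoding of a trace as a complex symbolic sequence: each
# event position contributes its label plus selected payload attributes
# to a fixed-length vector (padded when the prefix is shorter). The
# trace, attribute names, and prefix length are illustrative.

def index_encode(trace, prefix_len, payload_keys):
    features = []
    for i in range(prefix_len):
        if i < len(trace):
            label, payload = trace[i]
            features.append(label)
            features.extend(payload.get(k) for k in payload_keys)
        else:
            features.append(None)
            features.extend(None for _ in payload_keys)
    return features

trace = [("register", {"amount": 120, "channel": "web"}),
         ("check",    {"amount": 120, "channel": "web"})]
vec = index_encode(trace, prefix_len=3, payload_keys=["amount", "channel"])
```

A vector like this can then be fed to any standard classifier for runtime prediction.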

Relevance: 70.00%

Abstract:

This article is an attempt to devise a method of using certain species of Corixidae as a basis for the assessment of general water quality in lakes. An empirical graphical representation of the distribution of populations or communities of Corixidae in relation to conductivity, based mainly on English and Welsh lakes, is used as a predictive monitoring model to establish the "expected" normal community at a given conductivity, representing the total ionic concentration of the water body. A test sample from another lake of known conductivity is then compared with the "expected" community. The "goodness of fit" is examined visually or by calculation of indices of similarity based on the relative proportions of the constituent species of each community. A computer programme has been devised for this purpose.
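One common index of similarity based on relative species proportions is Renkonen-style percentage similarity; the article does not name the exact indices its programme computes, so the sketch below is only an illustration.

```python
# Renkonen-style percentage similarity between an "expected" and an
# observed community, each given as {species: relative proportion}.
# Illustrative only; the article's programme may use other indices.

def percentage_similarity(expected, observed):
    """Sum, over all species, of the smaller relative proportion in the
    two communities: 0.0 = no overlap, 1.0 = identical composition."""
    species = set(expected) | set(observed)
    return sum(min(expected.get(s, 0.0), observed.get(s, 0.0)) for s in species)

sim = percentage_similarity({"a": 0.5, "b": 0.5}, {"a": 0.3, "b": 0.7})
```

A value near 1.0 indicates a good fit between the test sample and the expected community at that conductivity.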

Relevance: 60.00%

Abstract:

Service-Oriented Computing (SOC) is a widely accepted paradigm for the development of flexible, distributed and adaptable software systems, in which service compositions perform more complex, higher-level, often cross-organizational tasks using atomic services or other service compositions. In such systems, Quality of Service (QoS) properties, such as performance, cost, availability or security, are critical to the usability of services and their compositions in concrete applications. Analysis of these properties can become more precise and richer in information if it employs program analysis techniques, such as complexity and sharing analyses, which are able to simultaneously take into account both the control and data structures, dependencies, and operations in a composition. Computation cost analysis for service composition can support predictive monitoring and proactive adaptation by automatically inferring upper and lower bounds on computation cost as functions of the value or size of input messages. These cost functions can be used for adaptation by selecting service candidates that minimize the total cost of the composition, based on the actual data passed to them. The cost functions can also be combined with empirically collected infrastructural parameters to produce QoS bound functions of input data that can be used to predict, at the moment of invocation, potential or imminent Service Level Agreement (SLA) violations.
In mission-critical compositions, effective and accurate continuous QoS prediction can be achieved by constraint modeling of composition QoS based on its structure, empirical runtime data, and (when available) the results of complexity analysis. This approach can be applied to service orchestrations with centralized flow control, as well as to choreographies with multiple participants engaging in complex stateful interactions. Sharing analysis can support adaptation actions, such as parallelization, fragmentation, and component selection, which are based on functional dependencies and the information content of the composition's messages, internal data, and activities, in the presence of complex control constructs such as loops, branches, and sub-workflows. Both the functional dependencies and the information content (described using user-defined attributes) can be expressed using a first-order logic (Horn clause) representation, and the analysis results can be interpreted as lattice-based conceptual models.
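The candidate-selection adaptation described above can be sketched as follows; the cost bound functions here are hypothetical placeholders standing in for bounds that a resource analysis would infer automatically.

```python
# Sketch of adaptation by candidate selection: each candidate service
# has an (assumed) inferred upper bound on computation cost as a
# function of the size n of the input message. The bound functions are
# hypothetical placeholders, not output of a real resource analysis.

candidates = {
    "svcA": lambda n: 5 + 2 * n,     # cheap fixed cost, steep growth
    "svcB": lambda n: 50 + 0.5 * n,  # expensive fixed cost, flat growth
}

def pick_candidate(n):
    """Choose the candidate whose upper-bound cost is lowest for the
    actual input size n, minimizing the composition's total cost."""
    return min(candidates, key=lambda name: candidates[name](n))
```

Note how the choice flips with the actual data size: the low-overhead service wins for small messages, the flat-growth service for large ones.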

Relevance: 40.00%

Abstract:

This paper addresses the following predictive business process monitoring problem: given the execution trace of an ongoing case, and given a set of traces of historical (completed) cases, predict the most likely outcome of the ongoing case. In this context, a trace refers to a sequence of events with corresponding payloads, where a payload consists of a set of attribute-value pairs. Meanwhile, an outcome refers to a label associated with completed cases, for example, a label indicating that a given case completed "on time" (with respect to a given desired duration) or "late", or a label indicating whether a given case led to a customer complaint. The paper tackles this problem via a two-phased approach. In the first phase, prefixes of historical cases are encoded using complex symbolic sequences and clustered. In the second phase, a classifier is built for each of the clusters. To predict the outcome of an ongoing case at runtime given its (uncompleted) trace, we select the closest cluster(s) to the trace in question and apply the respective classifier(s), taking into account the Euclidean distance of the trace from the center of the clusters. We consider two families of clustering algorithms, hierarchical clustering and k-medoids, and use random forests for classification. The approach was evaluated on four real-life datasets.
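The runtime step of the two-phased approach can be sketched as below. The cluster centers, encoded vectors, and stand-in classifiers are illustrative; the paper itself trains a random forest per cluster.

```python
import math

# Runtime phase of the two-phased approach: pick the cluster whose
# center is nearest (Euclidean) to the encoded prefix, then apply that
# cluster's classifier. Centers, vectors, and the lambda "classifiers"
# below are illustrative stand-ins for trained random forests.

def closest_cluster(vec, centers):
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(centers, key=lambda c: dist(vec, centers[c]))

centers = {"c1": [0.0, 0.0], "c2": [10.0, 10.0]}
classifiers = {"c1": lambda v: "on time", "c2": lambda v: "late"}

prefix_vec = [1.0, 2.0]
cluster = closest_cluster(prefix_vec, centers)
prediction = classifiers[cluster](prefix_vec)
```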

Relevance: 40.00%

Abstract:

Background. Recurrent nerve injury is one of the most important complications of thyroidectomy. During the last decade, nerve monitoring has gained increasing acceptance in several centers as a method to predict and to document nerve function at the end of the operation. We evaluated the efficacy of a nerve monitoring system in a series of patients who underwent thyroidectomy and critically analyzed the negative predictive value (NPV) and positive predictive value (PPV) of the method. Methods. NIM System efficacy was prospectively analyzed in 447 patients who underwent thyroidectomy between 2001 and 2008 (366 female/81 male; 420 white/47 nonwhite; 11 to 82 years of age; median, 43 years old). There were 421 total thyroidectomies and 26 partial thyroidectomies, leading to 868 nerves at risk. The gold standard to evaluate inferior laryngeal nerve function was early postoperative videolaryngoscopy, which was repeated after 4 to 6 months in all patients with abnormal endoscopic findings. Results. At the early evaluation, 858 nerves (98.8%) presented normal videolaryngoscopic features after surgery. Ten paretic/paralyzed nerves (1.2%) were detected (2 unexpected unilateral pareses, 2 unexpected bilateral pareses, 1 unexpected unilateral paralysis, 1 unexpected bilateral paralysis, and 1 expected unilateral paralysis). At the late videolaryngoscopy, only 2 permanent nerve paralyses were noted (0.2%), with an ultimate result of 99.8% functioning nerves. Nerve monitoring showed absent or markedly reduced electrical activity at the end of the operation in 25/868 nerves (2.9%), including all 10 endoscopically compromised nerves, with 15 false-positive results. There were no false-negative results. Therefore, the PPV was 40.0% and the NPV was 100%. Conclusions. In the present series, nerve monitoring had a very high NPV but a low PPV for the detection of recurrent nerve injury. (C) 2011 Wiley Periodicals, Inc. Head Neck 34: 175-179, 2012
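The reported predictive values follow directly from the series' counts: 25 of 868 nerves were flagged by monitoring (10 truly compromised on videolaryngoscopy, 15 false positives), with no false negatives among the remaining 843.

```python
# PPV and NPV computed from the series' own counts: 25 monitor-positive
# nerves (10 true positives, 15 false positives) and no false negatives
# among the remaining 868 - 25 = 843 nerves.

def ppv_npv(tp, fp, tn, fn):
    """Positive and negative predictive values from a 2x2 table."""
    return tp / (tp + fp), tn / (tn + fn)

ppv, npv = ppv_npv(tp=10, fp=15, tn=868 - 25, fn=0)
```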

Relevance: 40.00%

Abstract:

INTRODUCTION: The incidence of bloodstream infection (BSI) in extracorporeal life support (ECLS) is reported between 0.9 and 19.5%. In January 2006, the Extracorporeal Life Support Organization (ELSO) reported an overall incidence of 8.78%, distributed as follows: respiratory: 6.5% (neonatal), 20.8% (pediatric); cardiac: 8.2% (neonatal) and 12.6% (pediatric). METHOD: At BC Children's Hospital (BCCH), daily surveillance blood cultures (BC) are performed and antibiotic prophylaxis is not routinely recommended. Positive BCs (BC+) were reviewed, including resistance profiles, collection time of BC+, time to positivity and mortality. White blood cell count, absolute neutrophil count, immature/total ratio, platelet count, fibrinogen and lactate were analyzed 48, 24 and 0 h prior to BSI. A univariate linear regression analysis was performed. RESULTS: From 1999 to 2005, 89 patients underwent ECLS. After exclusions, 84 patients were reviewed. The attack rate was 22.6% (19 BSI) and 13.1% after exclusion of coagulase-negative staphylococci (n = 8). BSI patients were significantly longer on ECLS (157 h) compared to the no-BSI group (127 h, 95% CI: 106-148). Six BSI patients died on ECLS (35%; 4 congenital diaphragmatic hernias, 1 hypoplastic left heart syndrome and 1 after a tetralogy repair). BCCH survival was 71% on ECLS and 58% at discharge, which is comparable to previous reports. No patient died primarily because of BSI. No BSI predictor was identified, although lactate may show a decreasing trend before BSI (P = 0.102). CONCLUSION: Compared with ELSO, the studied BSI incidence was higher, with comparable mortality. We speculate that our BSI rate is explained by underreporting of "contaminants" in the literature, the use of broad-spectrum antibiotic prophylaxis and a higher yield with daily monitoring BCs. We support daily surveillance blood cultures as an alternative to antibiotic prophylaxis in the management of patients on ECLS.
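The reported attack rates can be reproduced from the patient counts in the abstract: 19 BSI episodes among 84 reviewed ECLS patients, 8 of which were coagulase-negative staphylococci.

```python
# Reproducing the attack rates from the abstract's counts: 19 BSI
# episodes among 84 reviewed patients, 8 excluded as coagulase-negative
# staphylococci.

def attack_rate_percent(cases, patients):
    return 100.0 * cases / patients

overall = attack_rate_percent(19, 84)          # all BSI episodes
excluding_cons = attack_rate_percent(19 - 8, 84)  # excluding CoNS
```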

Relevance: 40.00%

Abstract:

The increased use of vancomycin in hospitals has resulted in a standard practice of monitoring serum vancomycin levels because of possible nephrotoxicity. However, the routine monitoring of vancomycin serum concentration is under criticism, and the cost effectiveness of such routine monitoring is in question, because frequent monitoring results in neither increased efficacy nor decreased nephrotoxicity. The purpose of the present study is to determine factors that may place patients at increased risk of developing vancomycin-induced nephrotoxicity and for whom monitoring may be most beneficial.
From September to December 1992, 752 consecutive inpatients at The University of Texas M. D. Anderson Cancer Center, Houston, were prospectively evaluated for nephrotoxicity in order to describe predictive risk factors for developing vancomycin-related nephrotoxicity. Ninety-five patients (13 percent) developed nephrotoxicity. A total of 299 patients (40 percent) were considered monitored (vancomycin serum levels determined during the course of therapy), and 346 patients (46 percent) were receiving concurrent moderate to highly nephrotoxic drugs.
Factors found to be significantly associated with nephrotoxicity in univariate analysis were: gender, baseline serum creatinine greater than 1.5 mg/dl, monitoring, leukemia, concurrent moderate to highly nephrotoxic drugs, and APACHE III scores of 40 or more. Significant factors in the univariate analysis were then entered into a stepwise logistic regression analysis to determine independent predictive risk factors for vancomycin-induced nephrotoxicity. Factors, with their corresponding odds ratios and 95% confidence limits, selected by stepwise logistic regression analysis to be predictive of vancomycin-induced nephrotoxicity were: concurrent therapy with moderate to highly nephrotoxic drugs (2.89; 1.76-4.74), APACHE III scores of 40 or more (1.98; 1.16-3.38), and male gender (1.98; 1.04-2.71).
Subgroup (monitored and non-monitored) analysis showed that male gender (OR = 1.87; 95% CI = 1.01, 3.45) and moderate to highly nephrotoxic drugs (OR = 4.58; 95% CI = 2.11, 9.94) were significant for nephrotoxicity in monitored patients. However, only APACHE III score (OR = 2.67; 95% CI = 1.13, 6.29) was significant for nephrotoxicity in non-monitored patients.
The conclusion drawn from this study is that not every patient receiving vancomycin therapy needs frequent monitoring of vancomycin serum levels. Such routine monitoring may be appropriate in patients with one or more of the identified risk factors, while low-risk patients need not be subjected to the discomfort and added cost of multiple blood sampling. Such prudent selection of patients to monitor may decrease costs to patients and the hospital.
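The odds ratios and confidence limits reported above are the usual exponentiated logistic-regression coefficients. A minimal sketch follows; the coefficient and standard error are assumptions chosen to approximately reproduce the reported OR for concurrent nephrotoxic drugs, not values taken from the study.

```python
import math

# An odds ratio from logistic regression is exp(beta); its 95% CI
# exponentiates beta +/- 1.96*SE. The beta and SE below are assumed
# values chosen to roughly match the reported OR of 2.89 (1.76-4.74).

def odds_ratio_ci(beta, se, z=1.96):
    """Return (odds ratio, lower 95% limit, upper 95% limit)."""
    return math.exp(beta), math.exp(beta - z * se), math.exp(beta + z * se)

or_, lo, hi = odds_ratio_ci(beta=1.06, se=0.25)
```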

Relevance: 30.00%

Abstract:

In 1984, the International Agency for Research on Cancer determined that working in the primary aluminium production process was associated with exposure to certain polycyclic aromatic hydrocarbons (PAHs) that are probably carcinogenic to humans. Key sources of PAH exposure within the occupational environment of a prebake aluminium smelter are processes associated with use of coal-tar pitch. Despite the potential for exposure via inhalation, ingestion and dermal absorption, to date occupational exposure limits exist only for airborne contaminants. This study, based at a prebake aluminium smelter in Queensland, Australia, compares exposures of workers who came in contact with PAHs from coal-tar pitch in the smelter’s anode plant (n = 69) and cell-reconstruction area (n = 28), and a non-production control group (n = 17). Literature relevant to PAH exposures in industry and methods of monitoring and assessing occupational hazards associated with these compounds are reviewed, and methods relevant to PAH exposure are discussed in the context of the study site. The study utilises air monitoring of PAHs to quantify exposure via the inhalation route and biological monitoring of 1-hydroxypyrene (1-OHP) in urine of workers to assess total body burden from all routes of entry. Exposures determined for similar exposure groups, sampled over three years, are compared with published occupational PAH exposure limits and/or guidelines. Results of paired personal air monitoring samples and samples collected for 1-OHP in urine monitoring do not correlate. Predictive ability of the benzene-soluble fraction (BSF) in personal air monitoring in relation to the 1-OHP levels in urine is poor (adjusted R2 < 1%) even after adjustment for potential confounders of smoking status and use of personal protective equipment.
For static air BSF levels in the anode plant, the median was 0.023 mg/m3 (range 0.002–0.250), almost twice as high as in the cell-reconstruction area (median = 0.013 mg/m3, range 0.003–0.154). In contrast, median BSF personal exposure in the anode plant was 0.036 mg/m3 (range 0.003–0.563), significantly lower than the median measured in the reconstruction area (0.054 mg/m3, range 0.003–0.371) (p = 0.041). The observation that median 1-OHP levels in urine were significantly higher in the anode plant than in the reconstruction area (6.62 µmol/mol creatinine, range 0.09–33.44 and 0.17 µmol/mol creatinine, range 0.001–2.47, respectively) parallels the static air measurements of BSF rather than the personal air monitoring results (p < 0.001). Results of air measurements and biological monitoring show that tasks associated with paste mixing and anode forming in the forming area of the anode plant resulted in higher PAH exposure than tasks in the non-forming areas; median 1-OHP levels in urine from workers in the forming area (14.20 µmol/mol creatinine, range 2.02–33.44) were almost four times higher than those obtained from workers in the non-forming area (4.11 µmol/mol creatinine, range 0.09–26.99; p < 0.001). Results justify use of biological monitoring as an important adjunct to existing measures of PAH exposure in the aluminium industry. Although monitoring of 1-OHP in urine may not be an accurate measure of biological effect on an individual, it is a better indicator of total PAH exposure than BSF in air. In January 2005, interim study results prompted a plant management decision to modify control measures to reduce skin exposure. Comparison of 1-OHP in urine from workers pre- and post-modifications showed substantial downward trends. Exposure via the dermal route was identified as a contributor to overall dose. Reduction in 1-OHP urine concentrations achieved by reducing skin exposure demonstrate the importance of exposure via this alternative pathway. 
Finally, control measures are recommended to ameliorate risk associated with PAH exposure in the primary aluminium production process, and suggestions for future research include development of methods capable of more specifically monitoring carcinogenic constituents of PAH mixtures, such as benzo[a]pyrene.
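The adjusted R² quoted above (< 1%) penalizes the ordinary R² for the number of predictors in the model. A minimal sketch follows; the sample size and predictor count are illustrative assumptions, not the study's exact values.

```python
# Adjusted R^2 for n observations and p predictors: it shrinks the
# ordinary R^2 toward zero as predictors are added. The n and p used
# in the example are illustrative, not taken from the study.

def adjusted_r2(r2, n, p):
    return 1 - (1 - r2) * (n - 1) / (n - p - 1)

adj = adjusted_r2(r2=0.05, n=69, p=3)  # always below the raw R^2
```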

Relevance: 30.00%

Abstract:

Autonomous Underwater Vehicles (AUVs) are revolutionizing oceanography through their versatility, autonomy and endurance. However, they are still an underutilized technology. For coastal operations, the ability to track a certain feature is of interest to ocean scientists. Adaptive and predictive path planning requires frequent communication with significant data transfer. Currently, most AUVs rely on satellite phones as their primary means of communication. This communication protocol is expensive and slow. To reduce communication costs and provide adequate data transfer rates, we present a hardware modification along with a software system that provides an alternative robust disruption-tolerant communications framework enabling cost-effective glider operation in coastal regions. The framework is specifically designed to address multi-sensor deployments. We provide a system overview and present testing and coverage data for the network. Additionally, we include an application of ocean-model driven trajectory design, which can benefit from the use of this network and communication system. Simulation and implementation results are presented for single and multiple vehicle deployments. The presented combination of infrastructure, software development and deployment experience brings us closer to the goal of providing a reliable and cost-effective data transfer framework to enable real-time, optimal trajectory design, based on ocean model predictions, to gather in situ measurements of interesting and evolving ocean features and phenomena.
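The disruption-tolerant idea, buffering data while the link is down and delivering it in order when connectivity returns, can be sketched as a minimal store-and-forward queue. This is a toy model, not the deployed framework.

```python
# Minimal store-and-forward sketch of disruption-tolerant networking:
# messages queue while the link is down and flush in order when the
# link returns. A toy model, not the paper's framework.

class DTNQueue:
    def __init__(self):
        self.pending = []
        self.delivered = []
        self.link_up = False

    def send(self, msg):
        (self.delivered if self.link_up else self.pending).append(msg)

    def link_restored(self):
        self.link_up = True
        self.delivered.extend(self.pending)
        self.pending.clear()

q = DTNQueue()
q.send("ctd_cast_1")   # queued: link is down
q.send("ctd_cast_2")
q.link_restored()      # both messages delivered in order
q.send("ctd_cast_3")   # delivered immediately
```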

Relevance: 30.00%

Abstract:

Aims: This paper describes the development of a risk adjustment (RA) model predictive of individual lesion treatment failure in percutaneous coronary interventions (PCI) for use in a quality monitoring and improvement program. Methods and results: Prospectively collected data for 3972 consecutive revascularisation procedures (5601 lesions) performed between January 2003 and September 2011 were studied. Data on procedures to September 2009 (n = 3100) were used to identify factors predictive of lesion treatment failure. Factors identified included lesion risk class (p < 0.001), occlusion type (p < 0.001), patient age (p = 0.001), vessel system (p < 0.04), vessel diameter (p < 0.001), unstable angina (p = 0.003) and presence of major cardiac risk factors (p = 0.01). A Bayesian RA model was built using these factors, with the predictive performance of the model tested on the remaining procedures (area under the receiver operating characteristic curve: 0.765, Hosmer–Lemeshow p value: 0.11). Cumulative sum, exponentially weighted moving average and funnel plots were constructed using the RA model and subjectively evaluated. Conclusion: An RA model was developed and applied to statistical process control (SPC) monitoring for lesion failure in a PCI database. If linked to appropriate quality improvement governance response protocols, SPC using this RA tool might improve quality control and risk management by identifying variation in performance based on a comparison of observed and expected outcomes.
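Of the control charts mentioned, the exponentially weighted moving average is the easiest to sketch: each plotted point blends the newest observation with the running statistic. The smoothing weight and observations below are illustrative, not the study's data.

```python
# Exponentially weighted moving average (EWMA), one of the SPC charts
# mentioned above: z_t = lam * x_t + (1 - lam) * z_{t-1}. The weight
# lam and the observations are illustrative.

def ewma(observations, lam=0.2, start=0.0):
    z, out = start, []
    for x in observations:
        z = lam * x + (1 - lam) * z
        out.append(z)
    return out
```

In a risk-adjusted chart, each observation would be the difference between the observed outcome and the RA model's expected failure probability.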

Relevance: 30.00%

Abstract:

Purpose: To develop, using dacarbazine as a model, reliable techniques for measuring DNA damage and repair as pharmacodynamic endpoints for patients receiving chemotherapy. Methods: A group of 39 patients with malignant melanoma were treated with dacarbazine 1 g/m2 i.v. every 21 days. Tamoxifen 20 mg daily was commenced 24 h after the first infusion and continued until 3 weeks after the last cycle of chemotherapy. DNA strand breaks formed during dacarbazine-induced DNA damage and repair were measured in individual cells by the alkaline comet assay. DNA methyl adducts were quantified by measuring urinary 3-methyladenine (3-MeA) excretion using immunoaffinity ELISA. Venous blood was taken on cycles 1 and 2 for separation of peripheral blood lymphocytes (PBLs) for measurement of DNA strand breaks. Results: Wide interpatient variation in PBL DNA strand breaks occurred following chemotherapy, with a peak at 4 h (median 26.6, interquartile range 14.75-40.5) and incomplete repair by 24 h. Similarly, there was a range of 3-MeA excretion, with peak levels 4-10 h after chemotherapy (median 33 nmol/h, interquartile range 20.4-48.65 nmol/h). Peak 3-MeA excretion was positively correlated with DNA strand breaks at 4 h (Spearman's correlation coefficient, r = 0.39, P = 0.036) and 24 h (r = 0.46, P = 0.01). Drug-induced emesis correlated with PBL DNA strand breaks (Mann-Whitney U-test, P = 0.03) but not with peak 3-MeA excretion. Conclusions: DNA damage and repair following cytotoxic chemotherapy can be measured in vivo by the alkaline comet assay and by urinary 3-MeA excretion in patients receiving chemotherapy.
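Spearman's correlation coefficient used above is the Pearson correlation computed on ranks; a minimal tie-free sketch:

```python
# Spearman rank correlation: Pearson correlation of the rank vectors.
# This sketch assumes no tied values (no tie correction applied).

def spearman(x, y):
    def ranks(v):
        order = sorted(range(len(v)), key=lambda i: v[i])
        r = [0.0] * len(v)
        for rank, i in enumerate(order, start=1):
            r[i] = rank
        return r
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    num = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    den = (sum((a - mx) ** 2 for a in rx) *
           sum((b - my) ** 2 for b in ry)) ** 0.5
    return num / den
```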

Relevance: 30.00%

Abstract:

There is increased interest in the use of Unmanned Aerial Vehicles (UAVs) for wildlife and feral animal monitoring around the world. This paper describes a novel system in which a predictive dynamic application places the UAV ahead of a user, with a low-cost thermal camera and a small onboard computer that identifies heat signatures of a target animal from a predetermined altitude and transmits that target's GPS coordinates. A map is generated, and various data sets and graphs are displayed using a GUI designed for ease of use. The paper describes the hardware and software architecture and the probabilistic model for the downward-facing camera used for the detection of an animal. Behavioral dynamics of target movement inform the design of a Kalman filter and a Markov-model-based prediction algorithm used to place the UAV ahead of the user. Geometrical concepts and the Haversine formula are applied to the maximum likelihood case in order to make a prediction regarding a future state of the user, thus delivering a new waypoint for autonomous navigation. Results show that the system is capable of autonomously locating animals from a predetermined height and generating a map showing the location of the animals ahead of the user.
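The Haversine formula mentioned above gives the great-circle distance between two GPS coordinates, which is how distances between waypoints on the Earth's surface are computed:

```python
import math

# Haversine great-circle distance between two (lat, lon) points in
# degrees, returned in kilometres (mean Earth radius assumed 6371 km).

def haversine_km(lat1, lon1, lat2, lon2, r=6371.0):
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2 +
         math.cos(p1) * math.cos(p2) * math.sin(dlam / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))
```

One degree of longitude at the equator is roughly 111.2 km, which the function reproduces.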

Relevance: 30.00%

Abstract:

The Taita Hills in southeastern Kenya form the northernmost part of Africa’s Eastern Arc Mountains, which have been identified by Conservation International as one of the top ten biodiversity hotspots on Earth. As with many areas of the developing world, over recent decades the Taita Hills have experienced significant population growth leading to associated major changes in land use and land cover (LULC), as well as escalating land degradation, particularly soil erosion. Multi-temporal medium resolution multispectral optical satellite data, such as imagery from the SPOT HRV, HRVIR, and HRG sensors, provides a valuable source of information for environmental monitoring and modelling at a landscape level at local and regional scales. However, utilization of multi-temporal SPOT data in quantitative remote sensing studies requires the removal of atmospheric effects and the derivation of surface reflectance factor. Furthermore, for areas of rugged terrain, such as the Taita Hills, topographic correction is necessary to derive comparable reflectance throughout a SPOT scene. Reliable monitoring of LULC change over time and modelling of land degradation and human population distribution and abundance are of crucial importance to sustainable development, natural resource management, biodiversity conservation, and understanding and mitigating climate change and its impacts. The main purpose of this thesis was to develop and validate enhanced processing of SPOT satellite imagery for use in environmental monitoring and modelling at a landscape level, in regions of the developing world with limited ancillary data availability. 
The Taita Hills formed the application study site, whilst the Helsinki metropolitan region was used as a control site for validation and assessment of the applied atmospheric correction techniques, where multiangular reflectance field measurements were taken and where horizontal visibility meteorological data concurrent with image acquisition were available. The proposed historical empirical line method (HELM) for absolute atmospheric correction was found to be the only applied technique that could derive surface reflectance factor within an RMSE of < 0.02 in the SPOT visible and near-infrared bands; an accuracy level identified as a benchmark for successful atmospheric correction. A multi-scale segmentation/object relationship modelling (MSS/ORM) approach was applied to map LULC in the Taita Hills from the multi-temporal SPOT imagery. This object-based procedure was shown to deliver significant improvements over a uni-scale maximum-likelihood technique. The derived LULC data were used in combination with low-cost GIS geospatial layers describing elevation, rainfall and soil type to model degradation in the Taita Hills in the form of potential soil loss, utilizing the simple universal soil loss equation (USLE). Furthermore, human population distribution and abundance were modelled with satisfactory results using only SPOT and GIS derived data and non-Gaussian predictive modelling techniques. The SPOT derived LULC data were found to be unnecessary as a predictor because the first- and second-order image texture measurements had greater power to explain variation in dwelling unit occurrence and abundance. The ability of the procedures to be implemented locally in the developing world using low-cost or freely available data and software was considered. The techniques discussed in this thesis are considered equally applicable to other medium- and high-resolution optical satellite imagery, as well as the utilized SPOT data.
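The USLE used for the soil-loss modelling is a plain product of factors; the factor values below are illustrative placeholders, not values from the Taita Hills study.

```python
# Universal Soil Loss Equation: A = R * K * LS * C * P, estimating
# potential mean annual soil loss (e.g. t/ha/yr) from rainfall
# erosivity (R), soil erodibility (K), slope length/steepness (LS),
# cover management (C) and support practice (P). Values illustrative.

def usle_soil_loss(r, k, ls, c, p):
    return r * k * ls * c * p
```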

Relevance: 30.00%

Abstract:

Four species of large mackerels (Scomberomorus spp.) co-occur in the waters off northern Australia and are important to fisheries in the region. State fisheries agencies monitor these species for fisheries assessment; however, data inaccuracies may exist due to difficulties with identification of these closely related species, particularly when specimens are incomplete from fish processing. This study examined the efficacy of using otolith morphometrics to differentiate and predict among the four mackerel species off northeastern Australia. Seven otolith measurements and five shape indices were recorded from 555 mackerel specimens. Multivariate modelling, including linear discriminant analysis (LDA) and support vector machines, successfully differentiated among the four species based on otolith morphometrics. Cross validation determined a predictive accuracy of at least 96% for both models. An optimum predictive model for the four mackerel species was an LDA model that included fork length, feret length, feret width, perimeter, area, roundness, form factor and rectangularity as explanatory variables. This analysis may improve the accuracy of fisheries monitoring, the estimates based on this monitoring (e.g., mortality rate) and the overall management of mackerel species in Australia.
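The discriminant step can be illustrated with a toy nearest-centroid classifier over otolith measurements; the feature values and species labels below are hypothetical, and the study's actual LDA model uses the eight explanatory variables listed above.

```python
# Toy stand-in for the discriminant step: assign a specimen to the
# species whose mean otolith feature vector (e.g. feret length, area)
# is closest in Euclidean distance. Feature values and species labels
# are hypothetical, not measurements from the study.

def nearest_centroid_predict(sample, centroids):
    def d2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(centroids, key=lambda sp: d2(sample, centroids[sp]))

centroids = {"species_A": [40.0, 10.0], "species_B": [20.0, 5.0]}
```

A real workflow would fit LDA on the 555 specimens and report cross-validated accuracy, as the study does.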