748 results for "best practice process model"
Abstract:
The Centers for Disease Control and Prevention (CDC) estimates that more than 2 million patients annually acquire an infection while hospitalized in U.S. hospitals for other health problems, and that 88,000 die as a direct or indirect result of these infections. Infection with Clostridium difficile is the most common cause of health care-associated infectious diarrhea in industrialized countries. The purpose of this study was to explore the cost of the current treatment practice of beginning empiric metronidazole treatment for hospitalized patients with diarrhea prior to identification of an infectious agent. The records of 70 hospitalized patients were retrospectively analyzed to determine the pharmacologic treatment, laboratory testing, and radiographic studies ordered, and the median cost of each was determined. All patients in the study were tested for C. difficile and concurrently started on empiric metronidazole. The median direct cost for metronidazole was $7.25 per patient (95% CI 5.00, 12.721). The median direct cost for laboratory charges was $468.00 (95% CI 339.26, 552.58), and for radiology the median direct cost was $970.00 (95% CI 738.00, 3406.91). Indirect costs, which are far greater than direct costs, were not studied. At St. Luke's, if every hospitalized patient with diarrhea was empirically treated with metronidazole at a median cost of $7.25, the annual direct cost is estimated to be over $9,000.00, plus uncalculated indirect costs. In the U.S., the estimated annual direct cost may be as much as $21,750,000.00, plus indirect costs. An unexpected and significant finding of this study was the inconsistency in the testing and treatment of patients with health care-associated diarrhea. A best-practice model for C. difficile testing and treatment was not found in the literature review.
In addition to the cost savings gained by not routinely beginning empiric treatment with metronidazole, significant savings and improvements in patient care may result from a more consistent approach to the diagnosis and treatment of all patients with health care-associated diarrhea. A decision tree model for C. difficile testing and treatment is proposed, but further research is needed to evaluate the decision arms before a validated best-practice model can be put forward.
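As an illustration only (the study publishes no code), the abstract's cost extrapolation is simple arithmetic; the patient volumes below are inferred from the reported totals, not stated in the study:

```python
# Back-of-the-envelope extrapolation of the direct drug-cost figures
# reported in the abstract. Patient volumes are inferred assumptions.
MEDIAN_METRONIDAZOLE_COST = 7.25  # USD per patient (median)

def annual_direct_cost(patients_per_year, cost_per_patient=MEDIAN_METRONIDAZOLE_COST):
    """Estimated annual direct drug cost if every hospitalized patient
    with diarrhea is treated empirically with metronidazole."""
    return patients_per_year * cost_per_patient

# ~1,250 such patients/year reproduces the ~$9,000 hospital-level figure,
# and ~3,000,000 patients/year the ~$21,750,000 national figure.
print(annual_direct_cost(1_250))      # 9062.5
print(annual_direct_cost(3_000_000))  # 21750000.0
```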
Abstract:
The purpose of this thesis is to identify "best practice" recommendations for successful implementation of the EPSDT outreach program at Memorial Health System's Hospital for Children in Colorado Springs through a policy analysis of Medicaid EPSDT services in Colorado. A successful program at Memorial will increase education and awareness of EPSDT services, enrollment, and access to and utilization of health care services for eligible children. The methodology used in this study included questionnaires designed for the EPSDT contract administrator and outreach coordinators/workers; analysis of current federal and state policies; and studies conducted at the federal and state level and by various advocacy groups. The need for this analysis of EPSDT arose in part from an awareness of the increasingly high numbers of children who live in poverty and are uninsured. Though the percentage of children living in poverty in Colorado is slightly below the national average (see Table 2), according to data analyzed by The Annie E. Casey Foundation, the percentage of children (0-18) living in poverty in Colorado increased from 10% in 2000 to 16% in 2006, a dramatic rise of 60% surpassed by only one other state in the nation (The Annie E. Casey Foundation, 2008). By comparison, the U.S. percentage of children in poverty during the same time frame rose from 17% to 18% (The Annie E. Casey Foundation, 2008). What kinds of health care services are available to this vulnerable and growing group of Coloradans, and what barriers affect their enrollment in, access to, and utilization of these services? Barriers identified included difficulty with the application process; system and process issues; a lack of providers; and a lack of awareness and knowledge of EPSDT. Fiscal constraints and legislation at the federal and state level are also barriers to increasing enrollment and access to services.
Outreach services are a critical component of providing EPSDT services, and several recommendations regarding outreach and case management emerged that will benefit the program in the future. Through this analysis and the identification of a broad range of barriers, a clearer picture emerged of the current challenges within the EPSDT program, as well as a broad range of strategies and recommendations to address them. Through increased education and advocacy for EPSDT and the services it encompasses; stronger collaboration and cooperation between all groups involved, including providing a Medical Home for all eligible children; and new legislation putting more money and focus on comprehensive health care for low-income uninsured children, enrollment in, access to, and utilization of developmentally appropriate, quality health care services can be achieved.
Abstract:
Public participation is an integral part of Environmental Impact Assessment (EIA) and, as such, has been incorporated into regulatory norms. Assessment of the effectiveness of public participation has remained elusive, however. This is partly due to the difficulty of identifying appropriate effectiveness criteria. This research uses Q methodology to discover and analyze stakeholders' social perspectives on the effectiveness of EIAs in the Western Cape, South Africa. It considers two case studies (the Main Road and Saldanha Bay EIAs) for contextual participant perspectives on effectiveness based on the participants' experience. It further considers the more general opinion of provincial consent regulator staff at the Department of Environmental Affairs and the Department of Planning (DEA&DP). Two main themes of investigation are drawn from the South African National Environmental Management Act (NEMA) imperative for effectiveness: firstly, the participation procedure, and secondly, the stakeholder capabilities necessary for effective participation. Four theoretical frameworks drawn from planning, politics and EIA theory are adapted to public participation and used to triangulate the analysis and discussion of the revealed social perspectives. They consider citizen power in deliberation, Habermas' preconditions for the Ideal Speech Situation (ISS), a Foucauldian perspective on knowledge, power and politics, and a Capabilities Approach to public participation effectiveness. The empirical evidence from this research shows that the capacity and contextual constraints faced by participants underscore the need for the legislative imperatives for effective participation set out in the NEMA. The implementation of effective public participation has been shown to be a complex, dynamic and sometimes nebulous practice. The functional level of participant understanding of the process was found to be wide-ranging, resulting in unequal and dissatisfying stakeholder engagements.
Furthermore, the considerable variance in stakeholder capabilities in the South African social context resulted in inequalities in deliberation. The social perspectives revealed significant differences in participant experience in terms of citizen power in deliberation. The ISS preconditions are highly contested in both the Saldanha Bay EIA case study and the DEA&DP social perspectives. Only one Main Road EIA case study social perspective considered Foucault's notion of governmentality a reality in EIA public participation. The freedom to control one's environment, based on a Capabilities Approach, is a highly contested notion. Although agreed with in principle, all of the social perspectives indicate that contextual and capacity realities constrain its realisation. This research has shown that Q method can be applied to EIA public participation in South Africa and that, with the appropriate research or monitoring applications, it could serve as a useful feedback tool to inform best-practice public participation.
Abstract:
In 2005, the International Ocean Colour Coordinating Group (IOCCG) convened a working group to examine the state of the art in ocean colour data merging, which showed that the research techniques had matured sufficiently for creating long multi-sensor datasets (IOCCG, 2007). As a result, ESA initiated and funded the DUE GlobColour project (http://www.globcolour.info/) to develop a satellite-based ocean colour data set to support global carbon-cycle research. It aims to satisfy the scientific requirement for a long (10+ year) time series of consistently calibrated global ocean colour information with the best possible spatial coverage. This has been achieved by merging data from the three most capable sensors: SeaWiFS on GeoEye's Orbview-2 mission, MODIS on NASA's Aqua mission and MERIS on ESA's ENVISAT mission. In setting up the GlobColour project, three user organisations were invited to help. Their roles are to specify the detailed user requirements, act as a channel to the broader end-user community, and provide feedback and assessment of the results. The International Ocean Carbon Coordination Project (IOCCP), based at UNESCO in Paris, provides direct access to the carbon-cycle modelling community's requirements and to the modellers themselves who will use the final products. The UK Met Office's National Centre for Ocean Forecasting (NCOF) in Exeter, UK, provides an understanding of the requirements of oceanography users, and the IOCCG brings its understanding of global user needs and valuable advice on best practice within the ocean colour science community. The three-year project kicked off in November 2005 under the leadership of ACRI-ST (France). The first year was a feasibility demonstration phase that was successfully concluded at a user consultation workshop organised by the Laboratoire d'Océanographie de Villefranche, France, in December 2006.
Error statistics and inter-sensor biases were quantified by comparison with in situ measurements from moored optical buoys and ship-based campaigns, and used as an input to the merging. The second year was dedicated to the production of the time series. In total, more than 25 TB of input (level 2) data have been ingested and 14 TB of intermediate and output products created, with 4 TB of data distributed to the user community. Quality control (QC) is provided through the Diagnostic Data Sets (DDS), which are extracted sub-areas covering locations of in situ data collection or interesting oceanographic phenomena. This Full Product Set (FPS) covers global daily merged ocean colour products for the period 1997-2006 and is freely available to the worldwide science community at http://www.globcolour.info/data_access_full_prod_set.html. The GlobColour service distributes global daily, 8-day and monthly data sets at 4.6 km resolution for chlorophyll-a concentration, normalised water-leaving radiances (412, 443, 490, 510, 531, 555, 620, 670, 681 and 709 nm), diffuse attenuation coefficient, coloured dissolved and detrital organic materials, total suspended matter or particulate backscattering coefficient, turbidity index, cloud fraction and quality indicators. Error statistics from the initial sensor characterisation are used as an input to the merging methods and propagate through the merging process to provide error estimates for the output merged products. These error estimates are a key component of GlobColour, as they are invaluable to the users, particularly the modellers who need them in order to assimilate the ocean colour data into ocean simulations. An intensive phase of validation has been undertaken to assess the quality of the data set. In addition, inter-comparisons between the different merged datasets will help to further refine the techniques used.
Both the final products and the quality assessment were presented at a second user consultation in Oslo on 20-22 November 2007, organised by the Norwegian Institute for Water Research (NIVA); presentations are available on the GlobColour website. At the request of the ESA Technical Officer for the GlobColour project, the FPS data set was mirrored in the PANGAEA data library.
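The abstract notes that per-sensor error statistics propagate through the merging to yield error estimates for the merged products. A generic way to achieve this (a sketch only, not necessarily the merging algorithm GlobColour actually uses) is inverse-variance weighting of co-located sensor observations for each pixel:

```python
import math

def merge_observations(values, sigmas):
    """Inverse-variance weighted merge of per-sensor observations.

    `values` are co-located measurements (e.g. chlorophyll-a from
    SeaWiFS, MODIS and MERIS) and `sigmas` their 1-sigma errors from
    the sensor characterisation. Returns the merged value and its
    propagated 1-sigma error.
    """
    weights = [1.0 / s ** 2 for s in sigmas]
    merged = sum(w * v for w, v in zip(weights, values)) / sum(weights)
    merged_sigma = math.sqrt(1.0 / sum(weights))
    return merged, merged_sigma

# Hypothetical chlorophyll-a values (mg m^-3) for one pixel:
value, sigma = merge_observations([0.21, 0.25, 0.23], [0.05, 0.04, 0.06])
```

With this scheme the merged error is always smaller than the best single-sensor error, which is what makes propagated error estimates useful to modellers assimilating the data.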
Abstract:
Nowadays, PBL is considered a suitable methodology for engineering education. But making the most of this methodology requires certain features, such as multidisciplinarity, ill-structured problems, teamwork and autonomous research, that are sometimes not easy to achieve. In fact, traditional university systems, including curricula, teaching methodologies, assessment and regulation, do not help the implementation of these features. Firstly, we look at the main differences between a traditional system and the Aalborg model, considered a reference point in PBL. Then, this work aims to detect the main obstacles that an established traditional system presents to PBL implementation. A multifaceted PBL experience covering three different disciplines leads us to analyse these difficulties, order them according to their importance, and decide which changes should come first. Finally, we propose a straightforward introduction of generic competences into the curricula aimed at supporting the use of Problem-Based, Project-Organized Learning.
Abstract:
Ideas concerning problem-based learning (PBL), developed after running different experiences at several Spanish universities, are discussed. The driver for introducing PBL has been the requirement that engineering students study Mathematics. A hybrid problem-based learning methodology for Mathematics in engineering studies is proposed. The model combines formal lectures with practical and laboratory sessions and autonomous small projects.
Abstract:
The elaboration of a generic decision-making strategy to address the evolution of an emergency situation, from the response stage to recovery and including a planning stage, can facilitate timely, effective and consistent decision making by the response organisations at every level within the emergency management structure and between countries, helping to ensure optimal protection of health, the environment and society. The degree of involvement of stakeholders in this process is a key strategic element for strengthening local preparedness and response and can contribute to a successful countermeasures strategy. Significant progress was made with the multi-national European project EURANOS (2004-2009), which brought together best practice, knowledge and technology to enhance Europe's preparedness to respond to any radiation emergency and long-term contamination. The subsequent establishment of a European Technology Platform and the recent launch of the research project NERIS-TP ("Towards a self-sustaining European Technology Platform (NERIS-TP) on Preparedness for Nuclear and Radiological Emergency Response and Recovery") aim to complete the remaining tasks for achieving appropriate levels of emergency preparedness at the local level in most European countries. One of the objectives of the NERIS-TP project is to strengthen preparedness at the local/national level by setting up dedicated fora and by developing new tools or adapting the tools developed within the EURANOS project (such as the governance framework for preparedness, the handbooks on countermeasures, the RODOS system, and the MOIRA DSS for long-term contamination in catchments) to meet the needs of local communities.
CIEMAT and UPM, in close interaction with the Nuclear Safety Council, will explore within this project the use and application in Spain of such technical tools, including other national tools and information and communication strategies, to foster cooperation between local, national and international stakeholders. The aim is to identify and involve relevant stakeholders in emergency preparedness in order to improve the development and implementation of appropriate protection strategies as part of consequence management and the transition to recovery. This paper presents an overview of the state of the art in this area in Spain, together with the methodology and work plan proposed by the Spanish group within the NERIS project to increase stakeholder involvement in preparedness for emergency response and recovery.
Abstract:
Advances in hardware make it possible to collect huge volumes of data, and applications are emerging that must deliver information in near-real time, e.g., patient monitoring or health monitoring of water pipes. The data streaming model emerges to meet the needs of these applications, in contrast to the traditional store-then-process model. In the store-then-process model, data are stored and later queried; in streaming systems, data are processed on arrival, producing continuous responses without ever being stored in full. This view imposes challenges for processing data on the fly: 1) responses must be produced continuously whenever new data arrive in the system; 2) data are accessed only once and, in general, are not kept in their entirety; and 3) the per-item processing time needed to produce a response must be low. Two models exist for computing continuous responses, the evolving model and the sliding-window model; the latter fits certain applications better because it considers only the most recently received data rather than the whole history. In recent years, research on data stream mining has focused mainly on the evolving model, and the body of work on the sliding-window model is smaller, since such algorithms must not only be incremental but must also delete the information that expires as the window slides, while still meeting the three challenges above. One of the fundamental tasks in data mining is clustering: given a data set, the goal is to find representative groups that provide a concise description of the set.
Clustering is critical in applications such as network intrusion detection or customer segmentation in marketing and advertising. Because of the massive amounts of data that must be processed in such applications (up to millions of events per second), centralized solutions may be unable to meet processing-time restrictions and must resort to discarding data during load peaks. To avoid this loss of data, stream processing must be distributed; in particular, clustering algorithms must be adapted to environments in which the data are distributed. In streaming, research focuses not only on designs for general tasks, such as clustering, but also on finding new approaches that fit particular scenarios better; for example, an ad hoc grouping mechanism turns out to be more suitable for defense against Distributed Denial of Service (DDoS) attacks than the traditional k-means problem.
This thesis contributes to the streaming clustering problem in both centralized and distributed environments. We have designed a centralized clustering algorithm and shown, in an extensive evaluation, its ability to discover high-quality clusters in low time compared with other state-of-the-art solutions. We have also worked on a data structure that notably reduces the memory required while keeping the error of the computations under control at all times. Our work additionally provides two protocols for distributing the clustering computation, analysing two key aspects: the impact of distributed computation on clustering quality, and the conditions required to reduce processing time with respect to the centralized solution. Finally, we have developed a clustering-based framework for the detection of DDoS attacks, characterizing the types of attacks detected and evaluating the efficiency and effectiveness of mitigating the attack's impact.
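To make the sliding-window constraints concrete, here is a toy one-dimensional sketch (an illustration of the model's requirements, not the thesis's algorithm): expired points must be evicted as the window slides, and cluster summaries recomputed cheaply on each arrival:

```python
from collections import deque

class SlidingWindowClusterer:
    """Toy 1-D sliding-window clustering sketch (not the thesis algorithm).

    Only the W most recent points are kept; `deque(maxlen=W)` evicts the
    expired point automatically as the window slides.
    """
    def __init__(self, window_size, threshold):
        self.window = deque(maxlen=window_size)
        self.threshold = threshold

    def add(self, point):
        """Process one arriving data item and return current centroids."""
        self.window.append(point)
        return self.clusters()

    def clusters(self):
        # Single pass over the sorted window: a point joins the current
        # cluster if it is within `threshold` of the cluster's last
        # (largest) point; otherwise it opens a new cluster.
        groups = []
        for p in sorted(self.window):
            if groups and p - groups[-1][-1] <= self.threshold:
                groups[-1].append(p)
            else:
                groups.append([p])
        return [sum(g) / len(g) for g in groups]  # one centroid per cluster

swc = SlidingWindowClusterer(window_size=5, threshold=1.0)
for x in [1.0, 1.2, 9.0, 1.1, 9.3, 9.1]:
    centroids = swc.add(x)  # after all arrivals: two clusters remain
```

A real streaming algorithm would avoid the full re-sort per arrival; the point here is only that responses are continuous and expired data never influence the result.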
Abstract:
Automated and semi-automated accessibility evaluation tools are key to streamlining the process of accessibility assessment and, ultimately, to ensuring that software products, contents, and services meet accessibility requirements. Different evaluation tools may better fit different needs and concerns, accounting for a variety of corporate and external policies, content types, invocation methods, deployment contexts, exploitation models, intended audiences and goals, and the specific overall process into which they are introduced. This has led to the proliferation of many evaluation tools tailored to specific contexts. However, tool creators, who may not be familiar with the realm of accessibility and may be part of a larger project, lack any systematic guidance when facing the implementation of accessibility evaluation functionalities. Herein we present a systematic approach to the development of accessibility evaluation tools, leveraging the different artifacts and activities of a standardized development process model (the Unified Software Development Process) and providing templates of these artifacts tailored to accessibility evaluation tools. The work presented especially considers the work in progress in this area by the W3C/WAI Evaluation and Repair Tools Working Group (ERT WG).
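As a minimal example of the kind of evaluation functionality such tools implement (a hypothetical sketch, not an artifact of the proposed process model), the following check flags `<img>` elements that lack an `alt` attribute, one of the most common automated accessibility checks:

```python
from html.parser import HTMLParser

class ImgAltChecker(HTMLParser):
    """Flag <img> elements with no alt attribute (a WCAG 1.1.1-style check)."""
    def __init__(self):
        super().__init__()
        self.violations = []  # (line, column) of each offending tag

    def handle_starttag(self, tag, attrs):
        # attrs is a list of (name, value) pairs for the current tag
        if tag == "img" and "alt" not in dict(attrs):
            self.violations.append(self.getpos())

checker = ImgAltChecker()
checker.feed('<p><img src="a.png" alt="chart"><img src="b.png"></p>')
print(len(checker.violations))  # 1 -- only the second image lacks alt
```

A production tool would add many more checks and report against a policy such as WCAG, but each check follows this same detect-and-locate pattern.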
Abstract:
Objective: The main purpose of this research is the novel use of artificial metaplasticity on a multilayer perceptron (AMMLP) as a data mining tool for predicting the outcome of patients with acquired brain injury (ABI) after cognitive rehabilitation. The final goal is to increase knowledge in the field of rehabilitation theory based on cognitive affectation. Methods and materials: The data set used in this study contains records belonging to 123 ABI patients with moderate to severe cognitive affectation (according to the Glasgow Coma Scale) who underwent rehabilitation at Institut Guttmann Neurorehabilitation Hospital (IG) using the tele-rehabilitation platform PREVIRNEC©. The variables included in the analysis comprise the initial neuropsychological evaluation of the patient (cognitive affectation profile), the results of the rehabilitation tasks performed by the patient in PREVIRNEC©, and the outcome of the patient after 3-5 months of treatment. To achieve the treatment outcome prediction, we apply and compare three different data mining techniques: the AMMLP model, a backpropagation neural network (BPNN) and a C4.5 decision tree. Results: The prediction performance of the models was measured by ten-fold cross-validation, and several architectures were tested. The results obtained by the AMMLP model are clearly superior, with an average predictive performance of 91.56%; the BPNN and C4.5 models have average prediction accuracies of 80.18% and 89.91% respectively. The best single AMMLP model provided a specificity of 92.38%, a sensitivity of 91.76% and a prediction accuracy of 92.07%. Conclusions: The proposed prediction model makes it possible to increase knowledge about the contributing factors in an ABI patient's recovery and to estimate treatment efficacy in individual patients. The ability to predict treatment outcomes may provide new insights toward improving effectiveness and creating personalized therapeutic interventions based on clinical evidence.
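The ten-fold cross-validation protocol used to compare the models can be sketched as follows (an illustrative outline only; the AMMLP, BPNN and C4.5 classifiers are not reproduced here and are stood in for by a trivial majority-class baseline):

```python
import random

def k_fold_indices(n, k, seed=0):
    """Split indices 0..n-1 into k shuffled, near-equal folds."""
    idx = list(range(n))
    random.Random(seed).shuffle(idx)
    return [idx[i::k] for i in range(k)]

def cross_val_accuracy(X, y, train_fn, predict_fn, k=10):
    """Mean accuracy over k folds; the classifier under test is plugged
    in via train_fn/predict_fn."""
    scores = []
    for test_idx in k_fold_indices(len(X), k):
        test_set = set(test_idx)
        train_idx = [j for j in range(len(X)) if j not in test_set]
        model = train_fn([X[j] for j in train_idx], [y[j] for j in train_idx])
        correct = sum(predict_fn(model, X[j]) == y[j] for j in test_idx)
        scores.append(correct / len(test_idx))
    return sum(scores) / len(scores)

# Demo on synthetic labels with a majority-class baseline standing in
# for a real classifier such as AMMLP.
X = list(range(100))
y = [0] * 70 + [1] * 30
train = lambda Xs, ys: max(set(ys), key=ys.count)   # majority label
predict = lambda model, x: model                    # always predict it
acc = cross_val_accuracy(X, y, train, predict, k=10)
```

Each sample is used for testing exactly once, so the mean fold accuracy estimates the model's out-of-sample performance, which is how the 91.56%, 80.18% and 89.91% figures in the abstract were obtained.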
Abstract:
Competition in markets, the distribution of limited resources based on productivity and performance, and the efficient management of universities are changing the criteria of trust and legitimacy of the educational system in Peru. Universities are perceived more as institutions of the public sector, while the services they offer must instead contribute to the modernization of the emerging society and the knowledge economy. Higher education reforms, initiated in the 1980s, have been inspired by successful university organizations that have managed to change their governance, seeking to transform certain bureaucratic institutions into organizations capable of playing an active role in the global competition for resources and the best talent. Within this context, Peruvian universities face two major challenges: adapting themselves to new global perspectives, and developing a better response to society's demands, needs and expectations. This article proposes a model of a governance system for higher education in Peru that provides a comprehensive solution to these challenges, allowing the problems of universities to be addressed for their development and inclusion within global trends. For this purpose, a holistic and qualitative methodological approach was developed, using an integrated method that considers educational reality as a whole, understanding the facts, components and elements that affect its outcomes. It is proposed to define a policy for university education in Peru that permeates society, changing the planning model from a social reform model to a policy analysis model, in which the Peruvian State acts as the party solely responsible for responding to society's demands as its legal representative, complemented by external and independent bodies that define the basis of best practice, as is being done in many university models worldwide.
Abstract:
The dissertation is based on the importance of understanding organizational relationships for a segmented approach to publics in corporate communication. Starting from a theoretical reflection on the subject and from observation of current market practices, parameters were established that contribute to a more precise conceptualization of corporations' interlocutors, with a view to meeting their information demands. Both the analysis of the works consulted and the evaluation of the results of the survey of companies with a long tradition in the communication field showed that there are important gaps to be filled. Among them are the lack of mechanisms that can gauge more precisely the expectations of the various public segments regarding corporate communication, as a two-way street, as well as the lack of regular communication channels with certain groups, notably external ones. The analyses point to the adoption of a public-focused knowledge management system as a fundamental element for the effectiveness of communication processes. (AU)
Abstract:
As acceptance of the evidence-based practice in psychology (EBPP) model continues to grow (Pagoto, Spring, Coups, Mulvaney, Coutu, & Ozakinci, 2007), it seems pertinent to explore how this model can be applied in different settings. The topic is timely, as practitioners in the field are being held ever more accountable for the efficacy of the treatments they employ (Pagoto et al., 2007). Increased scrutiny has resulted in a need to integrate research into practice in order to ensure continued relevance in the ever-changing realm of American health care (Luebbe, Radcliffe, Callands, Green & Thorn, 2007; Collins, Leffingwell & Belar, 2007; Chwalisz, 2003). This paper explores how the requirements set forth by the American Psychological Association Presidential Task Force on Evidence-Based Practice (2006) can be implemented at the University of Denver's (DU) Professional Psychology Center (PPC), a training clinic for students enrolled in the Psy.D. program at DU's Graduate School of Professional Psychology (GSPP). In doing so, the methods employed by Collins et al. (2007) at Oklahoma State University (OSU) are used as a template and modified to accommodate differences between the two institutions.
Abstract:
The phenomenon of portfolio entrepreneurship has attracted considerable scholarly attention and is particularly relevant in the family firm context. However, there is a lack of knowledge of the process through which portfolio entrepreneurship develops in family firms. We address this gap by analyzing four in-depth, longitudinal family firm case studies from Europe and Latin America. Using a resource-based perspective, we identify six distinct resource categories that are relevant to the portfolio entrepreneurship process. Furthermore, we reveal that their importance varies across time. Our resulting resource-based process model of portfolio entrepreneurship in family firms makes valuable contributions to both theory and practice.