246 results for vendors


Relevance: 10.00%
Publisher:
Abstract:

Object-oriented design and object-oriented languages support the development of independent software components such as class libraries. When using such components, versioning becomes a key issue. While various ad-hoc techniques and coding idioms have been used to provide versioning, all of these techniques have deficiencies: ambiguity, the necessity of recompilation or re-coding, or the loss of binary compatibility of programs. Components from different software vendors are versioned at different times. Maintaining compatibility between versions must be consciously engineered. New technologies such as distributed objects further complicate libraries by requiring multiple implementations of a type simultaneously in a program. This paper describes a new C++ object model called the Shared Object Model for C++ users and a new implementation model called the Object Binary Interface for C++ implementors. These techniques provide a mechanism for allowing multiple implementations of an object in a program. Early analysis of this approach has shown it to have performance broadly comparable to conventional implementations.
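
The versioning problem described here can be illustrated with a small sketch. This is not the paper's Shared Object Model or Object Binary Interface; it merely shows, under assumed names (Shape, make_shape), how hiding an object's layout behind a stable abstract interface and a versioned factory lets two implementations of the same type coexist in one program without recompiling client code.

```cpp
// Hypothetical sketch: library versioning behind a stable interface.
// Not the paper's Object Binary Interface; illustration only.
#include <iostream>
#include <memory>

// Interface shipped with version 1 of the library. Clients compile against
// this declaration only and never see the implementation's layout.
class Shape {
public:
    virtual ~Shape() = default;
    virtual double area() const = 0;
};

// Versioned factory: the library can change or add implementations behind
// this entry point without breaking the binary compatibility of client code.
std::unique_ptr<Shape> make_shape(int version, double radius);

// --- Library side (could live in a separately versioned shared object) ---
namespace v1 {
class Circle : public Shape {
public:
    explicit Circle(double r) : r_(r) {}
    double area() const override { return 3.14159265358979 * r_ * r_; }
private:
    double r_;                    // version 1 layout
};
}  // namespace v1

namespace v2 {
class Circle : public Shape {
public:
    explicit Circle(double r) : r_(r), cached_(3.14159265358979 * r * r) {}
    double area() const override { return cached_; }
private:
    double r_;
    double cached_;               // version 2 adds a field; clients unaffected
};
}  // namespace v2

std::unique_ptr<Shape> make_shape(int version, double radius) {
    if (version >= 2) return std::make_unique<v2::Circle>(radius);
    return std::make_unique<v1::Circle>(radius);
}

int main() {
    // Both implementations of the same logical type coexist in one program.
    auto a = make_shape(1, 2.0);
    auto b = make_shape(2, 2.0);
    std::cout << a->area() << " " << b->area() << "\n";
}
```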

Relevance: 10.00%
Publisher:
Abstract:

X-ray computed tomography (CT) is a non-invasive medical imaging technique that generates cross-sectional images by acquiring attenuation-based projection measurements at multiple angles. Since its first introduction in the 1970s, substantial technical improvements have led to the expanding use of CT in clinical examinations. CT has become an indispensable imaging modality for the diagnosis of a wide array of diseases in both pediatric and adult populations [1, 2]. Currently, approximately 272 million CT examinations are performed annually worldwide, with nearly 85 million of these in the United States alone [3]. Although this trend has decelerated in recent years, CT usage is still expected to increase mainly due to advanced technologies such as multi-energy [4], photon counting [5], and cone-beam CT [6].

Despite the significant clinical benefits, concerns have been raised regarding the population-based radiation dose associated with CT examinations [7]. From 1980 to 2006, the effective dose from medical diagnostic procedures rose six-fold, with CT contributing to almost half of the total dose from medical exposure [8]. For each patient, the risk associated with a single CT examination is likely to be minimal. However, the relatively large population-based radiation level has led to enormous efforts among the community to manage and optimize the CT dose.

As promoted by the international campaigns Image Gently and Image Wisely, exposure to CT radiation should be appropriate and safe [9, 10]. It is thus a shared responsibility to optimize the radiation dose of CT examinations. The key to dose optimization is determining the minimum radiation dose that achieves the targeted image quality [11]. Based on this principle, dose optimization would benefit significantly from effective metrics that characterize the radiation dose and image quality of a CT exam. Moreover, if accurate predictions of radiation dose and image quality were possible before the exam, it would be feasible to personalize the exam by adjusting the scanning parameters to achieve a desired level of image quality. The purpose of this thesis is to design and validate models that prospectively quantify patient-specific radiation dose and task-based image quality. The dual aim of the study is to implement these theoretical models in clinical practice by developing an organ-based dose-monitoring system and image-based noise-addition software for protocol optimization.

More specifically, Chapter 3 aims to develop an organ dose-prediction method for CT examinations of the body under constant tube current conditions. The study modeled anatomical diversity and complexity using a large number of patient models with representative age, size, and gender distributions. The dependence of the organ dose coefficients on patient size and scanner model was further evaluated. Distinct from prior work, these studies use the largest number of patient models to date, spanning representative ranges of age, weight percentile, and body mass index (BMI).
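
The organ dose coefficients mentioned here are commonly expressed as organ dose normalized by CTDIvol and parameterized by patient size. The form below is a plausible sketch of that relationship under assumed symbols, not the exact model fitted in the chapter:

\[
  h_{\text{organ}}(d) \;=\; \frac{D_{\text{organ}}}{\mathrm{CTDI_{vol}}}
  \;\approx\; \alpha_{\text{organ}}\, e^{-\beta_{\text{organ}}\, d},
  \qquad
  D_{\text{organ}} \;\approx\; h_{\text{organ}}(d)\cdot \mathrm{CTDI_{vol}},
\]

where \(d\) is an effective patient diameter and \(\alpha_{\text{organ}}, \beta_{\text{organ}}\) are organ-specific fit parameters.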

With organ dose effectively quantified under constant tube current conditions, Chapter 4 aims to extend the organ dose-prediction system to tube current modulated (TCM) CT examinations. The prediction, applied to chest and abdominopelvic exams, was achieved by combining a convolution-based estimation technique that quantifies the radiation field, a TCM scheme that emulates modulation profiles from major CT vendors, and a library of computational phantoms with representative sizes, ages, and genders. The prospective quantification model was validated by comparing the predicted organ dose with dose estimates from Monte Carlo simulations in which the TCM function was explicitly modeled.
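
A rough sketch of how a convolution-based estimate can fold a TCM profile into an organ dose prediction (assumed notation; the chapter's actual kernel and normalization may differ): the modulated output along the table position \(z\) is treated as a locally varying CTDIvol and weighted by an organ-specific dose kernel,

\[
  D_{\text{organ}} \;\approx\; \sum_{z} h_{\text{organ}}(d, z)\,\cdot\,\mathrm{CTDI_{vol}}(z),
  \qquad
  \mathrm{CTDI_{vol}}(z) \;\propto\; \mathrm{mA}(z),
\]

so that regions of the scan where the tube current is reduced contribute proportionally less to the organ dose.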

Chapter 5 aims to implement the organ dose-estimation framework in clinical practice by developing an organ dose-monitoring program based on commercial software (Dose Watch, GE Healthcare, Waukesha, WI). The first phase of the study focused on body CT examinations, so the patient's major body landmarks were extracted from the scout image in order to match each clinical patient to a computational phantom in the library. The organ dose coefficients were estimated based on the CT protocol and patient size, as reported in Chapter 3. The exam CTDIvol, DLP, and TCM profiles were extracted and used to quantify the radiation field using the convolution technique proposed in Chapter 4.

With effective methods to predict and monitor organ dose in place, Chapter 6 aims to develop and validate improved measurement techniques for image quality assessment. Chapter 6 outlines the method developed to assess and predict quantum noise in clinical body CT images. Compared with previous phantom-based studies, this study accurately assessed the quantum noise in clinical images and further validated the correspondence between phantom-based measurements and the expected clinical image quality as a function of patient size and scanner attributes.

Chapter 7 aims to develop a practical strategy to generate hybrid CT images and assess the impact of dose reduction on diagnostic confidence for the diagnosis of acute pancreatitis. The general strategy is (1) to simulate synthetic CT images at multiple reduced-dose levels from clinical datasets using an image-based noise addition technique; (2) to develop quantitative and observer-based methods to validate the realism of simulated low-dose images; (3) to perform multi-reader observer studies on the low-dose image series to assess the impact of dose reduction on the diagnostic confidence for multiple diagnostic tasks; and (4) to determine the dose operating point for clinical CT examinations based on the minimum diagnostic performance to achieve protocol optimization.
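
Image-based noise addition of this kind usually relies on the approximately inverse-square-root relationship between quantum noise and dose. The snippet below is a simplified sketch of that idea (uncorrelated Gaussian noise, hypothetical function name), not the thesis's actual simulation tool, which would also account for the scanner's noise texture and reconstruction:

```cpp
// Simplified sketch of image-based noise addition to emulate a reduced-dose CT
// image. Assumes quantum noise scales as 1/sqrt(dose) and that the added noise
// is uncorrelated Gaussian, i.e. it ignores the noise power spectrum (texture).
#include <cmath>
#include <cstddef>
#include <random>
#include <vector>

std::vector<float> simulate_reduced_dose(const std::vector<float>& image_hu,
                                         float sigma_full,     // noise std at full dose (HU)
                                         float dose_fraction)  // e.g. 0.5 for half dose
{
    // Total noise at the reduced dose is sigma_full / sqrt(f); since variances
    // add, the noise that must be injected on top of the existing noise is
    // sigma_add = sigma_full * sqrt(1/f - 1).
    const float sigma_add = sigma_full * std::sqrt(1.0f / dose_fraction - 1.0f);

    std::mt19937 rng(12345);
    std::normal_distribution<float> noise(0.0f, sigma_add);

    std::vector<float> low_dose(image_hu.size());
    for (std::size_t i = 0; i < image_hu.size(); ++i)
        low_dose[i] = image_hu[i] + noise(rng);
    return low_dose;
}
```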

Chapter 8 concludes the thesis with a summary of accomplished work and a discussion about future research.

Relevance: 10.00%
Publisher:
Abstract:

Simulating the efficiency of business processes can reveal crucial bottlenecks for manufacturing companies and lead to significant optimizations, resulting in decreased time to market, more efficient resource utilization, and higher profit. While such business optimization software is widely used by larger companies, SMEs typically lack the expertise and resources to exploit these advantages efficiently. The aim of this work is to explore how simulation software vendors and consultancies can extend their portfolios to SMEs by providing business process optimization on a cloud computing platform. By executing simulation runs in the cloud, software vendors and associated business consultancies gain access to large computing power and data storage capacity on demand, can run large simulation scenarios on behalf of their clients, analyze the simulation results, and advise their clients on process optimization. The solution is mutually beneficial for both the vendor/consultant and the end-user SME. End-user companies pay only for the service, avoiding large upfront costs for software licenses and expensive hardware. Software vendors can extend their business towards the SME market with potentially large benefits.

Relevance: 10.00%
Publisher:
Abstract:

Algae biodiesel is a promising but expensive alternative to petro-diesel. To overcome cost barriers, detailed cost analyses are needed. A decade-old cost analysis by the U.S. National Renewable Energy Laboratory indicated that the cost of algae biodiesel was in the range of $0.53–0.85/L (2012 USD values). However, the costs of land and transesterification were only roughly estimated. In this study, an updated comprehensive techno-economic analysis was conducted with optimized processes and improved cost estimations. The latest process improvements, vendor quotes, government databases, and other relevant data sources were used to calculate the updated algal biodiesel costs, and the final biodiesel costs are in the range of $0.42–0.97/L. Additional improvements for cost-effective biodiesel production, including cultivating algae at suitable locations around the globe, were also recommended. Overall, the calculated costs seem promising, suggesting that a single-step biodiesel production process is close to commercial reality.

Relevance: 10.00%
Publisher:
Abstract:

Some authors have shown the need to understand the technological structuring process in contemporary firms. From this perspective, the software industry is a very important element because it provides products and services directly to organizations in many fields. The Brazilian software industry, in particular, has peculiarities that distinguish it from its counterparts in developed countries, which makes understanding it even more relevant. There is evidence that local firms adopt different strategies and structural configurations to enter a market naturally dominated by large multinational firms. Therefore, this study aims to understand not only the structural configurations assumed by domestic firms but also the dynamics and processes that lead to these different configurations. To do so, this PhD dissertation investigates the institutional environment, its entities, and the isomorphic movements by employing an exploratory, descriptive, and explanatory multiple-case study. Eight software development companies from Recife's information technology cluster were visited; a questionnaire was applied, and an interview was conducted with one of each firm's key professionals. Although the study is predominantly qualitative, part of the data was analyzed through charts and graphs, providing an overview of the companies and their environment that supported the interpretation of the interviews. As a result, it was found that companies are structured around hybrid business models derived from two ideal types of software development company: the software factory and the technology-based company. Regarding the development process, there is a balanced distribution between the traditional and agile development paradigms. Among the traditional methodologies, the Rational Unified Process (RUP) is predominant, while Scrum is the most widely used methodology among organizations following the Agile Manifesto's principles. Regarding the structuring process, each institutional entity acts in ways that generate different isomorphic pressures. Emphasis was given to entities such as customers, research agencies, clusters, market-leading businesses, public universities, incubators, software industry organizations, technology vendors, development tool suppliers, and the managers' education and background, because these relate closely to the software firms. In this relationship, a dual and bilateral influence was found. Finally, the structuring level of the organizational field was also identified as low, which gives organizational actors room to act independently.

Relevance: 10.00%
Publisher:
Abstract:

This research falls within the field of organizational studies, focusing on organizational purchasing behavior and, specifically, on interorganizational trust in purchasing. The topic is current and relevant because good buyer-supplier relations increase the exchange of information, lengthen relationships, reduce hierarchical controls, and improve performance. Furthermore, although there is a vast literature on trust, the scientific work dealing specifically with interorganizational trust still requires further research to synthesize and validate the variables that generate this phenomenon. In this sense, this investigation seeks to explain the antecedents of interorganizational trust through the relationships among operational performance, organizational characteristics, shared values, and interpersonal relationships in purchasing by manufacturing industries, in order to develop a more robust and consensual literature that encompasses the current sociological and economic perspectives and considers the effect of interpersonal relationships on this phenomenon. This proposal offers a new view of the antecedents of interorganizational trust, drawing on the quantitative models of Morgan and Hunt (1994), Doney and Cannon (1997), Zhao and Cavusgil (2006), and Nyaga, Whipple, and Lynch (2011), as well as on the qualitative analysis of Tacconi et al. (2011). With regard to methodological aspects, the study takes the form of a descriptive, causal survey, both theoretical and empirical. Explanatory in nature, the investigation adopted a quantitative approach using exploratory factor analysis and structural equation modeling (SEM), carried out with the IBM SPSS Amos 18.0 software, the maximum likelihood method, and bootstrapping. The unit of analysis was the buyer-supplier relationship, in which the object under investigation was the supplier organization as viewed by the purchasing company. A total of 237 valid questionnaires were collected from key informants, using simple random sampling of manufacturing industries (SIC 10-33) located in the city and region of Natal. The first results of the descriptive analysis demonstrate the phenomenon of interorganizational trust, in which purchasing firms believe in and feel secure about the supplier. Trust showed high intensity, predominantly toward vendors that supply the company with materials used directly in the production process. The exploratory and confirmatory factor analyses, performed on each variable separately, generated a more consistent set of observable and latent variables, giving rise to a model that needed to be respecified. The respecified model consists of positive paths, shows good fit, satisfactory composite reliability and variance extracted, and demonstrates convergent and discriminant validity, with significant factor loadings and strong explanatory power. Given that the findings reinforce the respecified model, suggesting a high probability that this model is better suited to the study population, the results support the explanation that interorganizational trust in purchasing depends directly on interpersonal relationships, shared values, and operational performance, and indirectly on personal relationships, social networks, organizational characteristics, and the physical and relational aspects of performance. It is concluded that this trust can be explained by a set of interactions among these three determinants, with interpersonal relationships standing out as having the largest path coefficient for the factor under study.

Relevance: 10.00%
Publisher:
Abstract:

Introduction

The world is changing! It is volatile, uncertain, complex, and ambiguous. As cliché as it may sound, the evidence of such dynamism in the external environment is growing. Business-as-usual is more the exception than the norm. Organizational change is the rule, be it to accommodate and adapt to change or to instigate and lead it. A constantly changing environment is a situation that all organizations have to live with. What, however, makes some organizations able to thrive better than others? Many scholars and practitioners believe that this is due to the ability to learn. Therefore, this book on developing Learning and Development (L&D) professionals is timely, as it explores and discusses trends and practices that impact organizations, the workforce, and L&D professionals. Being able to learn and develop effectively is the cornerstone of motivation, as it helps to address people's need to be competent and autonomous (Deci & Ryan, 2002; Loon & Casimir, 2008; Ryan & Deci, 2000). L&D stimulates and empowers people to perform. Organizations that are better at learning at all levels (individual, group, and organizational) will always have a better chance of surviving and performing. Given the new reality of a dynamic external environment and constant change, L&D professionals now play an even more important role in their organizations than ever before. However, L&D professionals themselves are not immune to the turbulent changes, as their practices are also affected. The challenges that L&D professionals face are therefore two-pronged: first, helping and supporting their organization and its workforce in adapting to change, and second, developing themselves effectively and efficiently so that they stay one step ahead of the workforce they are meant to help develop. These challenges are recognised by the CIPD, which recently launched a new L&D qualification that has served as an inspiration for this book. L&D plays a crucial role at both strategic (e.g. organizational capability) and operational (e.g. delivery of training) levels. L&D professionals have moved from being reactive (e.g. following up actions after performance appraisals) to being more proactive (e.g. shaping capability). L&D is increasingly viewed as a driver of organizational performance. The CIPD (2014) suggest that L&D is increasingly expected to take not only more responsibility but also more accountability for building both individual and organizational knowledge and capability, and to nurture an organizational culture that prizes learning and development. This book is for L&D professionals. Nonetheless, it is also suited to those studying Human Resource Development (HRD) at intermediate level. The term 'Human Resource Development' (HRD) is more common in academia and is largely synonymous with L&D (Stewart & Sambrook, 2012). Stewart (1998) defined HRD as 'the practice of HRD is constituted by the deliberate, purposive and active interventions in the natural learning process. Such interventions can take many forms, most capable of categorising as education or training or development' (p. 9). In fact, many parts of this book (e.g. Chapters 5 and 7) are appropriate for anyone who is involved in training and development. This may include a variety of individuals within the L&D community, such as line managers, professional trainers, training solutions vendors, instructional designers, external consultants, and mentors (Mayo, 2004).

The CIPD (2014) goes further, arguing that L&D is broad in scope and plays a significant role in Organizational Development (OD) and Talent Management (TM), as well as in Human Resource Management (HRM) in general. OD, TM, HRM, and L&D are symbiotic in enabling the 'people management function' to provide organizations with the capabilities that they need.

Relevance: 10.00%
Publisher:
Abstract:

The State of Iowa is conducting an assessment of Information Technology (IT) in the Executive Branch. The purpose of this assessment is to gather data on costs, applications, systems, utilization, operations, hardware assets, administration, and activities associated with the provision of IT services. To accomplish this, two leading technology vendors conducted an intensive assessment. These vendors, Integrated System Solutions Corporation (ISSC) and Electronic Data Systems (EDS), analyzed extensive data provided by the various agencies and conducted on-site interviews during the week of November 13, 1995. Additionally, in the first week of December, the American Federation of State, County, and Municipal Employees (AFSCME) Iowa Council 61 sponsored an assessment. These assessments are included as appendices B, C, and D to this report.

Relevance: 10.00%
Publisher:
Abstract:

Steam turbines play a significant role in global power generation. Research on low pressure (LP) steam turbine stages is of special importance for steam turbine manufacturers, vendors, power plant owners, and the scientific community, because these stages have lower efficiency than the high pressure stages. Because of condensation, the last stages of an LP turbine experience irreversible thermodynamic losses, aerodynamic losses, and erosion of the turbine blades. Moisture generation also means that an LP steam turbine requires maintenance, which in turn affects turbine reliability. Therefore, the design of energy-efficient LP steam turbines requires a comprehensive analysis of the condensation phenomena and the corresponding losses occurring in the turbine, either by experiments or by numerical simulation. The aim of the present work is to apply computational fluid dynamics (CFD) to enhance the existing knowledge and understanding of condensing steam flows and of the loss mechanisms that arise from the irreversible heat and mass transfer during condensation in an LP steam turbine.

Throughout this work, two commercial CFD codes were used to model non-equilibrium condensing steam flows. An Eulerian-Eulerian approach was adopted in which the mixture of vapour and liquid phases was solved with the Reynolds-averaged Navier-Stokes equations. The nucleation process was modelled with classical nucleation theory, and two different droplet growth models were used to predict the droplet growth rate. Flow turbulence was modelled with the standard k-ε and the shear stress transport k-ω turbulence models; both models were further modified and implemented in the CFD codes. The thermodynamic properties of the vapour and liquid phases were evaluated with real gas models.

In this thesis, the influence of real gas properties, turbulence modelling, unsteadiness, and the blade trailing edge shape on wet-steam flows is studied with different convergent-divergent nozzles, a turbine stator cascade, and a 3D turbine stator-rotor stage. The simulated results were evaluated and discussed together with the experimental data available in the literature. The grid independence study revealed that an adequate grid size is required to capture the correct trends of condensation phenomena in LP turbine flows. The study shows that accurate real gas properties are important for the precise modelling of non-equilibrium condensing steam flows. The flow expansion, and consequently the rate of formation of liquid droplet nuclei and their growth, were affected by the turbulence modelling, and the losses were rather sensitive to it as well. Based on the presented results, the correct computational prediction of wet-steam flows in the LP turbine requires the turbulence to be modelled accurately. The trailing edge shape of the LP turbine blades influenced the liquid droplet formation, distribution, and sizes, as well as the loss generation: the semicircular trailing edge shape predicted the smallest droplet sizes, while the square trailing edge shape estimated greater losses. The analysis of steady and unsteady wet-steam flow calculations showed that, in unsteady simulations, the interaction of wakes in the rotor blade row affected the flow field, and the flow unsteadiness influenced the nucleation and droplet growth processes through fluctuation of the Wilson point.
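
For reference, the classical nucleation theory referred to above is usually written along the following lines (one common formulation with a non-isothermal correction; the exact coefficients used in the thesis may differ):

\[
  J \;=\; \frac{q_c}{1+\phi}\,\frac{\rho_v^{2}}{\rho_l}\,
  \sqrt{\frac{2\sigma}{\pi m^{3}}}\;
  \exp\!\left(-\frac{4\pi r^{*2}\sigma}{3 k_B T}\right),
  \qquad
  r^{*} \;=\; \frac{2\sigma}{\rho_l R_v T \ln S},
\]

where \(J\) is the nucleation rate, \(q_c\) a condensation coefficient, \(\phi\) the non-isothermal correction factor, \(\rho_v\) and \(\rho_l\) the vapour and liquid densities, \(\sigma\) the surface tension, \(m\) the mass of a water molecule, \(r^{*}\) the critical droplet radius, \(R_v\) the specific gas constant of the vapour, and \(S\) the supersaturation ratio at temperature \(T\).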

Relevance: 10.00%
Publisher:
Abstract:

This manual explains the WIC Program vendor responsibilities. Topics covered are: competitive pricing and peer groups, definitions for WIC vendors, DHEC regional map, vendor price surveys, how to become a South Carolina WIC vendor, transacting WIC checks, depositing WIC checks, vendor monitoring and administrative review procedures.

Relevance: 10.00%
Publisher:
Abstract:

In database applications, access control security layers are mostly developed with tools provided by vendors of database management systems and deployed in the same servers that contain the data to be protected. This solution has several drawbacks. Among them we emphasize: (1) if policies are complex, their enforcement can lead to performance decay of the database servers; (2) when modifications to the established policies imply modifications to the business logic (usually deployed at the client side), there is no alternative but to modify the business logic in advance; and (3) malicious users can systematically issue CRUD expressions against the DBMS in an attempt to identify security gaps. To overcome these drawbacks, in this paper we propose an access control stack characterized by the following: most of the mechanisms are deployed at the client side; whenever security policies evolve, the security mechanisms are automatically updated at runtime; and client-side applications do not handle CRUD expressions directly. We also present an implementation of the proposed stack to prove its feasibility. This paper thus presents a new approach to enforcing access control in database applications and aims to contribute positively to the state of the art in the field.
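
The central idea, that client applications never handle raw CRUD expressions and that policy changes propagate to the security mechanisms at runtime, can be sketched roughly as follows. All types and method names here are hypothetical and given in C++ purely for illustration; the paper's actual stack and API differ.

```cpp
// Rough sketch of a client-side access control layer: the application calls
// typed, policy-checked operations and never builds CRUD/SQL strings itself.
// All names are hypothetical; illustration only.
#include <stdexcept>

struct Policy {                      // fetched from a policy service at runtime
    bool may_read_salary   = false;
    bool may_update_salary = false;
};

class EmployeeGateway {
public:
    explicit EmployeeGateway(Policy p) : policy_(p) {}

    // Runtime policy refresh: when security policies evolve, the client-side
    // mechanism is updated without touching the business logic.
    void refresh_policy(Policy p) { policy_ = p; }

    // The application invokes typed operations; the CRUD expression is built
    // and parameterized inside the gateway, so the caller cannot issue it
    // directly against the DBMS.
    double read_salary(int employee_id) const {
        if (!policy_.may_read_salary) throw std::runtime_error("access denied");
        return run_select(employee_id);   // encapsulated, parameterized SELECT
    }

private:
    double run_select(int /*employee_id*/) const {
        // Placeholder for the parameterized statement sent to the DBMS.
        return 0.0;
    }
    Policy policy_;
};
```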

Relevance: 10.00%
Publisher:
Abstract:

The Digital Conversion and Media Reformatting plan was written in 2012 and revised in 2013-2014 as a five-year plan for the newly established department at the University of Maryland Libraries, under the Digital Systems and Stewardship Division. The plan focuses on increasing digitization production, both in-house and through vendors, and creates a model for managing this production.