Information Source: http://www.npd-solutions.com/glossary.html
ABC | see Activity Based Costing |
ABM | see Activity Based Management |
Acceptance Criteria | The criteria a product must meet to successfully complete a test phase or meet delivery requirements. |
Acceptance Testing | Formal testing conducted to enable a user, customer, or other authorized entity to determine whether to accept a product or product component. |
ACIS | A solid modeling engine or kernel used in a number of CAD systems. Having a common solids modeling engine allows more ready interchange of data between different CAD systems. |
ACIS SAT | A file format for 3D solid geometry created by systems using the ACIS solids modeling engine. |
Active Listening | A technique used to help communication among team members and project personnel. Active listening involves paying careful attention to what is being said, then rephrasing that information and feeding it back to the originator to ensure that what you think you heard is what they meant. |
Activity Based Costing | Activity Based Costing is a costing and analysis method that associates resources and their costs with activities and then assigns the costs of activities to cost objects (e.g., a product) based on cost drivers that measure the use of an activity by the cost object. These cost drivers, such as the number of persons performing work or the number of setups required per product, reflect the consumption of activities by the products. |
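A minimal Python sketch of the two-stage allocation described above; the activities, cost drivers, products, and figures are assumed purely for illustration:

    # Hypothetical two-stage Activity Based Costing: resource costs are pooled
    # by activity, then assigned to products in proportion to each product's
    # consumption of the activity's cost driver.
    activities = {
        "machine_setup": {"cost": 60_000, "driver": "setups"},
        "assembly":      {"cost": 90_000, "driver": "labor_hours"},
    }
    products = {
        "ProductA": {"setups": 20, "labor_hours": 1_500},
        "ProductB": {"setups": 80, "labor_hours": 1_500},
    }

    def abc_allocation(activities, products):
        """Allocate each activity's cost to products by cost-driver share."""
        allocated = {name: 0.0 for name in products}
        for activity in activities.values():
            driver = activity["driver"]
            total_driver_units = sum(p[driver] for p in products.values())
            rate = activity["cost"] / total_driver_units   # cost per driver unit
            for name, usage in products.items():
                allocated[name] += rate * usage[driver]
        return allocated

    print(abc_allocation(activities, products))
    # ProductB requires four times as many setups as ProductA, so it absorbs
    # the bulk of the setup cost pool.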
Activity Based Management | A discipline that focuses on the management of activities as a route to improving the value received by the customer and the profit received by providing this value. This discipline includes cost driver analysis, activity analysis, and performance measurement. |
Additive Fabrication | Fabrication processes which add material using a variety of processes to create a final part or item geometry with a minimum of secondary processes required. The technologies involved have previously been associated with rapid prototyping, but when these technologies are able to directly produce parts for products, additive fabrication is a better description. Other related terms include layered manufacturing, digital fabrication, and direct digital manufacturing. See Direct Digital Manufacturing. |
Adjacent Sector Innovation | A focused surveillance and innovation effort to explore how the organization’s technologies and products can be adapted and/or applied to adjacent market sectors or how concepts for products in adjacent sectors can be applied to a company’s products in its main sector. |
Advanced Product Quality Planning | Advanced Product Quality Planning (APQP) is a segment of the QS-9000 process developed by the AIAG and used by the auto industry. It provides a quality framework for understanding customer needs and determining all the product and process design actions needed to assure that the production process can deliver products that satisfy those needs. This framework includes tools such as quality function deployment to understand customer needs and translate them into product and process characteristics, product and process design validation and verification, failure modes and effects analysis to counter or control potential failure, and control plans to ensure critical product characteristics are achieved. |
Advance Quality Planning | An assessment at the start of product development to identify problems with other similar products so that preventative steps or countermeasures can be taken with the new product. Also referred to as a like product and process review. |
Affinity Diagram | Affinity diagrams or charts are a simple way for a group to cluster qualitative data and come up with a consensus view on a subject. It is often used with QFD to sort and organize the large amount of customer needs data. In this instance, statements of customer needs are written on cards or post-its. The cards or post-its are logically organized by the group and the group develops headings under which to cluster these needs. The cards or post-its are moved to the appropriate group headings. |
AFD | See Anticipatory Failure Determination |
Affordability | The characteristic of a product with a selling price that is no more than its functional worth to a customer and is within the customer’s ability to pay. |
AHDL | Analog Hardware Description Language (IEEE standard 1076.1) – describes the physical design, electronic behavior, logical structure and system annotation information for analog circuits. |
AHP | See Analytical Hierarchy Process |
AI | Artificial Intelligence |
AIAG | Automotive Industry Action Group consists of the Big 3 auto manufacturers who have collaborated to develop standards such as QS-9000 and APQP (see Advanced Product Quality Planning). |
AIM | Application Interpreted Model (STEP) – The model that describes the interpretation of the STEP integrated resources constructs that provide functional equivalence to the AP’s information requirements as specified in the application reference model. Required information documentation for the AIM includes the description of the entities of that information model and a summary of the rationale with which the resulting schema was derived from the application reference model. |
Allocated Requirements | Allocated requirements are requirements that apportion all or part of the performance and functionality of a higher-level requirement on a lower-level element of a system. |
Analysis of Variance | Analysis of Variance is a basic statistical technique for analyzing experimental data. It subdivides the total variation of a data set into meaningful component parts associated with specific sources of variation in order to test a hypothesis on the parameters of a model or to estimate variance components. |
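A minimal one-way ANOVA sketch using SciPy; the three sample groups and their values are made up for illustration:

    # One-way ANOVA on made-up data: does the factor level (e.g., three
    # machine settings) have a statistically significant effect on the
    # measured response?
    from scipy import stats

    setting_1 = [9.8, 10.1, 10.0, 9.9, 10.2]
    setting_2 = [10.4, 10.6, 10.3, 10.5, 10.7]
    setting_3 = [9.7, 9.9, 9.8, 10.0, 9.6]

    # f_oneway partitions total variation into between-group and within-group
    # components and tests the hypothesis that all group means are equal.
    f_stat, p_value = stats.f_oneway(setting_1, setting_2, setting_3)
    print(f"F = {f_stat:.2f}, p = {p_value:.4f}")
    # A small p-value (e.g., below 0.05) suggests the settings differ in mean response.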
ALT | Accelerated Life Testing |
Analytical Hierarchy Process | A decision making tool for complex, multi-criteria problems where both qualitative and quantitative aspects of a problem need to be incorporated. AHP clusters the decision elements according to their common characteristics into a hierarchical structure similar to a family tree. It involves building a hierarchy (Ranking) of decision elements and then making comparisons between each possible pair in each cluster (as a matrix). This gives a weighting for each element within a cluster (or level of the hierarchy) and also a consistency ratio (useful for checking the consistency of the data). By reducing complex decisions to a series of simple comparisons and rankings, then synthesizing the results, AHP helps arrive at the best decision and also provides a clear rationale for the choice made. The Analytical Hierarchy Process Model was designed by TL Saaty as a decision making aid. |
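A minimal sketch of one AHP step, deriving priority weights and a consistency ratio from a hypothetical 3x3 pairwise-comparison matrix (the matrix entries and Saaty's random index of 0.58 for n = 3 are assumptions for illustration):

    import numpy as np

    # Hypothetical pairwise comparisons for three criteria: entry [i, j] states
    # how much more important criterion i is than criterion j.
    A = np.array([
        [1.0, 3.0, 5.0],
        [1/3, 1.0, 2.0],
        [1/5, 1/2, 1.0],
    ])

    # The principal eigenvector of the comparison matrix gives the weights.
    eigvals, eigvecs = np.linalg.eig(A)
    k = np.argmax(eigvals.real)
    weights = np.abs(eigvecs[:, k].real)
    weights /= weights.sum()

    # Consistency check: CI = (lambda_max - n) / (n - 1), CR = CI / RI.
    n = A.shape[0]
    lambda_max = eigvals.real[k]
    ci = (lambda_max - n) / (n - 1)
    cr = ci / 0.58          # 0.58 = Saaty's random index for a 3x3 matrix

    print("weights:", np.round(weights, 3), " consistency ratio:", round(cr, 3))
    # A consistency ratio below roughly 0.10 is conventionally considered acceptable.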
ANOVA | see Analysis of Variance |
ANSI | American National Standards Institute |
Anticipatory Failure Determination | Anticipatory Failure Determination (AFD) is a failure analysis method. Like FMEA, it has the objective of identifying and mitigating failures. Rather than asking developers to look for a cause of a failure mode, it reverses the problem by asking developers to view the failure of interest as the intended consequence and try to devise ways to assure that the failure always happens reliably. This viewpoint then facilitates better identifying steps to avoid the failure. |
AOI | Automated Optical Inspection |
AP | Application Protocols (STEP) – These specify implementable STEP data constructs for communicating information in a defined application context. It defines the context for the use of product data and specifies the use of the base standard in that context to satisfy an industrial need. AP’s are Parts in the 200 series of the STEP standard. |
API | 1. Application Protocol Interface 2. Application Programming Interface – the standard set of functions provided by a program or operating system to allow for integration of other software. Two programs linked via an API can both be altered and still work together so long as both conform to the API. |
Apportionment | The assignment of goals such as reliability from system to subsystem in such a way that the whole system will meet the required goal. |
APQP | See Advanced Product Quality Planning |
AQP | See Advance Quality Planning |
Architecture | The design and interconnection of the main components of a hardware/software system. The framework and interrelationships of elements of a system. |
Architectural Principles | Architectural Principles are statements of preferred architectural direction or practice. Each principle should be stated in such a way that one will know if the architecture has the characteristics expressed by the principle. Principles need to be rationalized, stating why the principle is preferred. |
ARIZ | Russian acronym for Algorithm of Inventive Problem Solving (see Theory of Inventive Problem Solving) |
ARM | Application Reference Model (STEP) – An information model that formally describes the information requirements and constraints for an application area. The information model uses application-specific terminology and rules familiar to an expert from the application area. The model is independent of any physical implementation and must be validated by experts from the application area. |
ARO | After Receipt of Order – usually a measure of the days, weeks or months until a product can be designed and delivered. |
AS9100 | AS9100 is an international quality management standard for the aerospace industry published by the Society of Automotive Engineers; also published by other organizations worldwide, as EN9100 in Europe and JIS Q 9100 in Japan. The standard is controlled by the International Aerospace Quality Group. |
ASIC | Application Specific Integrated Circuit – a semi-custom chip used in a specific application that is designed by integrating standard cells from a library. |
ASME | American Society of Mechanical Engineers |
ASQC | American Society for Quality Control |
Assignable Cause | Assignable Cause is a source of variation which is not due to chance and, therefore, can be identified and eliminated. An assignable cause is often signaled by an excessive number of data points outside a control limit and/or a non-random pattern within the control limits. Also called “special cause”. |
Associativity | A link between two different functions in a CAD system that assures that a change made in one area is reflected in all other areas. For example, a change to a solid model will be reflected in its drawing and related CAM program. Bi-directional associativity indicates that updates happen in both directions between functions. For example, a change to a drawing will be reflected in its solids model. |
ASSP | Application Specific Standard Part – a chip that is originally designed as an ASIC and is later released for general use. |
ASTM | American Society for Testing and Materials |
Asynchronous Groupware | Asynchronous Groupware is software used to help people to work in groups, but not requiring those people to be working together at the same time (asynchronous = not coordinating at a single point in time). |
ATE | see Automated Test Equipment |
ATP | Acceptance Test Procedure |
ATPG | see Automatic Test Pattern Generation |
ATS | Acceptance Test Specification |
Automatic Test Equipment / Automated Test Equipment | Automated Test Equipment (ATE) built to perform a test or sequence of tests. ATE ranges from simple devices to verify mechanical or electrical continuity to sophisticated computerized systems with automatic sequencing, data processing, and readout. ATE may be stand alone test units or may be built into the operational equipment. |
Automatic Test Pattern Generation | Automatic Test Pattern Generation is the process that utilizes lists of faults and a model of the circuit to analyze the logical and topological nature of the circuit in order to create test vectors for each fault and, thereby, produce a high-fault-coverage test pattern for a design. |
Availability | The product metric that defines the percentage of time that a product is available and operational for customer use. It is the proportion of total time that an item of equipment is capable of performing its specified functions, normally expressed as a percentage. It can be calculated by dividing the equipment available hours by the total number of hours in any given period. |
Axiomatic Design | Axiomatic Design recognizes four domains. The needs of the customer are identified in the customer domain and are stated in the form of required functionality of a product in the functional domain. Design parameters that satisfy the functional requirements are defined in the physical domain, and, in the process domain, manufacturing variables define how the product will be produced. Solution alternatives are created by mapping the requirements specified in one domain to a set of characteristic parameters in an adjacent domain. The mapping between the customer and functional domains is defined as concept design; the mapping between functional and physical domains is product design; the mapping between the physical and process domains corresponds to process design. The output of each domain evolves from abstract concepts to detailed information in a top-down or hierarchical manner. Two design axioms provide a rational basis for evaluation of proposed solution alternatives and the subsequent selection of the best alternative. The first axiom is the independence axiom, and it states that a good design maintains the independence of the functional requirements. The second axiom is the information axiom and it establishes information content as a relative measure for evaluating and comparing alternative solutions that satisfy the independence axiom. |
Balanced Scorecard | A comprehensive performance measurement technique that considers four areas of performance in a balanced way: 1) customer perspective – how customers see us, 2) internal perspective – what we must excel at, 3) innovation & learning – how we continue to improve and create value, 4) financial perspective – how we meet shareholder needs. |
Bath-Tub Curve | Bath-Tub Curve represents the failure rate of components over the life of the product. Its elevated failure rate at the beginning and end of the curve suggests that most components fail either early in the product life (infant mortality) or toward the end of the expected product life, with a lower, relatively constant failure rate in between. |
BCL | Binary Cutter Language |
BEAR | see Break Even After Release |
Behavioral Modeling | Behavioral Modeling defines a product in terms of required behaviors rather than relationships between geometry elements for mechanical products or relationships between components, gates and registers for electronic products. For example, the use of a hardware description language (see Hardware Description Language) is a means for describing a behavioral model for an electronic product. |
Belief Map | The Belief Map is a method of graphically representing relative levels of knowledge and confidence. It is a plot whose horizontal axis represents the evaluator’s knowledge of, for example, a concept alternative’s ability to meet specified requirements. The vertical axis would represent the evaluator’s confidence in the concept’s ability to meet those requirements. |
Benchmarking | An improvement process in which a company measures the performance of its products or processes against that of best-in-class products or companies, determines how the product or company achieved their performance level, and uses the information to improve its own performance. |
Best Practice | Best Practice is a superior method or innovative practice that contributes to the improved performance of an organization, usually recognized as “best” by other peer organizations. |
BET | see Breakeven Time |
Beta Testing | Beta Testing is the testing of a nearly finished version of a piece of software or hardware, with the goal of finding defects missed by the developers. Often beta testing is carried out by people outside of the developer’s organization. |
Bezier curve | Polynomial used to describe complex curves and surfaces. |
BGA | Ball Grid Array – an electronic packaging technology in which solder balls are mounted to the underside of the package in a grid arrangement and are reflowed for attachment to PCBs. |
Bill of Material | A Bill of Material (BOM) is a hierarchical list of subassemblies, components and/or raw materials that make up a higher-level component, assembly, product or system. An engineering BOM represents the assembly structure implied by the parts lists on drawings and drawing tree structure. A manufacturing BOM represents the assembly build-up the way a product is manufactured. |
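A minimal sketch of a multi-level BOM represented as a nested structure, with a recursive roll-up of total component quantities; the parts and quantities are hypothetical:

    # Hypothetical multi-level BOM: each parent lists (child part, quantity per parent).
    bom = {
        "bicycle":   [("frame", 1), ("wheel_asm", 2)],
        "wheel_asm": [("rim", 1), ("spoke", 36), ("hub", 1)],
    }

    def rollup(part, qty, totals):
        """Recursively explode the BOM and accumulate total quantity per part."""
        for child, per_parent in bom.get(part, []):
            child_qty = qty * per_parent
            totals[child] = totals.get(child, 0) + child_qty
            rollup(child, child_qty, totals)
        return totals

    print(rollup("bicycle", 1, {}))
    # {'frame': 1, 'wheel_asm': 2, 'rim': 2, 'spoke': 72, 'hub': 2}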
BIST | see Built-in-Self-Test |
BIT | Built-in Test. See Built-in-Self-Test |
Block Diagram | Block Diagram is a diagram that shows the operation, interrelationships and interdependencies of components in a system. Boxes or blocks represent the components; connecting lines between the blocks represent interfaces. |
BMP | Best Manufacturing Practices |
BOE | Basis of Estimate |
BOM | see Bill of Material |
Boundary Scan | A design for testability method that places a scan register at every pin of every chip on a board for board testing and diagnostics. The test process can control and observe the state of every I/O pin without requiring physical access to any of them. |
B&P | Bid and Proposal |
BPI | Business Process Improvement |
BPR | see Business Process Reengineering |
Brainstorming | A creativity technique in which a group of people think of ideas related to a particular topic, listing as many possible ideas as possible before any critical evaluation of the ideas is performed. |
Break Even After Release | Break Even After Release – a metric that measures the time after release of a product for production or sale until the product has achieved financial breakeven considering the investment in development and other non-recurring expenses. |
Break Even Time | Break Even Time – a metric that measures the time from the start of development through production and sales until the product has achieved financial breakeven considering the investment in development and other non-recurring expenses. |
B-Rep | Boundary Representation – solids modeling approach based on representing exterior surfaces that define a solid (as opposed to constructive solid geometry). |
B-spline | A mathematical interpolation method for describing complex curves and surfaces |
Built-in-Self-Test | Built-in-Self-Test – a feature of automatic testing where many test pattern programs are built directly into the circuit generally for go/no-go testing of the assembly or circuit using signature analysis. |
Business Case | Business Case refers to the results of market, technical and financial analyses used to justify the feasibility of a new product. Ideally defined just prior to the “go to development” decision (gate), the case defines the product and project, including the project justification and the action or business plan. |
Business Process Reengineering | Business Process Reengineering (BPR) is the analysis and redesign of workflow within and between enterprises. Authors Michael Hammer and James Champy promoted the idea of BPR as the radical redesign and reorganization of an enterprise to lower costs and increase quality of service. They suggested seven principles of reengineering to streamline the work process and thereby achieve significant levels of improvement in quality, time management, and cost: 1) organize around outcomes, not tasks; 2) identify all the processes in an organization and prioritize them in order of redesign urgency; 3) integrate information processing work into the real work that produces the information; 4) treat geographically dispersed resources as though they were centralized; 5) link parallel activities in the workflow instead of just integrating their results; 6) put the decision point where the work is performed, and build control into the process; and 7) capture information once and at the source. |
CAD | see Computer-Aided Design |
CAD Framework Initiative | CAD Framework Initiative – a standard to facilitate integration of electronic design automation (EDA) tools. This allows an organization to select “best of class” tools without worrying about integration issues. The CFI standards cover the Design Representation Programming Interface, the Intertool Communication Programming Interface, the Tools Encapsulation Specification, and the Computing Environment Services. |
CAE | see Computer-Aided Engineering |
CAGR | Compound Annual Growth Rate |
CAI | Computer-Aided Inspection |
CAIT | Computer-Aided Inspection and Test |
CAIV | see Cost as an Independent Variable (DoD initiative) |
CAM | 1. see Computer-Aided Manufacturing 2. Cost Account Manager |
Capability | Capability is a measure of the ability of a system to perform within its specification limits. It uses a series of indices: Cp, Cpk, Cr, and Cpm. |
Capability Maturity Model | Capability Maturity Model (CMM) is a model of five levels of process maturity developed by the Software Engineering Institute (SEI) at Carnegie Mellon University for software development processes. These five levels, starting at level one, are: initial (ad hoc), repeatable, defined, managed, and optimizing. See Product Development Capability Maturity Model for our adaptation of the CMM to product development. |
CAPP | see Computer-Aided Process Planning |
CASE | 1. see Computer-Aided Software Engineering 2. Computer-Aided Systems Engineering |
CAT | Computer-Aided Test |
CCA | Circuit Card Assembly |
Cell | 1. An individual component of a technology library. Typically a logic gate (for example, a 2-input NAND gate). 2. Manufacturing cell is a grouping of equipment to perform the required processing for a part or assembly. |
Commercialization | Commercialization is the process of taking a new product from development to full volume sales. It includes steps such as testing and market validation, production launch and ramp-up, development of marketing programs and materials, supply chain development, sales channel development, training development, training, and service and support development. |
CCB | 1. Change Control Board 2. Configuration Control Board |
CCD | Configuration Control Drawing |
CCM | see Critical Chain Method |
CCPM | Critical Chain Project Management. see Critical Chain Method |
CDR | Critical Design Review |
CDRL | Contract Data Requirements List |
CE | 1. see Concurrent Engineering 2. Chief Engineer 3. Concept Exploration 4. Concept Engineering |
CER | Cost Estimating Relationship |
Certification | 1. A process, which may be incremental, by which a contractor provides evidence to the acquirer that a product meets contractual or otherwise specified requirements. 2. The approval by a regulatory or standards body that a product meets the applicable requirements or standards. |
CFD | see Computational Fluid Dynamics |
CFI | see CAD Framework Initiative |
CFT | see Cross-Functional Team |
Change Management | Change Management is a systematic approach to dealing with change, both from the perspective of an organization and on the individual level. Change management has at least three different aspects including: adapting to change, controlling change, and effecting change. A proactive approach to dealing with change is at the core of all three aspects. For an organization, change management means defining and implementing procedures and/or technologies to deal with changes in the business environment and to profit from changing opportunities. |
Charter | Charter is a written commitment approved by management stating the scope of authority for a development project or integrated product team. |
Check-In | The process of placing or returning new or modified product information under control within a PDM/PIM system. If a revision is being created, this procedure usually initiates a review/approval process under control of the PDM/PIM system. |
Check-Out | The process of accessing managed product definition information under controlled procedures. Access may be for viewing, reference, for use in another application or task, or for making a change to the information. The PDM/PIM system prevents multiple, simultaneous change activities to ensure product information integrity. |
Chip-on-Board | Chip-on-Board is a component packaging technology in which bare integrated circuits are attached directly to the substrate and interconnected by means of microscopic wires. |
CI | 1. Continuous Improvement 2. see Configuration Item |
CIM | Computer-Integrated Manufacturing |
CIME | Computer-Integrated Manufacturing and Engineering |
CITIS | Contractor Integrated Technical Information Services (CALS initiative) |
Classification | Classification is the assignment of attributes and other defining meta-data to managed objects and information within a PDM system. This meta-data is then used for finding data with similar characteristics. |
Clinical Trial | Clinical Trial is testing a system in a clinical setting; that is, in a hospital, clinic, doctor’s office, etc. User testing and feature testing in such an environment has special limitations, especially because of the potential for unexpected effects on patient care. User testing in clinical settings will often require review by an ethics committee to ensure that patient privacy is not compromised and that no harm will come to patients as a result of testing. As with drug testing, it may not be appropriate to remove a feature from a system for the sake of testing if it appears that the feature is directly benefiting patient care. |
Cloud of Points | A set of x-y-z coordinates obtained from a 3D scanner or digitizer. The data can be interpreted as a continuous surface and used in a 3D model. This is often used for reverse engineering. |
CM | see Configuration Management |
CMM | 1. see Coordinate Measuring Machine 2. see Capability Maturity Model |
CNC | Computer Numerical Control |
CND | Cannot Duplicate (failures). Also known as “No Trouble Found” (NTF) |
COB | see Chip-on-Board |
Cognitive Modeling | Cognitive Modeling produces a computational model for how people perform tasks and solve problems, based on psychological principles. These models may be outlines of tasks written on paper or computer programs which enable us to predict the time it takes for people to perform tasks, the kinds of errors they make, the decisions they make, or what buttons and menu items they choose. Such models can be used to find ways of improving the user interface so that a person’s task has fewer errors or takes less time, and they can be built into the user interface to make software that reacts more effectively by anticipating user behavior. |
Cognitive Walkthroughs | Cognitive walkthroughs involve the development of task scenarios from a product specification. Experts then role-play the part of a user working through a set of tasks. Each step of the user’s process is evaluated for adherence to established usability principles. |
Collaboration | 1. Working together, cooperating 2. A process of maximizing both cooperative and assertive behavior to satisfy two parties in conflict with one another. |
Collaborative Product Commerce | The Aberdeen Group defines Collaborative Product Commerce (CPC) as “…a class of software and services that uses Internet technologies to permit individuals – no matter what role they have in the commercialization of a product, no matter what computer-based tools they use, no matter where they are located geographically or within the supply net – to collaboratively develop, build, and manage products throughout the entire lifecycle. Using a standard browser, an authorized CPC user can review information from an extended-enterprise information system ‘view’ that operates across a dispersed set of heterogeneous product development resources. These resources typically reside in multiple information repositories and are derived from independently implemented and maintained systems.” |
Collocation | The practice of physically locating multi-function integrated product team members in proximity to one another to enhance communication, coordination and decision-making on a development project. Virtual collocation refers to the use of technology to achieve some of the communication benefits for team members that are geographically dispersed. |
Common Cause | Common Cause is a variation that is inherent in the process and cannot be readily identified and controlled. |
Competitive Intelligence | Methods and activities for transforming disaggregated public competitor information into relevant and strategic knowledge about competitors’ position, size, efforts and trends. The term refers to the broad practice of collecting, analyzing, and communicating the best available information on competitive trends occurring outside one’s own company. |
Component Engineering | The application of engineering know-how to the processes of component selection, application, process compatibility and procurement, including analysis of new trends in electronic devices. |
Component Supplier Management | Component Supplier Management (CSM) is a class of software applications that maintain information about standard components, both purchased and made, to support various functional disciplines such as design, procurement, materials, configuration management, and manufacturing. This system serves as a central repository for component and supplier information to maximize standardization, design retrieval and re-use, and procurement efficiency. CSM systems contain four major elements: part classification and retrieval, component libraries, Web component cataloging, and component/supplier process management. |
Computational Fluid Dynamics | Computational Fluid Dynamics is the numerical analysis of fluid and gas flow, heat transfer, and related phenomena. CFD solvers contain a complex set of algorithms used for modeling and simulating the flow of fluids, gases, heat, and electric currents. |
Computer-Aided Design | Computer-Aided Design (CAD) is the use of a computer to assist in the creation and modification of a design, most commonly, designs with a heavy engineering content. |
Computer-Aided Engineering | Computer-Aided Engineering (CAE) is the use of computers in design, analysis, and manufacturing of a product, process, or project. Sometimes refers more narrowly to the use of computers only in the analysis stage. |
Computer-Aided Manufacturing | Computer-Aided Manufacturing (CAM) is the use of the computer description of the part or assembly to drive planning, cutting, forming, assembly and inspection of the item via computerized applications. |
Computer-Aided Process Planning | Computer-Aided Process Planning uses part data and process rules to generate process plans or work instructions. Variant CAPP is based on group technology classification of parts and part features to search for a predetermined similar process plan that most closely matches the classification. Generative CAPP uses part and feature classification along with rules and knowledge about manufacturing processes associated with features to generate an appropriate process plan. |
Computer-Aided Software Engineering | Computer-Aided Software Engineering (CASE) is the application of computer technology to facilitate the development of software. CASE tools usually include libraries of reusable code (modules of software that can be easily modified for specific tasks), programmer productivity tools, application generators, and testing utilities. CASE tools also provide requirement management, structured system design and analysis, system simulation, test management, documentation generation, etc. |
Computer Software Configuration Item | Computer Software Configuration Item (CSCI) is a software component of a system, which is designated for configuration management to ensure configuration integrity. It may exist at any level in the hierarchy where interchangeability is required. Each CSCI is to have (as appropriate) individual design reviews, individual qualification/certification, individual acceptance reviews, and separate user manuals. |
Concept | An idea for a new product or system that is represented in the form of a written description, a sketch, block diagram or simple model. A concept is the earliest representation of a new product or of alternative approaches to designing a new product. |
Concept Model | A physical model or representation intended primarily for design review, product conceptualization and customer feedback. This model is usually not sufficiently accurate or durable for full functional and physical testing. |
Concept Testing | The process by which a concept statement, sketch or model is presented to customers for their reactions. These reactions can either be used to permit the developer to estimate the sales value of the concept or to make changes to the concept to enhance its potential sales value. |
Conceptual Architecture | The Conceptual Architecture represents an appropriate decomposition of the system without delving into the details of interface specification. The conceptual architecture identifies the system components or subsystems, the responsibilities of each component or subsystem, and interconnections between components or subsystems. |
Concurrency | The degree to which phases, stages, or activities may be overlapped or done in parallel. |
Concurrent Engineering | A systematic approach to the integrated, concurrent design of products and their related processes, including manufacture and support. This approach is intended to cause the developers, from the outset, to consider all elements of the product life cycle from conception through disposal, including quality, cost, schedule and user requirements. |
Configuration | A collection of an item’s descriptive and governing characteristics, which can be expressed in a) functional terms, i.e. what performance the item is expected to achieve, and b) physical terms, i.e. what the item should look like and consist of when it is completed. |
Configuration Item | Configuration Item (CI) is a hardware, software, or composite item that has a defined function, can be at any level in the system hierarchy, and is designated for configuration management. |
Configuration Management | Configuration Management (CM) is the process of managing a product’s requirements and design documentation as it evolves and changes over its lifecycle (from requirements definition through production, operation, support and disposal) and assuring that the resulting products and processes conform to this documentation. Configuration Management functions include maintaining the configuration status of a document, product and process; reporting on this configuration; controlling changes to this configuration (see Engineering Change Control); and verifying that the resulting configuration of the product or process corresponds with that intended in its underlying documentation. |
Conformance Testing | The testing of a candidate product for the existence of specific characteristics required by a standard in order to determine the extent to which that product is a conforming implementation. |
Conjoint Analysis | A methodology for exploring and describing subjective customer views of product features. Conjoint analysis avoids direct questioning, e.g., “what do you think of the price of our product?” Instead, the customer is asked what they are willing to pay for a particular product feature. Thus, the real buying situation with consideration of different cost-benefit alternatives is simulated. The resulting analysis shows directly the contribution of each product feature to the total product utility. Conjoint analysis can be used to determine to what extent a product’s perceived utility changes if some particular product feature is modified. |
Consensus | Consensus is a group decision resulting from members engaging in full and open discussion and then reaching agreement to live with and openly support the resulting decision. |
Constraints | 1. As related to CAD, these are values in a geometric model that define relationships between entities such as planes, surfaces, points, lines, arcs, centers, edges, etc. Constraints are used to fully define a model and to drive parametric or variational geometry systems. The algorithms used to work with constraints are known as constraint management 2. Restrictions or boundaries impacting overall capability, priority, and resources. |
Contextual Analysis / Contextual Inquiry | Contextual Analysis / Contextual Inquiry is a structured field evaluation method which uses a combination of methodologies derived from anthropology and journalism. By observing and interviewing users of products in their actual environment and understanding the context in which a product is used, better insight is gained into the issues that affect product use. Contextual analysis / contextual inquiry is a discovery process that can add insight into the needs of customers. |
Contingency | A Contingency is the planned allotment of time, cost, budget or design margin for unforeseeable elements or risks with a development project. |
Contingency Design | Contingency Design is a form of mistake-proofing focusing on the user’s experience with the product. The intent is to design in features that help the user avoid mistakes or allow the users to quickly correct input of data or operation of the product. This is accomplished through layout and graphic design, intuitive operation, clear instructions, appropriate markings and warnings, descriptive error messages, avoidance of technical jargon, and simple operation steps. |
Control Chart | A graphical display of results of a process over time. They are used to determine if a process is in statistical control or in need of adjustment. |
Control Limits | In statistical process control (SPC), two horizontal lines are drawn on a control chart denoting the upper control limit (UCL) and the lower control limit (LCL). The sample means and the ranges from a production lot must be within these limits. If they are, the process is behaving normally and is said to be under control. If any point lies outside either of the limits, this denotes loss of control – the process must be halted and the reason found. |
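A minimal sketch of 3-sigma limits for a chart of sample means, using made-up subgroup averages; production SPC more commonly uses tabulated control-chart constants (e.g., A2, D3, D4) with the average subgroup range:

    import statistics

    # Made-up subgroup means from a production run.
    subgroup_means = [10.02, 9.98, 10.05, 9.97, 10.01, 10.03, 9.99, 10.00]

    grand_mean = statistics.mean(subgroup_means)
    sigma = statistics.stdev(subgroup_means)     # std. deviation of the subgroup means

    ucl = grand_mean + 3 * sigma
    lcl = grand_mean - 3 * sigma
    print(f"LCL = {lcl:.3f}, center line = {grand_mean:.3f}, UCL = {ucl:.3f}")

    # Any point outside [LCL, UCL] signals loss of control.
    out_of_control = [x for x in subgroup_means if not (lcl <= x <= ucl)]
    print("points signaling loss of control:", out_of_control)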
Control Plans | Control Plans are written descriptions of the systems for controlling parts, assemblies, products, and processes. They are written to address the important characteristics and engineering requirements of the product. Each part or assembly should have a Control Plan, but in many cases, “family” Control Plans can cover a number of parts produced using a common process. |
Cooperative Design | see Participatory Design |
Coordinate Measuring Machine | Coordinate Measuring Machine (CMM) is a device that dimensionally measures 3-D products, tools and components with an accuracy approaching 0.0001 in. It is used for both inspection and reverse engineering. |
COQ | see Cost of Quality |
Core Competencies | Core Competencies are the essential capabilities that create a firm’s sustainable competitive advantage. |
Corrective Action | Corrective Action is an action taken to eliminate the causes of an existing nonconformity or other undesirable situation in order to prevent recurrence. |
Cost as an Independent Variable | Cost as an Independent Variable (DoD initiative) – an acquisition strategy of obtaining the best available product/system within the constraints of available resources. Cost performance and schedule trades are made to achieve this balance with budget. |
Cost Benefit Ratio | The ratio of the present value of benefits to the present value of costs. |
Cost Drivers | 1. Those elements of cost which significantly impact the product/system’s cost. 2. Any factor that causes a change in the cost of an activity. An activity may have multiple cost drivers associated with it. |
Cost Estimating Relationship | Cost Estimating Relationship is an equation that defines the relationship of an independent variable or product parameter (e.g., product weight, speed, etc.) to its related cost or price. Cost estimating relationships are the basis of parametric cost estimating techniques. |
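A minimal sketch of fitting a power-law CER of the form cost = a · weight^b to hypothetical historical data by least squares in log-log space; all figures are assumed for illustration:

    import numpy as np

    # Hypothetical historical data: product weight (kg) vs. unit cost ($).
    weights = np.array([10, 25, 50, 100, 200])
    costs   = np.array([1200, 2300, 3900, 6500, 11000])

    # Fit cost = a * weight**b via linear regression on the logarithms.
    b, log_a = np.polyfit(np.log(weights), np.log(costs), 1)
    a = np.exp(log_a)

    def estimate_cost(weight):
        """Parametric cost estimate from the fitted CER."""
        return a * weight ** b

    print(f"CER: cost ~ {a:.0f} * weight^{b:.2f}")
    print(f"estimated cost for a 75 kg design: ${estimate_cost(75):,.0f}")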
Cost Model | A Cost Model is an estimating tool consisting of one or more cost estimating relationships, estimating methodologies, or estimating techniques used to predict the cost of a system or one of its lower level elements. |
Cost of Quality | All costs expended for appraisal costs, prevention costs, and both internal and external failure costs of activities and cost objects. |
Cost Reduction | A formal activity employed to rectify a cost target breach or to reduce the cost of an existing product or design. A cost reduction effort has a specific quantified objective and may affect schedule, performance or support to achieve this objective. |
Cost Table | A Cost Table is a multidimensional data base in which cost is captured for several levels of a number of attributes for either the parts or functions of a product. Cost tables are used to develop early estimates of the cost of a design based on product or part parameters or functions and different materials and manufacturing processes and methods. Cost tables have been primarily used by Japanese companies. |
COTS | Commercial Off-The-Shelf |
Cp | Cp is a capability index that tells how well a system can meet two-sided specification limits, assuming that the average is centered on the target value. Cp is the ratio of the specification range to the process capability at plus or minus 3 sigma. |
CPC | see Collaborative Product Commerce |
CPD | Concurrent Product Development (Synonymous with concurrent engineering or integrated product development. See Integrated Product Development) |
CPLD | Complex Programmable Logic Device – contains more than 1,000 gates and 44 or more pins. |
CPI | Continuous Process Improvement |
Cpk | Cpk is a capability index that tells how well a system can meet two-sided specification limits when the process average is not centered between them. Cpk is the ratio of the distance from the process average to the nearer specification limit to three times the process standard deviation; it equals Cp when the process is centered and is lower when the mean is shifted. |
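A minimal sketch computing Cp and Cpk from sample data, with made-up measurements and specification limits:

    import statistics

    # Hypothetical measurements and two-sided specification limits.
    data = [10.02, 9.98, 10.05, 9.97, 10.01, 10.03, 9.99, 10.06, 10.00, 10.04]
    lsl, usl = 9.85, 10.15

    mean = statistics.mean(data)
    sigma = statistics.stdev(data)

    cp  = (usl - lsl) / (6 * sigma)                    # assumes a centered process
    cpk = min(usl - mean, mean - lsl) / (3 * sigma)    # penalizes an off-center mean

    print(f"Cp = {cp:.2f}, Cpk = {cpk:.2f}")
    # Cpk noticeably lower than Cp indicates the process average has shifted
    # away from the midpoint of the specification range.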
CPM | Critical Path Method – A method for determining the minimum project duration by identifying the critical path based on task interrelationships and duration. It assumes there is no wasted time for the activities that are on the critical path. |
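A minimal sketch of the forward pass used to find the minimum project duration for a small hypothetical task network (tasks, durations, and dependencies are assumed for illustration):

    # Hypothetical network: task -> (duration, list of predecessor tasks).
    tasks = {
        "A": (3, []),
        "B": (5, ["A"]),
        "C": (2, ["A"]),
        "D": (4, ["B", "C"]),
    }

    memo = {}

    def earliest_finish(task):
        """Forward pass: earliest finish = duration + latest predecessor finish."""
        if task not in memo:
            duration, preds = tasks[task]
            memo[task] = duration + max((earliest_finish(p) for p in preds), default=0)
        return memo[task]

    project_duration = max(earliest_finish(t) for t in tasks)
    print("minimum project duration:", project_duration)
    # Here the chain A -> B -> D has no slack, so it is the critical path (3 + 5 + 4 = 12).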
Crashing | Taking action to decrease the total project duration by analyzing a number of alternatives to determine how to get the maximum duration compression for the least cost. Often, it involves reducing the time it takes to complete an activity by adding resources. |
Creeping Elegance / Featurism | The tendency for designers to add more capability, functions and features to a product as it is being developed than were originally intended. These actions cause a product’s cost to increase beyond the target, the schedule to slip and can detract from usability. |
Critical Chain Method | Critical Chain Method is a project scheduling and management methodology developed by Eliyahu Goldratt based on concepts from the Theory of Constraints. With Critical Chain scheduling, uncertainty is primarily managed by (a) using average task duration estimates; (b) scheduling backwards from the date a project is needed (to ensure work that needs to be done is done, and it is done only when needed); (c) placing aggregate buffers in the project plan to protect the entire project and the key tasks; and (d) using buffer management to control the plan. |
Critical Characteristics | The characteristics or specifications for a material, part, assembly or product that define those attributes that are essential to the proper fit or functioning of the item to satisfy the intended customer use or need. |
Critical Path | In a project network diagram, the critical path is the one with the longest duration. The critical path may change from time to time as activities are completed ahead of or behind schedule. (see CPM) |
Critical-to-Function (CTF) | A subset of drawing/model parameters that are critical to function and have tolerances and/or datums different from the standard tolerances or datum. As a result, these parameters will usually have tolerances and datums specifically defined on a drawing or in a model. In the absence of dimensional drawings, CTF dimensions are a means of communicating dimensions critical to success of the design, tolerance and other non-geometrical information. This approach is generally simpler than a complete fabrication drawing because of fewer dimensions. |
Critical to Quality | Critical to Quality (CTQ) characteristics are the key measurable characteristics of a product or process whose performance standards or specification limits must be met in order to satisfy the customer. They align improvement or design efforts with customer requirements. Also see critical characteristics. |
Cross-Functional Team | Cross-Functional Team is a team consisting of representatives from marketing, engineering, manufacturing, finance, purchasing, test, quality, and any other required disciplines with responsibility for developing a product or product subsystem. This team is empowered to represent the functional disciplines and develop a product by addressing its life cycle requirements including its production and support. |
CSCI | see Computer Software Configuration Item |
C/SCS | Cost/Schedule Control System is a performance measurement system that uses earned value techniques to break down the variance between budgeted and actual cost into cost and schedule variance components. |
CSG | Constructive Solids Geometry – a solid modeling method using primitives to build more complex models and Boolean operations of add, difference, and intersection. |
CSM | see Component Supplier Management |
CTF | see Critical-to-Function |
Cumulative Tolerance | Progressive accumulation of tolerances resulting from multiple operations or assembly of multiple parts. |
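A minimal sketch comparing a worst-case stack-up with a statistical (root-sum-square) stack-up for a hypothetical stack of parts; the tolerance values are assumed for illustration:

    import math

    # Hypothetical stack of four parts, each with a symmetric tolerance (mm).
    tolerances = [0.10, 0.05, 0.08, 0.12]

    # Worst case: every part is at its tolerance extreme simultaneously.
    worst_case = sum(tolerances)

    # Root-sum-square: assumes independent, centered variation in each part.
    rss = math.sqrt(sum(t ** 2 for t in tolerances))

    print(f"worst-case stack-up: +/-{worst_case:.2f} mm, RSS stack-up: +/-{rss:.2f} mm")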
Customer Need | A fundamental need to be satisfied independent of a particular technology or product solution (e.g., access the internet.) |
Data Dictionary | Data Dictionary – a definition of data elements for uses such as information engineering or quality function deployment. |
Data Flow Diagram | Data Flow Diagram – a structured system design representation of processes and the data flows that connect the processes. |
Data Interchange | Data Interchange refers to the ability to exchange and use product data between various types of CAD/CAM/CAE systems. Data interchange can be accomplished by (listed in order of maximum interchange of product model intent): use of the same CAD system, use of the same CAD kernel (e.g., ACIS, parasolid, etc.), dedicated translator between two CAD systems, and use of neutral file formats (e.g., STEP, IGES, etc.). |
Datum | Theoretically exact planes, lines or points from which other features are located on design drawings. |
DBF | Design by Features |
DBT | Design Build Team – term used by Boeing and others synonymous with integrated product team. A multi-function or cross-functional team with responsibilities for requirements definition, product and process design, and production launch of a new product. |
DCF | see Discounted Cash Flow |
Decomposition | Decomposition is the process of dividing the system into its smallest, coherent, self-contained elements. Decomposition is used in systems engineering, software engineering, process mapping and functional analysis system technique. |
Defects Per Million Opportunities | Defects Per Million Opportunities (DPMO) is a quality measurement where defects are defined as material- or process-related, and opportunities for defects are defined as the sum of all parts, lead attachments and/or glue dots as applicable for each part number assembled. |
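A minimal DPMO calculation for hypothetical assembly data (all figures assumed for illustration):

    # Hypothetical board assembly data.
    units_built = 5_000
    opportunities_per_unit = 250      # parts + lead attachments + glue dots
    defects_observed = 180

    dpmo = defects_observed / (units_built * opportunities_per_unit) * 1_000_000
    print(f"DPMO = {dpmo:.0f}")
    # For reference, 3.4 DPMO is the conventional long-term benchmark
    # associated with Six Sigma performance.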
Defect Tracking | Defect Tracking typically refers to identifying flaws appearing during manufacturing. The goal is not only to spot defects but to track them and identify the source of the problem to prevent them. Were specs incomplete, impractical, or poorly specified? Was the design difficult to produce? Were needed process capabilities lacking? |
Derived Requirements | Requirements that are not explicitly stated in the customer requirements, but are inferred a) from contextual requirements (e.g., applicable standards, laws, policies, common practices, and management decisions), or b) from requirements needed to specify a product component. Derived requirements can also arise during analysis and design of components of the product or system. |
Design Failure Modes and Effects Analysis | Design Failure Modes and Effects Analysis (DFMEA) – a form of FMEA associated with the product design (see Failure Modes and Effects Analysis). |
Design for Assembly | Design for Assembly (DFA) refers to the principles of designing assemblies so that they are more manufacturable. DFA principles address general part size and geometry for handling and orientation, features to facilitate insertion, assembly orientation for part insertion and fastening, fastening principles, etc. The objective of DFA is to reduce manufacturing effort and cost related to assembly processes. |
Design for Disassembly | Design for Disassembly (DFD) is a set of principles used to guide designers in designing products that are easy to disassemble for recycling, remanufacturing, or servicing. |
Design for Environment | Design for Environment is a process for the systematic consideration during design of issues associated with environmental safety and health over the entire product life cycle. DFE can be thought of as the migration of traditional pollution prevention concepts upstream into the development phase of products before production and use. |
Design for Manufacturability | 1. Design for Manufacturability (broad definition) is a methodology for designing products in a way that facilitates the fabrication of the product’s components and their assembly into the overall product. In this respect it is synonymous with Design for Manufacturability / Assembly. 2. Design for Manufacturability (narrow definition) is a methodology for designing a product’s components in a way that facilitates their fabrication. |
Design for Manufacturability / Assembly | Design for Manufacturability / Assembly (DFM/A) is the broad definition of optimizing a product’s design to make its parts more manufacturable (fabrication) and easier to assemble. DFM/A includes: understanding the organization’s process capabilities, obtaining early manufacturing involvement, using formalized DFM/A guidelines, using DFM/A analysis tools, and addressing DFM/A as part of formal design reviews. |
Design for Postponement | With a product that offers many different configurations and options, an objective is to delay or postpone the assembly of the unique parts or subassemblies into the product until as late as possible in the final assembly process. This reduces production leadtime (products can be built and stocked up to this point), facilitates a strategy of mass customization, and provides supply chain flexibility. Design for postponement is the design of an assembly in a way that allows the customizable parts of the product to be assembled as late as possible in the assembly process. |
Design For Reliability | Design For Reliability (DFR) is a methodology and set of principles to enhance product reliability and reduce overall life-cycle costs. It is based on early involvement of reliability engineering working with design engineering to enhance reliability by performing steps such as the following: reliability program planning, reliability predictions, parts derating, thermal analysis, failure modes and effects analysis (FMEA), fault tree analysis (FTA), availability and system modeling, HALT/HASS, design verification testing, product return rate analysis, FRACAS, and root cause failure analysis. |
Design for Serviceability | Design for Serviceability (DFS) is a set of principles and a methodology for analyzing product concepts or designs for characteristics and design features which reduce service requirements and frequency, facilitate diagnosis, and minimize the time and effort to disassemble, repair/replace, and reassemble the product as part of the service process. |
Design for Six Sigma | Design for Six Sigma (DFSS) is a systematic methodology or quality framework utilizing tools, processes and measurements to enable the design of products and processes that meet customer expectations and can be produced at Six Sigma quality levels. DFSS is built around five connected phases of Define, Measure, Analyze, Design, Verify or DMADV. The tools and methods to support DFSS include voice of the customer (VOC), quality function deployment (QFD), design for manufacturability and assembly, design of experiments (DOE), failure mode and effects analysis (FMEA), process capability studies, design reviews, control plans, etc. |
Design Intent | The intended form, fit, function, and characteristics of a product or its constituent parts. |
Design of Experiments | A statistical methodology for designing, conducting and analyzing experiments or tests to evaluate product and process design parameters or factors that affect the achievement of a product performance characteristic. The response of interest is evaluated under the various conditions to: (1) identify the influential variables among the ones tested, (2) quantify the effects across the range represented by the levels of the variables, (3) gain a better understanding of the nature of the causal system at work in the process, and (4) compare the effects and interactions. These experiments lead to setting parameter or factor levels (values) that can optimize the product performance characteristic under study and minimize the effects of variation. There are several techniques including Taguchi Methods, fractional factorial, and Plackett-Burman. DOE’s are often classified in one of three categories: Screening Designs, which are intended to identify which main effects (factors) are the vital few important factors that require further study (usually a fractional factorial); Characterization Designs, which are used to gain some quantitative understanding of the relationships among the factors, including interactions, on the response variable (usually a factorial); and Optimization Designs, which are used to gain a precise understanding of the mathematical relationships that is sufficient to allow prediction and optimization throughout the experimental region. |
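A minimal sketch generating the run matrix for a two-level full factorial experiment; the factors and their levels are hypothetical:

    from itertools import product

    # Hypothetical factors and their low/high levels for a 2^3 full factorial design.
    factors = {
        "temperature": (150, 200),   # degrees C
        "pressure":    (30, 50),     # psi
        "cure_time":   (10, 20),     # minutes
    }

    # Each run is one combination of factor levels; 2 levels x 3 factors = 8 runs.
    runs = [dict(zip(factors, levels)) for levels in product(*factors.values())]

    for i, run in enumerate(runs, start=1):
        print(f"run {i}: {run}")
    # A fractional factorial or Plackett-Burman design would run only a subset
    # of these combinations, trading resolution of interactions for fewer tests.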
Design Optimization | 1. Design Optimization in the broad sense refers to optimizing a design to meet its functional, environmental and lifecycle requirements at a minimum of cost. 2. Design Optimization in a narrower sense refers to the use of computer-aided engineering applications which analyze a design and, given constraints and objectives, seek to improve or optimize the design to meet the stated objectives within the stated constraints. These applications will typically use an iterative, goal-seeking cycle to seek design optimization. |
Design Reviews | Design reviews are formal technical reviews conducted during the development of a product to assure that the requirements, concept, product or process satisfies the requirements of that stage of development, the design is sound, the issues are understood, the risks are being managed, any problems are identified, and needed solutions proposed. Typical design reviews include: requirements review, concept/preliminary design review, final design review, and a production readiness/launch review. |
Design Structure Matrix | Design Structure Matrix – matrix used to represent and analyze task dependencies in a product development project / process. |
Design-to-Cost | A development methodology that treats cost as an independent design parameter. A realistic cost objective is established based on customer affordability, tradeoffs are made between the cost objective and other product functions/parameters, cost models are used to project the cost early in the development cycle, and a variety of techniques such as function analysis and DFM are used to proactively achieve the cost objective. |
Design to Life-Cycle Cost | This represents the totality of design-to-cost addressing all costs related to acquisition, operation, support and disposal. |
Design to Unit Production Costs | The average unit production costs for producing the specified hardware lot(s) which generally includes recurring material, labor and overhead costs; engineering change costs; program management costs; and production support costs. |
Design Validation | Testing to assure that the product conforms to defined user needs and requirements. This normally occurs toward the end of the Design Phase following successful design verification and prior to pilot production, beta/market testing, and product launch. Design validation is normally performed on the final product under defined, operating conditions. Multiple validations may be performed if there are different intended uses. See Validation. |
Design Verification | Design verification is the process of ensuring the design conforms to specification (design outputs meet design input requirements). Design verification may include: alternate calculations, design reviews, comparison to similar designs, inspection, and system or product testing. |
Detailed Design | The conversion of product specifications into designs and their associated process and/or code-to documentation. Detailed design includes design capture, modeling, analysis, developmental testing, documentation, process design, producibility analysis, test plan development, coding, and design verification and validation. |
Deterministic | An approach that presumes the presence of fixed constraints. |
DFA | see Design for Assembly |
DFAA | Design for Automated Assembly |
DFD | 1. see Design for Disassembly 2. see Data Flow Diagram |
DFE | see Design for the Environment |
DFM | see Design for Manufacturability |
DFM/A | see Design for Manufacturability / Assembly |
DFMEA | see Design Failure Modes and Effects Analysis |
DFMt | Design for Maintainability |
DFSS | see Design for Six Sigma |
DFT | Design for Test |
DFX | Design for Excellence – designing to consider all relevant life cycle factors such as manufacturability, reliability, maintainability, testability, affordability, etc. |
Diagnosability | The ability to uniquely identify any faults (or potential faults) in the behavior or operation of the product. Diagnosability would indicate not only what the fault was, but also what failed or caused the failure (e.g., module, component, line of code, etc.). |
Digital Mock-up | Solids modeling capabilities that enable complete products to be built in electronic form. The mockups can be used to check for problems such as interference and clashes between components. Using digital mockups reduces the cost and time of development since physical models do not need to be built. Synonymous with digital pre-assembly, electronic mock-up, and assembly modeling. |
Direct Costs | Cost that can be specifically identified or traced to an activity, cost object or final cost objective. |
Direct Digital Manufacturing | The process of going directly from an electronic digital representation of a part or item to the final part or item via additive fabrication. See Additive Fabrication. |
Directed Evolution | Directed Evolution is an advanced TRIZ methodology used to create scenarios to support the planning and development of future generations of technical systems. |
DIS | Draft International Standards (International Standards Organization) |
Discontinuous Innovation | Discontinuous innovation falls outside of existing markets or market segments, and when successful extends and redefines the market, exposing new possibilities. Discontinuous innovation is characterized by lateral or divergent thinking, by looking outside of defined boundaries, and by discovery of new knowledge related to both market need and technological capability. |
Discounted Cash Flow | Discounted Cash Flow – an analysis technique that determines the present value of a series of positive and negative cash flows using a specified discount factor representing the cost of capital. This can be used to compare investment alternatives such as new product development alternatives. |
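To illustrate, the following is a minimal net present value calculation over a hypothetical stream of product cash flows, assuming a 12% cost of capital.

```python
# A minimal sketch of a discounted cash flow (net present value) calculation.
# The cash flows and discount rate are hypothetical.
def npv(rate, cash_flows):
    """Present value of a series of cash flows; cash_flows[0] occurs now (t = 0)."""
    return sum(cf / (1.0 + rate) ** t for t, cf in enumerate(cash_flows))

# Year 0: development investment; years 1-4: net cash inflows from the new product.
flows = [-500_000, 120_000, 220_000, 260_000, 240_000]
print(f"NPV at 12% cost of capital: {npv(0.12, flows):,.0f}")
```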
DMAIC | Define, Measure, Analyze, Improve and Control – a Six Sigma improvement methodology |
DMADV | DMADV is a data driven quality strategy for designing products and processes that is an integral part of a Six Sigma quality initiative. It consists of five interconnected phases: define, measure, analyze, design and verify. |
DMIS | Dimensional Measurement Interface Specification (ANSI standard) |
DNC | Distributed Numerical Control |
DOA | Dead on Arrival. Products that don’t operate when received and first used by a customer. |
DoD | Department of Defense |
DOE | 1. see Design of Experiments 2. Department of Energy |
DPA | Digital Pre-Assembly – a term for electronic mock-up performed with CAD solids modeling. |
DPM | Defects per Million |
DPMO | see Defects Per Million Opportunities |
DRM | 1. Drawing Requirements Manual 2. Drafting Room Manual 3. DRM Associates (product development consulting and training firm) |
DSM | 1. Deep Sub-Micron design – relates to the design of integrated circuits with feature sizes less than .5um. 2. see Design Structure Matrix |
DSS | Decision Support System |
DTC | Design to Cost is a development methodology that treats cost as a design parameter. A realistic cost objective is established based on customer affordability, cost models are used to project the cost early in the development cycle, and a variety of techniques such as value analysis and DFM are used to proactively achieve the cost objective. |
DTLCC | Design to Life Cycle Cost |
DTUPC | see Design to Unit Production Cost |
Durability | 1. The probability that an item will continue to function at customer expectation levels throughout its useful life without requiring overhaul or rebuild due to wear-out. 2. The ability of a product and any of its components to perform the required functions in its intended service environment over its intended service life without unforeseen cost of maintenance and repair. |
DUT | Device Under Test |
DXF | Data Exchange Format – format for CAD drawings often used to transfer CAD data from one system or program to another. |
Dynamic Analysis | Dynamic Analysis is the analysis of a mechanism’s motions that result from forces. Dynamic simulation is more complex than Kinematic Analysis because the problem needs to be further defined and more data is needed to account for the forces. But Dynamic Analysis is often required to accurately simulate the actual motion of a mechanical system. Generally, Kinematic Analysis helps evaluate form, while Dynamic Analysis assists in analyzing function. (Also see Kinematic Analysis.) |
EAC | Estimate at Completion |
Early Adopter | Early Adopter is a person or organization who chooses to purchase or use relatively new technology before it is fully embraced by the mass market. Early adopters are therefore people or organizations who have a stronger need for the technology, a lower reluctance to use it, or the ability to overcome barriers to adopting it. |
Early Supplier Involvement | Early Supplier Involvement is the process of getting the supplier involved early in the development process (when an item is being conceptualized, designed or specified) so that the supplier can make proactive suggestions to improve the design and reduce its cost vs. providing reactive feedback once the design has been completed. |
EC | 1. Engineering Change 2. Electronic Commerce 3. European Community |
ECAD | Electrical/Electronic Computer-Aided Design |
ECAE | Electrical/Electronic Computer-Aided Engineering |
ECCB | Electronic Component Certification Board |
ECN/ECO | see Engineering Change Notice / Engineering Change Order |
ECP | Engineering Change Proposal |
EDA | see Electronic Design Automation |
EDB | Electronic Data Book |
EDI | Electronic Data Interchange (ANSI-X.12) – EDI is the exchange, between organizational entities, of computer processable data in a standard format. The 841 transaction is used to transfer technical data. |
EDIA | Electronic Data Interchange Association |
EDIF | see Electronic Design Interchange Format |
EDM | 1. Engineering Data Management 2. Engineering Document Management 3. Electronic Document Management 4. Electrical Discharge Machining |
EDMS | 1. Electronic Document Management System 2. Engineering Data Management System |
EEPROM | Electrically Erasable Programmable Read Only Memory |
Effectivity | An indicator in a product structure which specifies the versions at which a component part is used. These indicators generally specify a range of either dates, serial numbers, or build lots. Effectivity indicators are typically considered as ‘conditions’ on the parent-child relationships in a product structure. |
Effectivity Date | Effectivity Date is the date from which an intended engineering change is to come into effect (new product configuration) or the past configuration of the product is to go out of effect. |
EIA | Electronic Industries Association |
EIS | 1. Engineering Information System 2. Executive Information System |
Electronic Design Automation | Electronic Design Automation (EDA) consists of hardware and software tools to aid in the design and development of electronic products through design capture, simulation, synthesis, verification, analysis, and testing. |
Electronic Design Interchange Format | An EIA/ANSI standard which defines the file format for communicating two-dimensional graphics and interconnection information that is used to describe the patterns for fabricating and manufacturing semiconductors and PCB/PWB’s. |
Electronic Manufacturing Services | Electronic Manufacturing Services (EMS) refers to the industry that provides contract design, manufacturing, and related product support services on behalf of electronics OEMs; the design and brand name belong to the OEM, and the EMS provider makes electronic products or subassemblies to be sold under the OEM brand name. Often referred to as “Contract Manufacturing” or “Contract Electronics Manufacturing”. |
Electronic Systems Design Automation | Electronic Systems Design Automation (ESDA) is a set of graphical front-end tools that allow designers to use pictures rather than words to describe and analyze their creations. These tools can use HDL’s as an interchange format rather than a design medium and allow for higher degrees of abstraction over traditional schematic capture or waveform display programs. |
Electronic Systems Level | Electronic Systems Level is a higher level abstraction for the design of electronic products than RTL (see Register Transfer Level) which will improve design productivity with the design of ever larger and more complex electronic systems. This is the third generation in design methodologies and tools (gate-level, register transfer level, and electronic systems level). Key elements of ESL include behavioral synthesis, integration between the behavioral level and the architecture level, and hardware/software codesign and coverification. |
EMI | 1. Early Manufacturing Involvement 2. Electro-Magnetic Interference |
Empathic Design | Empathic Design is based on observation — watching customers/consumers use products or services. But unlike focus groups, usability laboratories, and other contexts of traditional market research, this observation is conducted in the customer’s own environment in the course of normal, everyday routines. This approach enables the researcher to observe and develop information on customer needs that will drive design that is not accessible through other observation-oriented research methods. |
EMS | see Electronic Manufacturing Services |
Emulation | The process by which a device under development and its native software is prototyped before its manufacture. |
End-of-Life | End-of-Life (EOL) is the term applied to products or components that are being retired from the market because of technology obsolescence or rapidly declining demand. |
Engineering Change | A modification to a component, product configuration, or document from currently defined and approved status. Changes cause version or revision levels of affected items to be updated. |
Engineering Change Control | Engineering Change Control is the process and procedures that manage how changes are proposed, reviewed, and approved and incorporated into a product and its associated data items. Change control is a part of an overall configuration management methodology and uses review and release processes to enforce compliance with company change policies. |
Engineering Change Notice / Engineering Change Order | Engineering Change Notice (ECN) / Engineering Change Order (ECO) are formal documents notifying selected persons of proposed, pending, or accomplished changes. In a PDM/PIM-managed environment, ECNs may be distributed by electronic mail. |
Enhanced Quality Function Deployment | Enhanced Quality Function Deployment is a broader QFD framework that applies a system perspective recognizing the need to decompose more complex products into subsystems and assemblies with supporting deployment matrices and concept selection matrices. |
Enterprise Resource Planning | Enterprise Resource Planning (ERP) is an integrated set of computer applications used to plan and support execution of business functions in the manufacturing enterprise. ERP relates to product development in the following ways. ERP applications contain product structure data (bills of material) generated during development. Some ERP applications also provide some product data management functionality. Finally, ERP is the tool to help forecast new product demand and order and stock materials to support product launch. |
Environmental Stress Screening | Environmental Stress Screening (ESS) is a process which applies specific kinds of environmental stresses to products on an accelerated basis, but within their design parameters and limits to cause latent and intermittent flaws to become detectable failures. Also see Highly Accelerated Stress Screening (HASS). |
EPD | Electronic Product Definition |
EPL | Engineering Parts List |
EPLD | Erasable Programmable Logic Device |
EQFD | see Enhanced Quality Function Deployment |
Ergonomics | Ergonomics is the science of designing products and work to be consistent with the capabilities and limitations of the human body. |
ERP | see Enterprise Resource Planning |
ESDA | see Electronic Systems Design Automation |
ESL | see Electronic Systems Level |
ESS | see Environmental Stress Screening |
EST | Environmental Stress Testing |
ETC | Estimate to Complete |
Ethnographic Studies | A qualitative method of researching customer needs based on studying the anthropology or culture of the user. This method involves spending time in the field observing customers and their environment to better understand their lifestyle or culture as a basis for understanding their needs for a new product. A deep understanding of your customer can lead to fundamental insights that impact product design, feature sets, product positioning, marketing communications, advertising execution, etc. |
EWI | Electronic Work Instruction |
Expectancy Theory | The view that our effort will be greatest when we expect that we can perform the task at hand and that we expect to obtain rewards for our performance. |
Experience Curve | Experience Curve (also known as a learning curve) is a mathematical model that relates the cost per unit (or labor time per unit) to the cumulative number of units produced; the unit cost (or time) declines by a constant percentage each time cumulative volume doubles. |
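A minimal sketch of this model (Wright's formulation) follows, assuming a hypothetical first-unit cost and an 80% learning rate (cost falls to 80% of its prior value with each doubling of cumulative volume).

```python
# A minimal sketch of Wright's experience (learning) curve model.
# first_unit_cost and learning_rate are hypothetical assumptions.
import math

def unit_cost(n, first_unit_cost=100.0, learning_rate=0.80):
    """Cost of the nth unit; each doubling of cumulative volume scales cost by learning_rate."""
    b = math.log(learning_rate) / math.log(2)   # negative exponent
    return first_unit_cost * n ** b

for n in (1, 2, 4, 8, 100):
    print(f"unit {n:>3}: {unit_cost(n):6.2f}")
```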
EXPRESS | The information modeling language used to define the STEP standard (ISO 10303). |
Extranet | An internet-based network that provides controlled access to outside parties. Also see Intranet. |
Extreme Programming | Extreme Programming (XP) is one of the more popular lightweight, or agile development methods. In general, XP structures the “four basic activities of software development … coding, testing, listening, and designing.” XP structures coding based on the concepts of pair-programming and test-first development. XP structures the testing activity by requiring automated tests that the team runs every day, several times a day. XP structures the listening activity through pair-programming and by requiring that the customer be part of the team and be on-site. Lastly, XP structures the designing activity by encouraging developers to use test-first development: define a test, then code until the test passes, then proceed to the next test. There is no big-design-up-front stage in an XP project. |
Failure | A deficiency, defect, nonperformance or nonconformance with specified requirements. An item of equipment has suffered a failure when it is no longer capable of fulfilling one or more of its intended functions. Note that an item does not need to be completely unable to function to have suffered a failure. |
Failure Analysis | Failure Analysis is a collection of techniques to determine the root cause of a component or process defect or failure. |
Failure in Time | Failure in Time – a reliability measure usually expressed as failures per 10^9 (one billion) hours of operation. |
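For example, converting between FIT and MTBF is simple arithmetic; the values below are hypothetical.

```python
# A minimal sketch of converting between FIT (failures per 1e9 hours) and MTBF.
def fit_to_mtbf_hours(fit):
    return 1e9 / fit

def mtbf_hours_to_fit(mtbf_hours):
    return 1e9 / mtbf_hours

print(fit_to_mtbf_hours(500))      # 500 FIT  -> 2,000,000 hours MTBF
print(mtbf_hours_to_fit(100_000))  # 100,000 h MTBF -> 10,000 FIT
```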
Failure Mode | A particular way in which failures occur, independent of the reason for failure. |
Failure Modes and Effects Analysis | Failure Modes and Effects Analysis (FMEA) is a procedure in which each potential failure mode in every sub-item of an item is analyzed to determine its effect on other sub-items and on the required function of the item. It is used to identify potential failure modes and their associated causes/mechanisms, consider risks of these failure modes, and identify mitigating actions to reduce the probability or impact of the failure. |
Failure Modes, Effects and Criticality Analysis | Failure Modes, Effects and Criticality Analysis is a procedure that is performed after a failure mode and effects analysis to classify each potential failure effect according to its severity and probability of occurrence. |
Failure Reporting and Corrective Action System | Failure Reporting and Corrective Action System (FRACAS) is a closed-loop system to capture reports of failure from customers or service technicians in the field or from the factory, analyze these reports, detect trends or problems, and use this analysis to take corrective action in the design, component selection, supplier selection, manufacturing process, or operating manual of the product. Features of a FRACAS system include a database manager, tracking system for document controls, user definable reports which allow selection of data elements and sort options, and search functions. |
FAST | see Function Analysis System Technique |
FAT | Factory Acceptance Testing |
Fault Tree Analysis | Fault Tree Analysis is a top-down, hierarchical analysis of faults to identify the various fault mechanisms and their cause. It graphically describes the cause and effect relationships that result in major failures. The fault or major failure being analyzed is identified as the “top event.” All of the possible causes of the top event are identified in a tree using “or” nodes for independent causes and “and” nodes for multiple causes that must exist concurrently for a failure to occur. |
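As a sketch, assuming independent basic events, "or" nodes and "and" nodes can be evaluated numerically to estimate the top-event probability; the events, tree structure, and probabilities below are hypothetical.

```python
# A minimal sketch of evaluating a small fault tree, assuming independent basic events.
# The event names and probabilities are hypothetical.
import math

def gate_and(probs):          # all inputs must occur concurrently
    return math.prod(probs)

def gate_or(probs):           # any one input occurring causes the output event
    return 1.0 - math.prod(1.0 - p for p in probs)

p_pump_fails = 0.01
p_valve_stuck = 0.005
p_sensor_fails = 0.02
p_alarm_fails = 0.03

# "Loss of flow" occurs if the pump fails OR the valve sticks; the top event occurs
# only if loss of flow AND the protection (sensor or alarm failing) both occur.
p_loss_of_flow = gate_or([p_pump_fails, p_valve_stuck])
p_protection_fails = gate_or([p_sensor_fails, p_alarm_fails])
p_top_event = gate_and([p_loss_of_flow, p_protection_fails])
print(f"Top event probability ≈ {p_top_event:.6f}")
```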
FCA | see Functional Configuration Audit |
FCT | Fast Cycle Time |
FDA | Food and Drug Administration |
FDL | Fault Detection and Localization (proposed IEEE standard) |
FEA | see Finite Element Analysis |
Feasibility | Capable of being completed to meet goals. The feasibility of a proposed new product has two dimensions: a) Technical Feasibility (i.e., will the product work?) and b) Financial Feasibility (i.e., will the product make an adequate return on investment for the enterprise?) |
Feature | 1. Features are elements of the product that provide a distinctive benefit to the customer and are often highlighted in describing the benefits of the product to the customer. Features are the differentiating functionality of a product. This functionality may not be available in other products, or it may not be available with the same quality characteristics. 2. Features are geometric entities that have meaning in the definition and manufacture of a product. Examples of features are through-holes, bosses, bends, chamfers, slots, etc. |
Features Technology | Features Technology – a variation of group technology with a focus on coding and classifying based on part features. |
FEM | see Finite Element Model |
Field Replaceable Unit | A collection of hardware or software that is installed or removed from a product as a single serviceable entity. The composition of product FRU’s is determined by the integrated product team. |
Field Testing | Field Testing is testing a product in the actual context in which it will be used (its operating environment), as opposed to laboratory testing. |
Fillet | A manufacturing feature that blends two surfaces together. |
Finite Element Analysis | A computer-based method that breaks geometry into elements and links a series of equations to each, which are then solved simultaneously to evaluate the behavior of the entire system. Most often used for structural analysis, but widely applicable for other types of analysis and simulation, including thermal, fluid, and electromagnetic. |
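A minimal sketch of the idea for a one-dimensional bar in tension follows, using hypothetical dimensions, material properties, and load; commercial FEA tools automate meshing, assembly, and solution for far more general problems.

```python
# A minimal sketch of the finite element idea for a 1D bar in tension:
# break the bar into elements, assemble element stiffness into a global system, solve.
# Dimensions, material properties, and load are hypothetical.
import numpy as np

E, A, L, n_elem = 200e9, 1e-4, 1.0, 4              # steel-like bar, 4 equal elements
le = L / n_elem
k_e = (E * A / le) * np.array([[1, -1], [-1, 1]])  # element stiffness matrix

n_nodes = n_elem + 1
K = np.zeros((n_nodes, n_nodes))
for e in range(n_elem):                            # assemble global stiffness matrix
    K[e:e + 2, e:e + 2] += k_e

F = np.zeros(n_nodes)
F[-1] = 10_000.0                                   # 10 kN axial load at the free end

# Fix node 0 (displacement = 0) and solve K u = F for the remaining nodes.
u = np.zeros(n_nodes)
u[1:] = np.linalg.solve(K[1:, 1:], F[1:])
print("Nodal displacements [m]:", u)               # tip should match F*L/(E*A) = 5e-4 m
```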
Finite Element Model | Finite Element Model – the model that is created to be analyzed with the finite-element method, typically done graphically with geometry. |
Fit | The ability of an item to physically interface or interconnect with or become an integral part of another item. |
FIT | see Failure in Time |
Fixed Costs | A cost element that does not vary with changes in the volume of cost drivers or production volumes. The designation of a cost as fixed may vary depending upon the time horizon. In the long term, all costs are variable. |
Fixture | Tooling designed to locate and hold components in position. |
Flip Chip | Process for mounting an IC chip with metalization down and with no bonding wires. |
Floorplan | The high-level physical layout of blocks on a semiconductor device (either chips or boards). Floorplanning tools also typically provide estimations of timing delays. |
Floorplanners | Floorplanners are EDA software tools that provide an environment where issues such as timing, area, power dissipation, and routeability can be analyzed before a detailed physical layout of a design is completed. These tools provide interactive and/or automatic capability to accurately estimate interconnect resistance/capacitance (RC) and predict timing delays before the placement and routing of the design so that users can evaluate and choose the optimum floorplan. |
FMEA | see Failure Modes and Effects Analysis |
FMECA | see Failure Modes, Effects and Criticality Analysis |
FMS | Flexible Manufacturing System |
Focus Groups | Focus Groups are meetings with a group of customers, users, or potential users of a product to explore their needs and obtain their feedback on product ideas and concepts. These meetings are conducted by a facilitator using a prepared script but provide the flexibility to obtain open-ended participant input and feedback. These meetings are often recorded and observed by the company for further analysis. |
Forecasting | The work performed to estimate and predict future conditions and events. Of relevance to new product development is the need to forecast future sales of a proposed product to determine its financial feasibility. |
Form | The defined configuration of an item including the geometrically measured configuration, density, and weight or other visual parameters which uniquely characterize an item, component or assembly. For software, form denotes the language, language level and media. |
Formal Verification | The application of rigorous mathematical techniques to prove the functional equivalence of an electronic hardware design with its original specification. Because timing is not included in formal verification, it is used only to verify a design’s functional behavior. It is a collective term used for a number of different tools and methodologies. |
FPGA | Field Programmable Gate Array – a high density programmable logic device. |
FPT | Fine Pitch Technology (related to surface mount) |
FQR | Formal Qualification Review |
FRACAS | see Failure Reporting and Corrective Action System |
Framework | A software infrastructure that provides a common environment for communication and integration of design tools in a design process. |
Freeform Fabrication | Freeform Fabrication is the application of rapid prototyping (see Rapid Prototyping) and rapid tooling technologies to the direct manufacture of parts in an on-demand, low volume or a mass customization production environment. |
FRU | See Field Replaceable Unit |
FTA | see Fault Tree Analysis |
Full Scan | A design for testability methodology that provides complete access to an integrated circuit. EDA tools can insert scan registers automatically during logic synthesis. |
Function | An abstracted description of work that a product must perform to meet customer needs (for value analysis, sometimes stated in a noun-verb format, e.g., “transmit data”) |
Functional Requirements | Functional Requirements capture the intended behavior of the system or product – what the system will do. This behavior may be expressed as functions, tasks, or services the system or product is required to perform. Therefore, functional requirements do not include performance characteristics, operating conditions, use cases, and specifications. |
Functional Test | Functional Test is a test that identifies functional level faults in printed circuit board assemblies (PCBAs), including manufacturing related faults not identified by in-circuit tests (ICT), timing related failures, and faults internal to components. Functional test equipment operates at the same frequency the PCBA is designed for and may have the capability to margin temperature, voltage and frequency. |
Functional Worth | Functional Worth is the cost of the least expensive way to perform a given function. |
Function Analysis | Synonymous with value analysis and value engineering. A methodology of focusing on those functions that are valuable to customers and delivering them at the lowest possible cost. |
Function Analysis System Technique | Function Analysis System Technique is a value analysis or function analysis technique to describe a system or product as a series of logically related functions and associate those functions to costs. This technique identifies less important functions that may then be eliminated, thereby reducing costs. |
Functional Configuration Audit | An engineering audit of a configuration item (CI) or system to verify that the performance test results of the item are in accordance with the performance specification of the item. See Design Validation and Validation. |
Function Cost Analysis | Function Cost Analysis is an accounting allocation of cost and importance to product function. It is a tool used to support value engineering or value analysis to identify high cost functions to address. |
Function Point Analysis | A top down software development estimating technique which was developed by A.J. Albrecht. It entails breaking a project down into ‘Function Points’ which are classified by degrees of complexity. Factors are then applied from which time estimates may be developed. |
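An illustrative sketch of an unadjusted function point count follows; the complexity weights shown are commonly cited IFPUG values and the counts are hypothetical, so treat this as a template rather than the official procedure.

```python
# A minimal sketch of an unadjusted function point count.
# The weights are commonly cited IFPUG values; the project counts are hypothetical.
WEIGHTS = {  # (simple, average, complex)
    "external_inputs": (3, 4, 6),
    "external_outputs": (4, 5, 7),
    "external_inquiries": (3, 4, 6),
    "internal_logical_files": (7, 10, 15),
    "external_interface_files": (5, 7, 10),
}

# Hypothetical project: counts per type as (simple, average, complex).
counts = {
    "external_inputs": (6, 4, 2),
    "external_outputs": (3, 5, 1),
    "external_inquiries": (4, 2, 0),
    "internal_logical_files": (2, 3, 1),
    "external_interface_files": (1, 1, 0),
}

ufp = sum(c * w for t in WEIGHTS for c, w in zip(counts[t], WEIGHTS[t]))
print("Unadjusted function points:", ufp)
```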
Fuzzy Front End | This is the process for determining customer needs or market opportunities, generating ideas for new products, conducting necessary research on the needs, developing product concepts, and evaluating product concepts up to the point that a decision is made to proceed with development. This process is described as the fuzzy front end because it is the least defined and most unstructured part of product development. |
GA | see General Availability |
Gantt Chart | Gantt Chart is a diagram used in project management, where the x axis is time and the y axis shows tasks to be performed to complete the project. Each task is displayed as a horizontal bar spanning the time period during which it is expected to take place. Arrows may be drawn from one task to another to indicate dependencies (when one task can’t be begun until another is completed). The Gantt chart was developed by Henry Gantt in 1917. |
Gate | 1. A gate (or stage-gate) is a step where the merits and progress of the project are evaluated before further progress is allowed. A gate involves a review that often results in a “go/no go” decision for the project. 2. Another name for a logic cell (see Cell), which is a functional group of transistors having physical attributes that support a specific semiconductor process technology. |
Gate Count | A metric for the size of an ASIC design, usually expressed in terms of the equivalent number of basic 2-input NAND gates used. A gate count can be roughly converted to a transistor count by multiplying by a factor of four. |
Gatekeeper | Gatekeepers are the members of management that conduct the stage-gate or phase-gate reviews that are part of a stage-gate process in new product development. The gatekeepers are usually members of a formal group known as Product Committee or similar name that are charged with portfolio management and pipeline management. |
Gauge Repeatability and Reproducibility | Gauge Repeatability and Reproducibility (GR&R) is the evaluation of a gauging instrument’s accuracy by determining whether the measurements taken with it are repeatable and reproducible. Repeatability is the variation in measurement obtained with one measurement instrument when used several times by an appraiser while measuring the identical characteristic on the same part. Reproducibility is the variation in the averages of the measurements made by the different appraisers using the same measuring instrument when measuring the identical characteristic on the same part. |
GD&T | see Geometric Dimensioning and Tolerancing |
GenCAM | A product data exchange format standard for electronics manufacturing that characterizes, sorts and organizes data into intelligent schemes. This standard is represented in IPC-2511, Generic Requirements for Implementation of Product Manufacturing Description Data and Transfer Methodology. |
General Availability | The point in the product life cycle when production has been ramped-up to sufficient volumes and when product issues have been resolved so that the product is made available to all interested customers. |
Geometric Dimensioning and Tolerancing | Geometric Dimensioning and Tolerancing (GD&T) – ANSI-Y14.5 standard for showing the dimensioning and tolerancing on a drawing considering the functions or relationships of part features. GD&T depicts the geometric relationship of part features (instead of the Cartesian relationship), allowing the maximum tolerance which permits full function of the product. |
GIDEP | Government/Industry Data Exchange Program (for electronic components) |
GLP | Good Laboratory Practices |
GMP | Good Manufacturing Practices |
Go/No-Go Gauges | Go/No-Go Gauges are gauges that provide categorical data about whether one or more dimensions of a workpiece are within specification limits. |
Graceful Degradation | Graceful Degradation is the quality of a product, system or design such that when something goes wrong, it happens a little at a time and with plenty of opportunity to take action to correct the problem or at least protect against its worst consequences. |
GR&R | see Gauge Repeatability and Reproducibility |
Group Technology | Group Technology is a coding and classification system to identify similarities in part geometry, features, characteristics and processes. It is used to aid in design retrieval, part standardization, manufacturing cell design, and production scheduling. |
Groupthink | An undesirable condition in which all members of a group (e.g. a project team) begin to think alike or pretend to think alike. No members are then willing to raise objections or concerns about a project even though they are legitimate and based on hard data. |
Graphical User Interface | Graphical User Interface – an interface to a computer that uses icons to represent desktop objects, such as documents and programs, that the user can access and manipulate with a pointing device, such as a mouse. |
Groupware | Groupware is any type of software designed for groups and for communication, including email, video conferencing, workflow, chat, and collaborative editing systems. This technology may be used to communicate, cooperate, coordinate, solve problems, compete, or negotiate. While traditional technologies such as the telephone qualify as groupware, the term is ordinarily used to refer to a specific class of technologies relying on modern computer networks, such as email, newsgroups, videophones, or chat. Groupware technologies are typically categorized along two primary dimensions: a) whether users of the groupware are working together at the same time (“realtime” or “synchronous” groupware) or different times (“asynchronous” groupware), and b) whether users are working together in the same place (“collocated” or “face-to-face”) or in different places (“non-collocated” or “distance”). |
GUI | see Graphical User Interface |
HA | see Hazard Analysis |
HALT | see Highly Accelerated Life Test |
HASS | see Highly Accelerated Stress Screen |
Hardware Configuration Item | A hardware component of a system, which is designated for configuration management to insure the integrity of the delivered product. It may exist at any level in the system hierarchy, since configuration management must be imposed down to the lowest level where item interchangeability is required. Each HWCI is to have (as appropriate) individual design reviews, individual qualification/certification, individual acceptance reviews, and separate operator and maintenance manuals. |
Hardware Description Language | A language that describes the physical design, electronic behavior, logical structure, and system annotation information for circuits. An HDL allows a design to be described in a higher level of abstraction while supporting a logical synthesis path to gate-level implementation. |
Hardware Software Co-Design | The process and software tools that perform or support hardware/software partitioning, performance evaluation, and design entry for system-level designs that are comprised of both hardware and software elements, as in embedded systems. This includes tools and interfaces that link the design and evaluation steps with code compilation models. |
Hazard Analysis | Hazard Analysis is the detailed examination of a product from the user perspective to detect potential design flaws (possibilities of failure that could cause harm) and to enable manufacturers to correct them before a product is released for use. |
HDL | see Hardware Description Language |
Hidden Failure | A failure which, on its own, does not become evident to the operator or user under normal circumstances. |
Hierarchical Design | Hierarchical Design is a design methodology where portions of large designs are divided into manageable sections or sub-blocks that may be created, represented symbolically, designed, and then connected together when completed. This methodology allows different parts of the design to be worked on in parallel. |
Highly Accelerated Life Test | Highly Accelerated Life Test (HALT) is a process developed to uncover design defects and weaknesses in electronic and mechanical assemblies using a vibration system combined with rapid high and low temperature changes. The purpose of HALT is to optimize product reliability by identifying the functional and destructive limits of a product. HALT addresses reliability issues at an early stage in product development. |
Highly Accelerated Stress Screening | Highly Accelerated Stress Screening (HASS) is a technique for production screening that rapidly exposes process or production flaws in products. Its purpose is to expose a product to optimized production screens without affecting product reliability. Unlike HALT, HASS uses nondestructive stresses of extreme temperatures and temperature change rates with vibration. |
HPS | Harmonization of Product Data Standards – An organization sponsored by ANSI to oversee and coordinate the harmonization of electrical/electronic data standards. |
HTML | Hyper-Text Mark-up Language – the mark-up language used as the basis for the world-wide web. |
Human Factors | Human Factors refers to the characteristics of human beings that are applicable to the design of systems and devices of all kinds. It furthers serious consideration of knowledge about the assignment of appropriate functions for humans and machines, whether people serve as operators, maintainers, or users in the system. And, it advocates systematic use of such knowledge to achieve compatibility in the design of interactive systems of people, machines, and environments to ensure their effectiveness, safety, and ease of performance. |
Hurdle Rate | The minimum return on investment or internal rate of return percentage a new product must meet or exceed for it to be approved for investment with development. |
HWCI | see Hardware Configuration Item |
Ideation | Ideation is the idea generation phase or stage of new product development. These ideas can come from internal sources such as Marketing or Engineering as well as external sources such as customers, suppliers, retailers or consultants. |
IC | Integrated Circuit |
ICD | see Interface Control Drawing/Document |
ICT | see In-Circuit Test |
IDDQ | IDDQ testing detects certain kinds of manufacturing defects, such as a short circuit that makes a device draw excessive current. It measures device current when a specific set of test vectors is applied, and devices over a threshold current level can be rejected during test. |
IDEF | Integrated Definition Language (formerly ICAM Definition Methodology) |
IDEF0 | IDEF Functional Modeling |
IDEF1 | IDEF Information Modeling |
IDEF 1X | IDEF Semantic Modeling |
IDOV | IDOV is a methodology for designing products and services to meet six sigma standards. It stands for a four-phase process that consists of Identify, Design, Optimize and Verify. |
IEC | International Electrotechnical Commission |
IEEE | Institute of Electrical and Electronic Engineers |
IETM | Interactive Electronic Technical Manual |
IGES | see Initial Graphics Exchange Specification |
IMP | Integrated Master Plan |
IMS | Integrated Master Schedule |
In-Circuit Test | In-Circuit Test (ICT) is a combination of hardware and software that identifies manufacturing induced faults of printed circuit board assemblies (PCBAs) by isolating and individually testing devices using a bed-of-nails fixture. Potential faults include shorts, opens, wrong components, missing components, etc. |
Inclusive Design | Inclusive design is a design approach whereby designers insure that their products and services address the needs of the widest possible audience. Inclusive design is intended to address groups such as the aged or disabled for whom many products would otherwise not be suitable. The seven principles of inclusive design are (1) equitable use, (2) flexibility in use, (3) simple and intuitive use, (4) perceptible information, (5) tolerance for error, (6) low physical effort, and (7) size and space for approach and use. |
INCOSE | International Council on Systems Engineering |
Incremental Development | A hardware/software development process that produces a partial implementation and then gradually adds preplanned functionality or performance in subsequent increments. This contrasts with the Waterfall Model where all functionality is delivered at one time at the conclusion of the project. Similar to the Spiral Development Model. |
Indirect Costs | Costs that are incurred for common or joint objectives that can not be readily identified to a single cost objective and readily treated as a direct cost. A cost that is allocated as opposed to being traced. |
Industrial Design | Industrial Design is the design that is done in companies and consultancies by people trained in industrial design, or in art and design schools in general. Industrial design focuses on the physical form and interactive properties as opposed to the functioning of the product or system. |
Infant Mortality | The high conditional probability of failure due to manufacturing defects, design, installation, or startup procedures during the period immediately after an item enters service. |
Initial Graphics Exchange Specification | Initial Graphics Exchange Specification (IGES) is a neutral file format used to exchange vector and text data among CAD systems. It was the first widely-used neutral file format used for mechanical CAD data interchange. |
Instance | 1. As used in product design, an instance is a reference to a geometric object that allows the same geometry to be located at several places in a geometric model assembly without actually copying the geometry. When the original geometry is modified, the modifications automatically appear at every instance location. 2. With product structures, an instance is a reference to a part. It allows the same part to be used in several assemblies without copying all part information into the assembly. |
Integral Architecture | An integral architecture (as opposed to a modular architecture) is a product architecture where 1) the functions of a product are performed using more than one physical building block (e.g., subsystems or subassemblies), 2) a single building block performs multiple functions, 3) the functions are closely coupled and are tightly synchronized, 4) the functions are in close physical proximity, and 5) the interactions between building blocks or interfaces are greater in number and are less well-defined or standardized. |
Integrated Product and Process Development | Integrated Product and Process Development is synonymous with concurrent engineering (CE), concurrent product development (CPD), integrated product development (IPD), etc. See Integrated Product Development for definition. |
Integrated Product Development | Integrated Product Development (synonymous with concurrent engineering (CE), concurrent product development (CPD), integrated product and process development (IPPD), etc.) – a philosophy that systematically employs a teaming of functional disciplines to integrate and concurrently apply all necessary processes to produce an effective and efficient product that satisfies the customer’s needs. |
Integrated Product Team | A cross-functional team consisting of representatives from marketing, engineering, manufacturing, finance, purchasing, test, quality, and any other required disciplines with responsibility for developing a product or product subsystem. This team is empowered to represent the functional disciplines and develop a product by addressing its life cycle requirements, including production and support. |
Integration Testing | Integration Testing is conducted to validate that two or more subsystems or components are properly working together. Integration testing usually follows or is conducted in parallel with subsystem or unit testing. |
Intellectual Property | Intellectual Property – Proprietary knowledge, design information, or other intangible information or representations that have value to an organization or individual. In electronic system design, this is design information (e.g., cells, cores, etc.) packaged for re-use whose ownership must be addressed before it can be used. |
Interface | The functional and physical characteristics required to exist at a common boundary between components (hardware or software), assemblies or subsystems. |
Interface Control | In configuration management, the process of: a) identifying all functional and physical characteristics relevant to the interfacing of two or more configuration items provided by one or more organizations, and b) ensuring the proposed changes to these characteristics are evaluated and approved prior to implementation. |
Interface Control Drawing/Document/ Specification | A drawing, document or specification that is used to define and control the physical and functional interface between two or more subsystems within an overall system. These subsystems are typically designed and developed by different parties. |
Interference Checking | The process of identifying if and where two or more geometric objects intersect in either a static or dynamic state. |
Internal Rate of Return | Internal Rate of Return (IRR) is the rate of return at which the present value of benefits equals the present value of costs/investments. |
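A minimal sketch that finds the IRR by bisection on the NPV function follows, assuming hypothetical cash flows with a single sign change.

```python
# A minimal sketch of finding the internal rate of return by bisection:
# the IRR is the discount rate at which the net present value is zero.
# The cash flows are hypothetical.
def npv(rate, cash_flows):
    return sum(cf / (1.0 + rate) ** t for t, cf in enumerate(cash_flows))

def irr(cash_flows, lo=0.0, hi=1.0, tol=1e-6):
    # Assumes NPV changes sign between `lo` and `hi` (one sign change in the flows).
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if npv(mid, cash_flows) > 0:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

flows = [-500_000, 120_000, 220_000, 260_000, 240_000]
print(f"IRR ≈ {irr(flows):.2%}")
```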
Internationalization | Internationalization is the design or modification of a product’s hardware and software for an international audience. Three approaches to internationalization are common. First, globalization involves using international standards, universal power supplies, use of international symbols, and making a monolingual interface more accessible for non-native speakers and international use. Second, a multi-lingual approach allows users to choose a translation of instructions, manuals, and interfaces in their own language. Third, localization involves customization of the product, instructions, and the user interface for each local region in which it will be used, by using the local language and taking advantage of local conventions, standards, assumptions, and common defaults. |
Interoperability | The ability of systems, units, or forces to provide services to, and accept services from, other systems, units, or forces and to use the services so exchanged to enable them to operate effectively together. |
Intranet | A private network within an organization that uses the same technologies as the Internet (TCP/IP, HTTP, HTML and browsers) to provide the capabilities to create, publish and access information for that organization’s users. |
IP | see Intellectual Property |
IPC | Institute for Interconnecting and Packaging Electronic Circuits |
IPD | see Integrated Product Development |
IPO | IGES/PDES Organization |
IPPD | Integrated Product and Process Development is synonymous with concurrent engineering (CE), concurrent product development (CPD), integrated product development (IPD), etc. See Integrated Product Development for definition. |
IPT | See Integrated Product Team |
IRS | Interface Requirements Specification |
ISO | International Standards Organization is a specialized international agency for standardization composed of the national standards bodies of 91 countries. |
ISO 9000 | ISO 9000 is a set of international standards on quality management and quality assurance developed to help companies effectively document the quality system elements to be implemented to maintain an efficient quality system. The standards, initially published in 1987, are not specific to any particular industry, product or service. The standards underwent major revision in 2000 and now include ISO 9000:2000 (definitions), ISO 9001:2000 (requirements) and ISO 9004:2000 (continuous improvement). |
ISO 10303 | An ISO technical standard for product data representation and exchange commonly referred to as STEP or the Standard for the Exchange of Product Model Data. |
ISO/TS 16949:2002 | An ISO technical standard titled “Quality management system – particular standards for the application of ISO 9001:2000 for automotive production and relevant service part organizations”. This standard replaces QS-9000 and harmonizes requirements for automotive manufacturers internationally. |
ITC | InterTool Communications – part of the CFI standards that enable applications to communicate events and data to each other at run time. ITC is the basis for achieving operations such as cross highlighting logic in both the front end schematic capture and board layout tools. |
JAD | see Joint Application Development |
Java | A programming language developed by Sun Microsystems that can be run as a virtual machine on many computer platforms. Many applications such as product data management (PDM) and enterprise resource planning (ERP) are being re-architected to run certain processes using Java to make them widely available regardless of a user’s platform. |
JDM | see Joint Development Model |
JEDEC | Joint Electron Device Engineering Council |
Jig | A Jig is a device that holds the workpiece securely in the correct positions and has the capability of guiding the tool during a manufacturing operation. |
JIT | Just-in-Time Production (See Lean Manufacturing) |
Joint Application Development | Joint Application Development (JAD) was developed at IBM Canada in the 1970s. Joint Application Development/Design is a group session approach that stresses communication among a multi-disciplinary group brought together for the express purpose of generating system requirements and preliminary design. |
Joint Development Model | A model of partnership with an external manufacturer to jointly design a product that will be produced by that manufacturer. The responsibilities for development and the ownership of the intellectual property are negotiable. |
JTAG | Joint Test Action Group – the informal name for IEEE/ANSI Standard 1149.1-1990 which is a set of design rules for testing at the IC level. |
JUSE | Japanese Union of Scientists and Engineers |
Kaizen | A Japanese term describing a process or philosophy of continuous, incremental improvement. |
Kano Model | The Kano Model, developed by Dr. Noriaki Kano, refined the notion of quality along two dimensions, in contrast to the traditional one-dimensional “good-bad” or “ok-not ok” view: 1) the degree to which a product or service performs, and 2) the degree to which the user is satisfied. Correlating quality on these two axes leads to three distinct definitions of quality: Basic Quality, Performance Quality and Excitement Quality. |
KBE | see Knowledge-Based Engineering |
KGD | Known Good Die (integrated circuits) |
Kinematic Analysis | Kinematic Analysis is the analysis of motion without regard to the forces that cause it. (Also see Dynamic Analysis.) Kinematic simulations show the physical positions of all the parts in an assembly with respect to time as the assembly goes through a cycle. |
KLOC | 1000 Lines of Code |
Knowledge-Based Engineering | Knowledge-Based Engineering is a set of design automation tools that capture design knowledge and rules to automate the design process. |
Knowledge Management | The overall management process to capture, organize, manage and disseminate knowledge in an organization to improve enterprise effectiveness by avoiding mistakes and the time required to relearn needed knowledge. Since product development is very knowledge intensive, knowledge management offers tremendous leverage and opportunity for improvement. |
KPIV | Key Process Input Variable (Six Sigma term) |
KPOV | Key Process Output Variable (Six Sigma term) |
KSLOC | 1000 Source Lines of Code |
Laboratory | 1. A Laboratory is a test facility that may include chemical, metallurgical, dimensional, physical, electrical, reliability testing or test validation. 2. A Laboratory is a research facility that supports development and testing under controlled conditions. |
LAN | Local Area Network |
Layout | 1. For ICs, the process of floorplanning, implementing, and verifying the location of transistors and their connections within a chip design. 2. For PCBs, the process of entering, placing, routing, and verifying the location of physical components and their connections within a board design. |
LCA | see Life Cycle Analysis |
LCC | 1. see Life Cycle Cost 2. Leaded Chip Carrier – a square chip carrier with pins on all four sides. |
LCL | Lower Control Limit is the lower limit used within statistical process control that defines the boundary of common cause variation. When a parameter value falls below the lower control limit, it flags the occurrence of special causes contributing to variation. |
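A simplified sketch of computing control limits for an individuals chart follows, estimating sigma from the average moving range (d2 = 1.128 for subgroups of two); the measurement data are hypothetical.

```python
# A simplified sketch of control limits for an individuals (X) chart:
# limits are set at the mean ± 3 sigma, with sigma estimated from the
# average moving range (d2 = 1.128 for subgroups of size 2).
# The measurement data are hypothetical.
data = [10.2, 9.8, 10.1, 10.4, 9.9, 10.0, 10.3, 9.7, 10.1, 10.2]

mean = sum(data) / len(data)
moving_ranges = [abs(b - a) for a, b in zip(data, data[1:])]
sigma_hat = (sum(moving_ranges) / len(moving_ranges)) / 1.128

ucl = mean + 3 * sigma_hat
lcl = mean - 3 * sigma_hat
print(f"Center line {mean:.2f}, UCL {ucl:.2f}, LCL {lcl:.2f}")

out_of_control = [x for x in data if x < lcl or x > ucl]
print("Points signaling special-cause variation:", out_of_control or "none")
```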
Lead Customers / Users | Lead customers or users are those customers or users who are the most advanced users of the product, customers who are pushing the product to its limits, or customers who are adapting an existing product(s) to new uses. |
Lean Manufacturing | Lean Manufacturing is an operations philosophy that aims to synchronize production with demand, thereby minimizing inventory and cycle time. Lean Manufacturing is supported during product development with approaches such as robust design, mistake-proofing and standardization. |
Lean Product Development | Lean Product Development is based on the application of the lean thinking principles to developing new products. This starts with defining what is of value to the customer, eliminating waste in the design of a new product by actions to achieve its target cost and making the product manufacturable. It also focuses on eliminating waste in the development process and making the value-creating steps flow with techniques such as pipeline management and pull scheduling. Finally, Lean Product Development requires organizing the right resources on the development team and empowering the team. The final step is to focus on learning, amplify learning across the organization, and continuously improving. |
Lessons Learned | Lessons Learned refers to specific lessons that are experienced, learned, and captured or knowledge that is gained during the execution of a project or activity. Lessons learned are captured and documented so that others in the organization can learn from them, use them to improve their performance on a project, and avoid repeating mistakes that had negative consequences. |
Level of Detail | Level of Detail – the ability to vary the amount of detail displayed in a graphics image to improve performance. For instance, at a distance, models can appear as simple 3D figures, but as users zoom in, a more detailed representation is presented. |
Life Cycle Analysis | Life Cycle Analysis is a method to assist with the quantification and evaluation of environmental burdens and impacts associated with product systems and activities, from the extraction of raw materials in the earth to end-of-life disposal. LCA is increasingly used by industries, governments and environmental groups to assist with decision making for environment-related strategies and materials selection. |
Life Cycle Cost | Life Cycle Cost is the total cost to the customer of acquiring, operating, and disposing of a product/ system over its full life. These costs include development, acquisition, installation, training, operation, support, and disposal. |
LOC | Lines of Code (software) |
Localization | Localization involves customization of the product, instructions, and the user interface for each local region in which it will be used, by using the local language and taking advantage of local conventions, standards, assumptions, and common defaults. See Internationalization. |
LOD | see Level of Detail |
LPGA | Laser Programmable Gate Array |
LRIP | Low Rate Initial Production |
LRU | 1. Line Replaceable Unit (see Field Replaceable Unit) 2. Lowest Replaceable Unit (see Field Replaceable Unit) |
LSA | Logistics Support Analysis |
LSAR | Logistics Support Analysis Record |
LSL | Lower Specification Limit is the lower limit for a parameter value in order to meet specifications. |
Manufacturability | The characteristic of a product’s design that facilitates the fabrication of the product’s components and their assembly into the overall product. |
Maintainability | A characteristic of design and installation which inherently provides for an item to be retained in, or restored to a specified condition within a given period of time, when the maintenance is performed in accordance with prescribed procedures and resources. In other words, it is the ease and speed with which any maintenance activity can be carried out on an item of equipment. Maintainability may be measured by Mean Time to Repair. Synonymous with Serviceability and Supportability. |
Mass Customization | Mass Customization is the evolutionary step beyond mass production in manufacturing whereby products can be customized to the needs of individual customers while achieving most of the economies of mass production. It is based on principles such as product line rationalization, standardization of components, modular design, postponement of customization to late in the production cycle, multi-function assemblies, use of software to customize product operation, etc. |
MBOM | Manufacturing Bill of Material |
MCAD | Mechanical Computer-Aided Design |
MCAE | Mechanical Computer-Aided Engineering |
MCM | Multi-Chip Module – a type of hybrid integrated circuit in which multiple bare chips are mounted and interconnected on a substrate, base material or laminate. |
MDA | Manufacturing Defect Analyzer (automated test equipment) |
ME | 1. Mechanical Engineer 2. Manufacturing Engineer |
Mean Time Between Failures | Mean Time Between Failures (MTBF) is a measure of the reliability of a product or piece of equipment. It is equal to the total equipment uptime in a given period divided by the number of failures in that period. It represents the average time between failures for a repairable product for a defined unit of measure (e.g., operating hours, cycles, miles, etc.). |
Mean Time to Repair | Mean Time to Repair (MTTR) is a measure of maintainability of a product or piece of equipment. It is equal to the total or the estimated downtime of the product or equipment in a given period divided by the number of failures or the number of repairs performed in that period. |
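To illustrate the two measures side by side, here is a minimal Python sketch; the uptime, downtime, and failure counts are hypothetical figures chosen only for the example.

    # Sketch: MTBF and MTTR from period totals (hypothetical data).
    def mtbf(total_uptime_hours, failures):
        # Average operating time between failures for a repairable item.
        return total_uptime_hours / failures

    def mttr(total_downtime_hours, repairs):
        # Average time needed to restore the item after a failure.
        return total_downtime_hours / repairs

    print(mtbf(4500.0, 3))   # 1500.0 hours between failures
    print(mttr(6.0, 3))      # 2.0 hours per repair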
MEMS | Micro-Electronic Mechanical Systems |
MET | Materials, Energy & Toxicity. A metric to measure the environmental impact of a product. This metric was developed by TNO, a research organization in The Netherlands. |
Milestone | Milestone – an important event representing the completion of a major work task or group of work tasks. Milestones are usually scheduled and can be used to measure progress. Reviews are often conducted upon the completion of a milestone. |
MIL-SPEC | Military Specification |
MIL-STD | Military Standard |
Mistake-Proofing | Mistake-Proofing – improving product designs, tooling designs, or processes to prevent mistakes from being made or to quickly and easily detect or mitigate the effect of a mistake. Mistake proofing involves six principles: elimination, replacement, prevention, facilitation, detection, and mitigation (see mistake-proofing for examples). Synonymous with error-proofing and poka-yoke. |
MMIC | Microwave Monolithic Integrated Circuit – device in which active elements such as transistors and diodes are combined with passive elements such as resistors, capacitors, inductors and transmission lines on a single GaAs substrate. These circuits replace conventional chips and wire in microwave circuits and are used as amplifiers, attenuators or switches at microwave/millimeter wave frequencies with the benefits of reduced size, lower costs and improved reliability. |
Modular Architecture | A modular architecture (as opposed to an integral architecture) is a product architecture where 1) the physical building blocks (e.g., subsystems or subassemblies) perform one or a small number of functions in their entirety, 2) the interactions between the building blocks or interfaces are minimal, well-defined, and generally fundamental to the primary functions of the product, and 3) the building block elements are discrete, interchangeable and individually upgradeable. |
Modular Design | Modular Design consists of combining standardized building blocks or “modules” in a variety of ways to create unique finished products. Thus, even though the parts and assemblies may be standardized, the finished product is unique. |
Morphological Analysis | Morphological analysis is used to identify the necessary product functionality and explore alternative means and combinations of achieving that functionality. For each element of product function, there may be a number of possible solutions. The morphological chart is prepared and used to develop alternative combinations of means to perform functions and each feasible combination represents a potential solution. |
MOU | Memorandum of Understanding |
MRB | Material Review Board – a group that meets periodically within a company to review non-conforming materials and products to determine their disposition and use. |
MTBF | see Mean Time Between Failures |
MTM | Methods-Time-Motion – a methodology for developing time standards. |
MTTR | see Mean Time to Repair |
Multi-Physics Analysis | Multi-Physics Analysis involves the use of computer-aided engineering tools to model multiple physical effects of a design such as structural, thermal, fluid/air flow, acoustic, electromagnetic, etc. This type of analysis allows consideration of the interaction of multiple effects with one another such as fluid flow and thermal. |
NC | Numerical Control |
NCGA | National Computer Graphics Association |
NDE | Non-Destructive Evaluation |
NDI | 1. Non-Developmental Item 2. Non-Destructive Inspection. See Non-Destructive Inspection / Test. |
NDT | Non-Destructive Test. See Non-Destructive Inspection / Test. |
Non-Recurring Cost, Non-Recurring Expense | Non-recurring costs are the one-time costs of researching, developing and testing a new product; the production start-up costs of tooling and equipment; and the product launch costs. Alternatively, these costs may be referred to as Non-Recurring Engineering. |
Net Present Value | Net Present Value is a financial analysis technique that discounts a series of cash inflows (revenue) and outflows (investments and expenses) to determine the suitability of an investment. NPV is an evaluation technique used to screen new product development projects as part of portfolio management. |
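As a minimal sketch of the discounting calculation (the discount rate and cash flows below are hypothetical), NPV can be computed in Python as:

    # Sketch: net present value of a series of cash flows (hypothetical figures).
    def npv(rate, cash_flows):
        # cash_flows[0] is the initial outlay at time 0 (usually negative);
        # later entries are net inflows at the end of each period.
        return sum(cf / (1.0 + rate) ** t for t, cf in enumerate(cash_flows))

    # A development investment followed by five years of net revenue; a
    # positive result favors the project in a portfolio screen.
    print(npv(0.10, [-1_000_000, 250_000, 300_000, 350_000, 350_000, 300_000]))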
New Product Development | New Product Development is the business process for developing new hardware, software and service products for the enterprise. It includes all activities from development of the idea or concept for the product, the development of the product and its processes, and the launch of the product into production and into the market place. |
New Product Introduction | 1. New Product Introduction refers to the set of activities that occur once a product has been developed and is ready to be introduced into the marketplace. See Product Launch. 2. New Product Introduction is used by some organizations synonymously with new product development. |
NGT | see Nominal Group Technique |
NIH | “Not Invented Here” |
NIPDE | National Initiative for Product Data Exchange |
NIST | National Institute of Standards and Technology |
Nominal Group Technique | Nominal Group Technique, similar to brainstorming, is used by teams to generate ideas on a particular subject. Team members are asked to silently write down as many ideas as possible. Each member is then asked in turn to share one idea which is recorded. After all ideas are recorded, they are discussed and prioritized by the group. |
Nonconformance | A Nonconformance is product or material which does not conform to the customer requirements or specifications. |
Non-Destructive Inspection / Test | The inspection or test of the product or part that retains the product’s or part’s physical and operational integrity. |
Non-Value-Added Activity | An activity that is considered not to contribute to customer value or the organization’s needs. Designation as non-value added reflects the belief that the activity can be eliminated, redesigned, or reduced without reducing the quantity, quality or responsiveness of the output to the customer or the organization. |
NPDP | New Product Development Professional – PDMA certification of expertise in new product development processes and practices. |
NPI | see New Product Introduction |
NPIT | New Product Introduction Team(s) |
NPV | see Net Present Value |
NRE | 1. Non-Recurring Expense 2. Non-Recurring Engineering |
NURBS | Non-Uniform Rational B-Spline – a method of representing curves and surfaces in CAD systems using B-splines and algorithms to represent any complex curve or surface as a single equation by breaking them up into many pieces. |
OA | see Orthogonal Array |
Object | The term Object is used to mean a collection of attributes that represent either a physical or logical artifact. For example, an Object can represent all the information required for an item or a drawing. The key feature of an Object is that it represents data that can be manipulated as a group, so copying an item Object copies all the attributes associated with the object in one action. Objects are specialized into Classes; an object representing a particular type of bolt, for example, could be in the Class of Objects called bolts. Objects have ‘methods’ – ways in which the object can be accessed, modified, displayed, etc. Another feature of Objects is inheritance: one class can be based upon another, and the new class is called a subclass. The new class inherits all the attributes of the other class. For example, a class called bolt could have the attributes length, thread type and pitch. A subclass created from it, such as brass bolt, would inherit length, thread type and pitch from the bolt class and add a new attribute for its material, brass. |
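The bolt example in the definition can be sketched in Python roughly as follows (the attribute and class names are illustrative only):

    # Sketch of the bolt/brass-bolt example: a class, a method, and inheritance.
    class Bolt:
        def __init__(self, length, thread_type, pitch):
            self.length = length            # attributes manipulated as a group
            self.thread_type = thread_type
            self.pitch = pitch

        def display(self):                  # a 'method' for accessing the object
            return f"{self.length} mm bolt, {self.thread_type}, pitch {self.pitch}"

    class BrassBolt(Bolt):                  # subclass: inherits length, thread type, pitch
        material = "brass"                  # and adds its own attribute

    m6 = BrassBolt(length=30, thread_type="M6 coarse", pitch=1.0)
    print(m6.display(), "-", m6.material)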
ODM | see Original Design Manufacturer |
OEM | see Original Equipment Manufacturer |
OODBMS | Object-Oriented Data Base Management System |
OOP | Object-Oriented Programming |
Open Innovation | 1. Open innovation is the concept of looking beyond one’s own organization for innovations, technology and intellectual property as a basis for products, services, and processes. It recognizes that in a world of widely distributed knowledge, companies cannot afford to rely entirely on their own research, but should instead buy or license processes or inventions (i.e. patents) from other companies. In addition, internal inventions not being used in a firm’s business should be taken outside the company (e.g., through licensing, joint ventures, spin-offs). In contrast, closed innovation refers to processes that limit the use of internal knowledge within a company and make little or no use of external knowledge. 2. “Open innovation is the use of purposive inflows and outflows of knowledge to accelerate internal innovation, and expand the markets for external use of innovation, respectively. [This paradigm] assumes that firms can and should use external ideas as well as internal ideas, and internal and external paths to market, as they look to advance their technology.” (Chesbrough) |
Opportunity Cost | The economic value of the benefit that is sacrificed when an alternative course of action is taken. |
Original Design Manufacturer | Original Design Manufacturer – An external manufacturer who assumes responsibility for the design, development and manufacture of a company’s products. While the OEM defines requirements, may define elements of the architecture, and owns the intellectual property, the design and manufacture are done by the ODM. |
Original Equipment Manufacturer | The manufacturer whose name goes on a product and who markets and supports the product. In the past, this was the organization that had the highest level of manufacturing, test, integration and/or distribution responsibility in the supply chain. Increasingly, one or more of these activities are being outsourced to other manufacturers. |
ORT | On-going Reliability Testing |
Orthogonal Array | Orthogonal Array – an array to represent a fractional factorial experimental design in design of experiments. |
O&S | Operation and Support |
OVI | Open Verilog International |
P3I | Pre-Planned Product Improvement |
PAC | Pad Array Carrier (surface mount technology) |
PAL | Programmable Array Logic (see PLD or PLA) |
Parametric | A capability of 2D and 3D modeling systems in which the user defines dimensions and constraints to which the model must conform. Alterations are then automatically reflected in related areas. |
Parametric Cost Estimating | A cost estimating methodology using statistical relationships between historical costs and project and product parameters gathered from similar, but different projects. This methodology typically uses parameters such as weight, power, lines-of-code, or other characteristics of the product or system to estimate or to scale the development cost, product cost and/or schedule. System complexity and team maturity are also influencing factors. |
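A minimal sketch of a weight-based cost estimating relationship (CER) of the assumed form cost = a * weight^b, fitted in log-log space to hypothetical historical data:

    # Sketch: fit cost = a * weight**b to past projects, then scale to a new design.
    import math

    history = [(120, 48_000), (200, 71_000), (340, 105_000), (500, 140_000)]  # (kg, $)
    xs = [math.log(w) for w, _ in history]
    ys = [math.log(c) for _, c in history]
    n = len(history)
    sxy = sum(x * y for x, y in zip(xs, ys))
    sxx = sum(x * x for x in xs)
    b = (n * sxy - sum(xs) * sum(ys)) / (n * sxx - sum(xs) ** 2)  # least-squares slope
    a = math.exp((sum(ys) - b * sum(xs)) / n)                     # intercept back in $ terms
    print(f"Estimated cost of a 420 kg design: ${a * 420 ** b:,.0f}")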
Pareto Analysis/ Diagram | An analysis/diagramming technique using frequency of occurrence to identify and display results generated by each identified cause. This analysis is commonly used to decide where to apply initial effort for maximum effect. See Pareto Principle. |
Pareto Principle | The Pareto principle suggests that 20% of a set of independent variables is responsible for 80% of the result. In quantitative terms, for example, 80% of the problems come from 20% of the causes (machines, raw materials, operators etc.). Therefore, effort aimed at the right 20% can solve 80% of the problems. |
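A minimal sketch of a Pareto analysis on hypothetical defect counts, ranking causes by frequency and accumulating their share of the total:

    # Sketch: rank causes by frequency and show cumulative percentage (hypothetical counts).
    causes = {"solder bridging": 112, "missing part": 54, "wrong polarity": 21,
              "damaged lead": 9, "label error": 4}
    total = sum(causes.values())
    cumulative = 0
    for cause, count in sorted(causes.items(), key=lambda kv: kv[1], reverse=True):
        cumulative += count
        print(f"{cause:16s} {count:4d}  {100 * cumulative / total:5.1f}% cumulative")
    # The top one or two causes typically account for most of the total,
    # showing where initial improvement effort should be applied.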
Part Classification | Classification of parts or other elements of a product by their geometry, material, function and/or the processes used to manufacture them (see Group Technology). Part classification is used to find components or subassemblies to use in a product design and to aid in standardization efforts. |
Part Model | A data model that contains the complete geometric and functional representation of a part and its characteristics. A comprehensive part model would also contain related analysis, configuration, manufacturing and support data. |
Participatory Design | Participatory Design refers to a democratic approach to design that encourages participation in the design process by a wide variety of stakeholders, such as: designers, developers, management, users, customers, salespeople, distributors, etc. The approach stresses making users not simply the subjects of user testing, but actually empowering them to be a part of the design and decision-making process. This is accomplished through direct involvement with the product development team on major projects for one or a small number of customers, or through frequent customer or user review and feedback during the development process using mechanisms such as focus groups, web-based customer participation, usability studies, etc. |
PART-LIB | Parts Library (ISO 13584) An international standard that will offer the capability for computer-sensible representation and exchange of part library data. |
PCA | 1. see Physical Configuration Audit 2. Printed Circuit Assembly |
PCB | see Printed Circuit Board |
PDCA | Plan, Do, Check, Act – a four step process for quality improvement, sometimes referred to as the Deming cycle. |
PDM | see Product Data Management |
PDMA | Product Development Management Association |
PDR | Preliminary Design Review |
PDT | see Product Development Team |
Peer Review | The review of work products during their development that is performed by peers to identify defects for removal. |
PEP | Production Engineering and Planning |
Perceptual Map | A Perceptual Map is a visual method for comparing customer perceptions of different products considering two different characteristics of those products. It is used to show relationships between marketplace competitors and the criteria used by buyers in making purchase decisions and recommendations. Perceptual maps may be used for market segmentation, concept development and evaluation, and tracking changes in marketplace perceptions. |
PERT | see Program Evaluation and Review Technique |
PFMEA | see Process Failure Modes and Effects Analysis |
PGA | Pin Grid Array – a chip housing with a high density of pins that is used for large amounts of I/O. |
Physical Configuration Audit | An engineering inspection of a configuration item (CI) to verify that the item “as-built” conforms to the “as-designed” documentation. |
Physics of Failure | Analysis to determine the physical causes for the failure of electronics components or assemblies. |
Pilot Production | The initial limited-quantity production of the production-ready version of the product design used to confirm readiness for large quantity production. |
PIM | Product Information Management. See Product Data Management |
PIP | Product Improvement Program |
Pipeline Management | Pipeline management is the process of managing new development projects that are currently in the pipeline (both proposed and approved). This addresses the management of capacity and resources to undertake the selected projects and the coordination of cross-functional resources to optimize throughput. |
Pitch | The centerline spacing from one electronic device pin to another. |
PLA | Programmable Logic Array |
Plated-Thru-Hole | Plated-Thru-Hole is a method of obtaining electrical connection between components and substrate (printed circuit board) by soldering component leads (or pins) inserted in plated through-holes. |
PLC | see Product Life Cycle |
PLD | Programmable Logic Device |
PLM | see Product Lifecycle Management |
PM | Program / Project Manager |
PMI | Project Management Institute |
PMT | Program Management Team |
POF | see Physics of Failure |
Poka-Yoke | Japanese term for mistake-proofing of product, tooling and/or process by design. See Mistake-Proofing. |
Portal | A web site that provides a comprehensive set of services and information for a particular audience such as product development personnel. The goal of such a site is to be a one-stop web resource for its target audience. |
Portfolio Management | The process of managing new product ideas, proposed projects and current projects under development as a portfolio to 1.) maximize the value of the portfolio, 2.) keep it in balance, and 3.) align it with company strategy. By characterizing and reviewing the projects in a company’s portfolio as a whole, a big picture is presented and used to prioritize and select projects. |
Postponement | Postponement, also known as “delayed differentiation,” is a supply chain strategy that delays product differentiation until a point closer to the customer. This involves designing and developing standard or generic configurable products that can be customized quickly and inexpensively once actual consumer demand is known. Postponement is a key element of design for leadtime and design for the supply chain. |
PPL | Preliminary Parts List |
PPM | Parts per Million (defects) |
PRICE | Programmed Review of Information for Costing and Evaluation – software program for cost estimating. |
Printed Circuit Board | A circuit for electronic apparatus made by depositing conductive material in continuous paths from terminal-to-terminal on an insulating surface. |
Process Capability | 1. (Statistical definition) Process Capability is the repeatability and consistency of a manufacturing process relative to the customer requirements in terms of specification limits of a product parameter. Specifically, it is the 6 sigma range of common cause variation for statistically stable processes only. Process capability is measured with the indices Cp and Cpk. 2. (Manufacturing process definition) Process capability is a measure of the manufacturability of the product considering availability of desired manufacturing processes, support or workpiece size, equipment characteristics (e.g., speeds, feeds, tonnage, etc.), and statistical capability as defined above. 3. (Business process definition) The extent to which a process is explicitly documented, managed, measured, controlled, and continually improved. |
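A minimal sketch of the statistical definition (the specification limits and measurements are hypothetical, and the process is assumed to be statistically stable):

    # Sketch: Cp (potential capability) and Cpk (capability including centering).
    import statistics

    def cp_cpk(data, lsl, usl):
        mu = statistics.mean(data)
        sigma = statistics.stdev(data)               # estimate of common-cause variation
        cp = (usl - lsl) / (6 * sigma)               # spec width vs. 6-sigma process spread
        cpk = min(usl - mu, mu - lsl) / (3 * sigma)  # penalizes an off-center process
        return cp, cpk

    measurements = [10.02, 9.98, 10.01, 9.97, 10.03, 10.00, 9.99, 10.02]
    print(cp_cpk(measurements, lsl=9.90, usl=10.10))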
Process Development | Defining and developing a manufacturing process to accommodate the specific requirements of a given product while meeting process quality and cost objectives. |
Process Failure Modes and Effects Analysis | Process Failure Modes and Effects Analysis – a form of FMEA associated with the process design, equipment design and management (see FMEA). |
Process Planning | Process Planning is the analysis and design of the sequence of processes and resource requirements needed to produce a product, translated into workable instructions for manufacture. It also includes the specification and selection of tools, fixtures, equipment and inspection/test requirements. |
Product Architecture | The scheme by which the functional elements of a product are arranged into physical building blocks (e.g., subsystems or subassemblies) and interact with each other to perform the overall function of the product. Product architectures can be modular (see Modular Architecture) or integral (see Integral Architecture). |
Product Brief / Project Brief | A summary document that communicates essential information about the product to be developed and is used to guide the development effort. Product briefs contain a description of the product including distinguishing characteristics; critical technology and its status; a description of the intended market or customer; a product strategy and its basis of competition; the target cost and target price; essential project information (boundary conditions) such as development cost, development schedule, milestones, and required resources/personnel; and significant risks. See Example. |
Product Configurator | A configurator is a software application that allows users to select product options, while validating that the selected options are compatible and workable. Configuration rules or constraints are used to determine which options can be used together on which products, and ensure that the end-product is properly configured. |
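A minimal sketch of how such configuration rules might be checked (the product options and rules below are entirely hypothetical):

    # Sketch: validate selected options against simple exclusion and prerequisite rules.
    EXCLUDES = {("diesel engine", "sport exhaust"), ("compact chassis", "towing package")}
    REQUIRES = {"towing package": {"heavy-duty cooling"}}

    def validate(selected):
        errors = []
        for a, b in EXCLUDES:
            if a in selected and b in selected:
                errors.append(f"'{a}' cannot be combined with '{b}'")
        for option, prereqs in REQUIRES.items():
            if option in selected and not prereqs <= selected:
                errors.append(f"'{option}' requires {sorted(prereqs - selected)}")
        return errors

    print(validate({"diesel engine", "towing package"}))  # flags the missing prerequisite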
Product Data | A representation of information about a product in a formal manner suitable for communication, interpretation, or processing by human beings or by computers. |
Product Data Management / Product Information Management | Product Data Management and Product Information Management are synonymous terms referring to a software tool that maintains a repository of product data and provides functions such as access control, managing parts and assemblies, bill of material structuring, version control/engineering change control, workflow management/process management (see Workflow Systems), web publishing and reviewing, view and markup, database backup, and user administration. |
Product Development Team | Product Development Team – a team consisting of representatives from marketing, engineering, manufacturing, finance, purchasing, test, quality and any other required disciplines with responsibility for developing a product or product subsystem. This team is empowered to represent the functional disciplines and develop a product by addressing its life cycle requirements including its production and support. |
Product Family | A product family is a set of individual products that share common technology and address a related set of market applications. Product families include a number of products or product lines targeted at somewhat different markets or usage situations. |
Product Launch | Product Launch begins once a product has been developed and a decision is made to proceed with production and marketing. It consists of all of the steps to plan and prepare for production of the product including ramp-up to full-volume production or general availability; the steps to plan, promote, market and sell the product, and the steps to prepare for servicing and support of the product. |
Product Life Cycle | 1. The Product Life Cycle from a Marketing perspective is typically defined by its sales volume profile and broken down into the following phases: introduction, growth, maturity and decay. 2. The Product Life Cycle from the broader enterprise and user perspective is defined by phases of its overall life: concept, development, production, operation, support, and disposal. |
Product Lifecycle Management | Product Lifecycle Management (PLM) generally refers to the process of managing all data related to the product over its life cycle. It broadens the concept of product data management (PDM) to address managing the product configuration and the availability of product data into its later lifecycle stages of production, operation, support and disposal and to address managing the product’s related process data. ARC Advisory Group defines six components of PLM: innovation and portfolio management, project and program management, collaborative design, product data management, manufacturing process planning, and service and support management. Michigan’s PLM Development Consortium defines it as “an integrated, information driven approach to all aspects of a product’s life from its design inception, through its manufacture, deployment and maintenance and culminating in its removal from service and final disposal. PLM is the integration of business systems to manage a product’s lifecycle.” CIMdata Inc., says that PLM represents “a business approach to solving the problem of managing the complete set of product and plant definition information and the processes through which it passes. The PLM process includes creating and changing that information, managing it through its life and disseminating and using it throughout the lifecycle of the product.” |
Product Line | Product lines consist of similar products with different cost/feature variations for each product within the product line. |
Product Model | The Product Model is the entire product information database which describes the product completely and unambiguously. It contains two general types of information: physical product design information represented by the design model and process information, represented by the process and data model. |
Product Platform | A product platform is a set of subsystems and interfaces that form a common structure from which a stream of derivative products can be efficiently developed and produced. |
Product Requirement | A technical characteristic of the product expressed in the developer’s language to respond to a customer need. A good requirement should be 1) stated so that it is directly actionable by engineering, 2) global and does not pre-suppose a particular technical solution, and 3) measurable so that it can ultimately be verified. The developer uses the product requirements to guide the design and building of the product. |
Product Roadmap | A tabular or graphic representation of product plans mapped against time to show the evolution of product models, their capabilities and the relationships to one another. Product roadmaps may be based on an individual product line or a product platform, and it may also show the relationship to supporting technologies. The product roadmap represents the long-term product plan to meet the needs of a defined market. |
Product Structure | Product Structure refers to the hierarchical bill of material structure which defines a product, its assemblies, component parts, materials, and other resources needed to produce the product and the way they fit together to form a product. |
Production Launch | Production Launch begins at the point that a product has been developed and is ready to begin production and proceeds through ramping-up production to full volume or general availability. It is the narrower set of Product Launch activities focusing on production and excluding marketing, support and service. See Product Launch. |
Production Part Approval Process | Production Part Approval Process (PPAP) is a Big Three automotive process that defines the generic requirements for approval of production parts, including production and bulk materials. Its purpose is to determine during an actual production run at the quoted production rates whether all customer engineering design record and specification requirements are properly understood by the supplier and that the process has the potential to produce product consistently meeting these requirements. |
Production Readiness Review | Production Readiness Review is a design review conducted prior to putting a product into production. This review assesses whether all needed product and process data have been completely generated, whether the production process has been validated, and whether the company is ready to begin production (either pilot production, low-rate initial production, production ramp-up, or full-rate production). |
Program | A group of related projects that are managed together. |
Program Evaluation and Review Technique | Program Evaluation and Review Technique (PERT) is an event-oriented network analysis technique used to estimate project duration when there is a high degree of uncertainty with individual activity duration estimates. PERT applies the Critical Path Method (CPM) to a weighted average duration estimate. |
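A minimal sketch of the standard three-point estimate for a single activity (the durations, in days, are hypothetical):

    # Sketch: PERT expected duration and standard deviation from three estimates.
    def pert_estimate(optimistic, most_likely, pessimistic):
        expected = (optimistic + 4 * most_likely + pessimistic) / 6.0  # weighted average
        std_dev = (pessimistic - optimistic) / 6.0                     # uncertainty measure
        return expected, std_dev

    print(pert_estimate(optimistic=8, most_likely=12, pessimistic=22))  # (13.0, ~2.33)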
Project Management | The management process, tools, and techniques used to define the project’s goal; plan, schedule, and budget all the work necessary to reach that goal; lead the project; monitor progress; and ensure that the project is completed in a satisfactory way. |
Project Office | A designated location where the administrative work of the project is conducted and the project management skills (resources) such as cost accounting, estimating, scheduling, training, etc. are retained. In the past, this was usually only economically possible on large projects. Recently some companies have established this type of facility to support a pool of smaller projects in an effort to develop and promote improved project management capabilities. |
Promotion | The act of moving a piece of product information from one state to another in a product data management system. |
Promotion Level | In a product data management system, information is assigned to promotion levels based on the approvals it has received. These are defined and named by the system administrator. Examples are Review, Preliminary Release, Prototype Release, and Production. Each promotion level has its own set of authorizations for access and approvals. |
Prototype | A physical model or representation of the new product concept or design. Depending upon the purpose, prototypes may be non-working models or representations, functionally working, or both functionally and geometrically complete and accurate. Prototypes (physical, electronic, digital, analytical, etc.) can be used for the purpose of, but not limited to: a) assessing the feasibility of a new or unfamiliar technology, b) assessing or mitigating technical risk, c) validating requirements, d) demonstrating critical features, e) qualifying a product, f) qualifying a process, g) characterizing performance or product features, or h) elucidating physical principles. |
PRR | see Production Readiness Review |
PSCM | Product Structure Configuration Management (STEP, ISO 10303) |
Psychological Inertia | Psychological Inertia is the tendency of persons to formulate opinions or attitudes, make decisions or seek known or familiar solutions to problems based on their current frame of reference, experience, and training. |
PTH | 1. see Plated-Thru-Hole 2. synonymous with Pin-Thru-Hole |
Pugh Concept Matrix | The Pugh Concept Matrix is used to a) evaluate multiple design concepts and select the preferred concept alternative and b) synthesize the best elements of other concepts into an improved concept (which may be a hybrid or variant of the best of other concepts). The Pugh matrix is useful because it does not require a great amount of quantitative data on the design concepts, which generally is not available at this point in the process. |
PWB/PWC | Printed Wiring Board/ Printed Wiring Circuit |
QA | Quality Assurance |
QFD | see Quality Function Deployment |
QFP | Quad Flat Pack – surface mount chip housing with flat leads on four sides. |
QPL | Qualified Parts List |
QS-9000 | QS-9000 is a quality management standard developed by the Big Three Automakers for the automotive sector. Currently largely replaced by Technical Specification 16949 (ISO/TS 16949). |
Qualification Testing | Testing performed to demonstrate that a product or system meets its specified requirements. |
Quality Function Deployment | A structured planning and decision-making methodology for capturing customer requirements (voice of the customer) and translating those requirements into product characteristics, part characteristics, process plans, and quality/process control requirements using a series of matrices. |
Quick-Turn Prototyping | Production on a quick turnaround basis of a small quantity of products that are used to prove the design. |
RAAR | Responsibility, Authority, Accountability and Resources |
RAD | see Rapid Application Development |
RAM | 1. Reliability, Availability and Maintainability 2. see Requirements Allocation Matrix |
RAMS | Reliability, Availability, Maintainability & Safety |
Random Function Determination | A value analysis methodology that lists basic and secondary functions performed by a component or product in a verb-noun format. |
Rapid Application Development | Rapid Application Development (RAD) is a way of developing a system by completing an initial working part of the system, and then incrementally adding to it every few months. Instead of waiting to finish the entire system, the system owners can put the system into use earlier. Development tools such as visual programming and computer-assisted software engineering help with RAD. |
Rapid Prototyping | 1. Rapid Prototyping refers to various technologies such as stereolithography and selective laser sintering that can rapidly create parts for visualization, product mock-ups, or functional product prototypes or produce rapid tooling to manufacture small to medium volumes of parts. Rapid prototyping processes involve devices, ranging from office modelers to four-ton machines, that accept 3D CAD files, slice the data into cross-sections, and construct layers from the bottom up, bonding one on top of the other, to produce physical prototypes. 2. More generally, it is the process of quickly generating prototypes or mockups of what a product system will look like. Rapid prototyping may be done with paper prototypes such as sketches, low-fidelity physical prototypes, CAD visualization, rapid application development, or video prototyping. |
Rapid Manufacturing | Rapid Manufacturing refers to the use of rapid prototyping technologies to directly manufacture low volumes of parts. |
Rapid Tooling | Rapid Tooling refers to the use of rapid prototyping technologies to fabricate tooling in a much shorter period of time than conventional tooling. Rapid tooling technologies include methods such as RTV molds, high-speed milling, centrifugal casting, etc. |
RCCA | Root Cause and Corrective Action |
R-Chart | A statistical process control (SPC) chart that monitors the range (variability) of the process. A sample of parts is collected from the process periodically. The range (maximum minus minimum) of the sample is plotted on the control chart and a determination is made if the process is “under control” or not. |
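A minimal sketch of the range calculation (the samples are hypothetical; the D3/D4 control-limit constants shown are the commonly tabulated values for subgroups of five parts):

    # Sketch: sample ranges and R-chart control limits for subgroups of size 5.
    samples = [[10.1, 10.3, 9.9, 10.0, 10.2],
               [10.0, 10.4, 10.1, 9.8, 10.2],
               [9.9, 10.2, 10.0, 10.1, 10.3]]
    ranges = [max(s) - min(s) for s in samples]
    r_bar = sum(ranges) / len(ranges)
    D3, D4 = 0.0, 2.114                      # assumed constants for subgroup size n = 5
    lcl, ucl = D3 * r_bar, D4 * r_bar
    for i, r in enumerate(ranges, 1):
        status = "under control" if lcl <= r <= ucl else "out of control"
        print(f"sample {i}: R = {r:.2f} ({status})")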
R&D | Research and Development |
RDMP | Rapid Decision-Making Practices |
RDT&E | Research, Development, Testing and Evaluation |
Recurring Cost, Recurring Expense or Recurring Production Cost | The recurring cost of producing each unit of a product. This would typically include direct materials, direct labor, direct process costs, allocated overhead, and any outside processing costs. The recurring cost is typically the basis for a target cost. Another term for this cost is Unit Production Cost (UPC). |
Register Transfer Level | Register Transfer Level (RTL) – a system definition described in terms of registers, switches (multiplexers), and operations. RTL design flow represents an advance in the EDA design process over gate-level design flow. |
Reliability | The probability that an item will continue to function at customer expectation levels at a measurement point, under specified environmental and duty cycle conditions. British Standards Institution BS4778: The ability of an item to perform a required function under stated conditions for a stated period of time. |
Reliability Analysis | A predictive tool used to estimate the “life” of a product. This is usually expressed in terms of hours as “mean time between failure” (MTBF). |
Reliability Prediction | Reliability Prediction is the analysis of parts and components in an effort to predict the rate at which an item will fail. A reliability prediction is one of the most common forms of reliability analyses. |
Requirement | A function, feature or capability that a product must provide or meet to satisfy the customer’s needs and enterprise’s objectives for a new product. See Product Requirement. |
Requirements Allocation Matrix | Requirements Allocation Matrix is a matrix showing the allocation of a requirement (e.g., reliability, weight, cost) to various subsystems or subassemblies so that requirement can be accurately flowed-down and the satisfaction of the overall requirement can be tracked and managed. |
Requirements Analysis | The determination of product-specific performance and functional characteristics based on analyses of: customer needs, expectations, and constraints; operational concept; projected utilization environments for people, products, and processes; and measures of effectiveness. |
Requirements Creep | See Scope Creep |
Requirements Flowdown | The process of deriving and allocating requirements to all levels of system decomposition. |
Requirements Engineering | Requirements Engineering can be defined as the systematic process of developing requirements through the process of analyzing the problem or need, documenting the resulting requirements to solve the problem or meet the need in a variety of representation formats, and checking the accuracy of the understanding gained. Requirements Engineering focuses on “what” needs to be designed. Requirements Engineering is not a one-time activity; it should be revisited at every stage of the development process to determine whether the requirements have changed and, if not, whether they are being met. |
Requirements Management | Requirements Management is the process of managing the initial development of requirements and the subsequent changes to requirements to assure that they address only what is needed or required of the product and that adequate consideration is given to tradeoffs in product cost, development cost, development schedule, and competitor actions. Requirements Management exercises control over the project scope to avoid scope creep and unnecessary or deferrable nice-to-have features/capabilities. |
Requirements Traceability | The evidence of an association between a requirement and its source, its implementation, and its verification. |
Return on Investment | Return on Investment is a financial analysis technique which compares the expected return to the outlay or investment to determine a percentage of return. |
Revenue Release | The point in the product life cycle that products are released for first sale to customers. Production is usually still at low rate, the customers that the product is sold to may need to meet certain requirements, and there may be special support capabilities provided to the product at this point. |
Reverse Engineering | 1. Reverse Engineering is the process of capturing the geometry of existing physical objects and then using the data obtained as a foundation for designing a duplicate of the original or an entirely new adaptation. Other terms include Digital Shape Sampling and Processing (DSSP), 3D Scanning, 3D Data Capture, and Optical Scanning. 2. Reverse engineering also refers to the procedure of carefully dismantling and inspecting a competitor’s product to look for design features that can be incorporated into one’s own product. |
RFP/RFQ | Request for Proposal / Request for Quotation |
Risk Management | A management process consisting of identification, assessment, mitigation, and management of all project, technical and market risks using formal tools and methods. |
Risk Priority Number | Risk Priority Number (RPN) is used in FMEA analysis to rank the importance of different types of failure. RPN = Severity x Occurrence x Detection |
R&M | Reliability and Maintainability |
Robust Design | Design of the product in a manner that desensitizes it to variation, including misuse, and increases the probability that it will perform as intended. |
Robustness | The condition of a product or process where its operating parameters remain relatively stable with a minimum of variation even though factors which influence operation or usage, such as wear or environment, change. |
ROI | see Return on Investment |
Root Cause | A root cause is an antecedent source of a defect such that if it is removed, the defect is decreased or removed itself. |
Root Cause Analysis | Root Cause Analysis – Study of the original reason for nonconformance with a process. When the root cause is removed or corrected, the nonconformance will be eliminated. |
Root Sum of Squares | Root Sum of Squares (RSS) is a statistical tolerancing method that combines the individual part tolerances in a stack-up as the square root of the sum of their squares to determine the resulting tolerance limits. RSS assumes that the print tolerance equals +/- 3 standard deviation limits and part nominal equals print nominal. This analysis exploits the manufacturing probability that a part is not always at its minimum or maximum value. It does not take into account process mean shifts (tool wear) and assumes the process is always centered. |
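A minimal sketch comparing a worst-case stack-up with the RSS result for a chain of hypothetical part tolerances, under the centered, +/- 3 standard deviation assumptions noted above:

    # Sketch: worst-case vs. RSS (statistical) tolerance stack-up.
    import math

    tolerances = [0.10, 0.05, 0.08, 0.12]            # +/- tolerance of each part in the stack
    worst_case = sum(tolerances)                      # every part at its limit simultaneously
    rss = math.sqrt(sum(t ** 2 for t in tolerances))  # statistically expected assembly variation
    print(f"worst case: +/-{worst_case:.3f}   RSS: +/-{rss:.3f}")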
RP | See Rapid Prototyping |
RSM | Response Surface Methodology (Design of Experiments technique) |
RSS | see Root Sum of Squares (tolerancing method) |
RTL | see Register Transfer Level |
RTM | Requirements Traceability Matrix |
SA | Structured Analysis |
SAE | Society of Automotive Engineers |
SASD | see Structured Analysis and Structured Design |
SAVE | Society of American Value Engineers |
SBU | Strategic Business Unit |
SCM | Software Configuration Management |
Scope | The sum of products and services to be provided as part of a project. |
Scope Creep | The tendency for project requirements to grow over time, usually resulting in huge, unmanageable projects. As some projects progress, especially through development, requirements continuously change incrementally, causing the developer to add to the work scope with consequent increases in the time and budget required. Synonymous with requirements creep. |
SCPD | Society of Concurrent Product Development |
Screening | Screening is the process of evaluating and selecting new product ideas or concepts for development. Evaluation criteria include fit with company strategy, fit with other products/product lines, fit with customers and markets, profitability, growth, risk, investment requirements, technical capabilities, core competencies, etc. |
SCM | see Software Configuration Management |
S-Curve | A graphical representation of costs, hours, technological progress and other factors. The name is derived from the S-like shape of the curve that is flatter at the beginning, accelerates sharply, and then tails-off. As it relates to technology, the S-Curve is flat when the technology is first invented (technology performance improves slowly and incrementally). Then, as experience with a new technology accrues, the rate of performance increase grows and technology performance increases by leaps and bounds. Finally, some of the performance limits of a new technology start to be reached and performance growth slows. |
SD | Structured Design |
SDAI | STEP Data Access Interface (STEP Part 22) |
SDF | Standard Delay Format – an industry standard notation for electronic circuit delay/constraint data for use between EDA tools. |
SDP | see Software Development Plan |
SDR | System Design Review |
SDT | Self-Directed Team |
SDWT | Self-Directed Work Team |
SE | 1. Simultaneous Engineering (synonymous with concurrent engineering) 2. Software Engineering 3. Systems Engineering |
SEDS | Systems Engineering Detailed Schedule |
SEI | Software Engineering Institute (at Carnegie Mellon University). Developers of the Capability Maturity Model. |
Selective Laser Sintering | Selective Laser Sintering – a rapid prototyping technology that uses a laser to trace a pattern on a powdered material, fusing it into a solid one layer after another to form a solid object. |
SEMP | Systems Engineering Management Plan |
SEMS | Systems Engineering Master Schedule |
Serviceability | The characteristics of a product to make it more readily serviceable. These characteristics would address features related to fault identification, diagnosis, disassembly, repair, replacement, and re-assembly. A common serviceability metric is mean time to repair (MTTR). |
SET | Specifications du Standard D’Echange et de Transfert (French product data standard) |
SGML | Standard Generalized Mark-up Language (ISO 8879) |
Should Cost | The most likely cost of developing and producing a product/system in accordance with specifications, with only nominal design changes resulting from maturation rather than requirements changes, and with normal allowances for scrap and rework. |
SIA | Semiconductor Industry Association |
Signal to Noise Ratio | Signal to Noise Ratio, as used in the design of experiments (DOE), is a measure of the magnitude of an experimental effect relative to the experimental error due to chance fluctuations. This metric is used in Taguchi Methods, which is a form of DOE. |
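A minimal sketch of the commonly used Taguchi signal-to-noise ratios (the response values are hypothetical):

    # Sketch: larger-is-better, smaller-is-better, and nominal-is-best S/N ratios.
    import math

    def sn_larger_is_better(y):
        return -10 * math.log10(sum(1 / v ** 2 for v in y) / len(y))

    def sn_smaller_is_better(y):
        return -10 * math.log10(sum(v ** 2 for v in y) / len(y))

    def sn_nominal_is_best(y):
        mean = sum(y) / len(y)
        var = sum((v - mean) ** 2 for v in y) / (len(y) - 1)
        return 10 * math.log10(mean ** 2 / var)

    print(sn_larger_is_better([42.0, 44.5, 41.8]))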
Simulation | Modeling or representation of hardware, software or systems to determine or verify their behavior, operation or fit. Simulation is used to provide confidence that the hardware, software or systems will operate as intended without investing the time or expense of physically constructing the object to verify its operation. In electronic design, simulation is a technique in which the properties of a circuit are represented indirectly by test vectors. |
SIPOC Diagram | The SIPOC Diagram considers the Suppliers of a process, the Inputs to the process, the Process itself, the Outputs of the process, and the Customers that receive the process outputs. While the SIPOC Diagram is frequently used in process or value stream mapping and six sigma projects, it can also be used to analyze customer processes to better understand the context of the customer’s process and how it relates to a product being used in that process (e.g., industrial equipment, office equipment, medical/surgical devices, etc.). |
SIT | System Integration Team |
Six Sigma | 1. Six Sigma is a statistical measurement of processes that produce fewer than 3.4 defects or mistakes per million opportunities (or 99.99966% good). 2. Six Sigma is a bottom-line oriented, data-driven quality program. It is based on achieving a level of quality which equates with only 3.4 defects per million opportunities for each product. The six sigma process includes five steps: a) define, b) measure, c) analyze, d) improve, and e) control. |
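A minimal sketch of the defects-per-million-opportunities (DPMO) measure that underlies the 3.4-per-million figure (the counts below are hypothetical):

    # Sketch: DPMO for a hypothetical assembly operation.
    def dpmo(defects, units, opportunities_per_unit):
        return 1_000_000 * defects / (units * opportunities_per_unit)

    # 17 defects found across 5,000 boards with 120 solder joints each:
    print(dpmo(defects=17, units=5_000, opportunities_per_unit=120))  # ~28.3 DPMO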
Skunk Works | A collocated project environment to shorten communication paths and to keep functional contributors close to one another and to the prototype or production center. Skunk Works often have streamlined procedures and work processes that rely on informal communication and coordination to reduce bureaucratic overhead. |
SLA | Stereolithography Apparatus (see Stereolithography) |
SLS | see Selective Laser Sintering |
SM | see Solids Modeling |
SME | Society of Manufacturing Engineers |
SMT | see Surface Mount Technology |
SMTA | Surface Mount Technology Association |
S/N | Signal to Noise (e.g., Signal to Noise Ratio) |
S/N Ratio | see Signal to Noise Ratio |
Sneak Analysis | Analysis of modes of operation unanticipated during design that result in unexpected system behavior and potential failure. |
Soft Systems Methodology | Soft Systems Methodology (SSM) assumes differing viewpoints between individuals regarding a problem. It tries to move toward consensual action between these conflicting views. SSM is a goal-driven, iterative process with a philosophy of continual improvement for which the process is more important than the result. SSM requires a facilitator to provide an unbiased viewpoint and is indicated for use when facing a complex organizational problem. SSM is useful for providing a structure for understanding complex programs. |
Software Architecture | Software Architecture refers to the high-level structure of software systems. The architecture of a software system identifies a set of components that collaborate to achieve the system goals. The architecture specifies the “externally visible” properties of the component, i.e., those assumptions other components can make of a component, such as its provided services, performance characteristics, fault handling, shared resource usage, etc. It also specifies the relationships among the components and how they interact. |
Software Configuration Management | Software Configuration Management (SCM) is the specialization of Configuration Management (see Configuration Management) for software systems over their lifecycle. |
Software Development | A set of activities that results in software products. Software development may include new development, modification, reuse, re-engineering, maintenance, or any other activities that result in software products. |
Software Development Plan | A Software Development Plan (SDP) is a document describing a developer’s plans, process and methodology for conducting software development. |
Software Engineering | That field within computer science responsible for the establishment and use of sound engineering principles and methods in order to economically obtain reliable and functional software. |
Software Quality Assurance | The process, procedures and controls to ensure that software produced can be verified to meet the requirements and specifications and, ultimately, the user’s/customer’s needs. |
Solids Modeling | A geometric modeling method that completely and unambiguously describes both the exterior and interior of a part or assembly in three dimensions (geometry, topology and mass properties). |
SOLE | Society of Logistics Engineers |
Sourcing | The determination of sources from which goods and/or services may be obtained to meet the needs of a new product during development and production. |
SOW | see Statement of Work |
SPC | 1. Statistical Process Control 2. Software Productivity Consortium |
Special Causes (of variation) | Special Causes are causes of variation in output from a manufacturing process or system of procedures that are not due to the inherent operation of the process or system itself (common causes), but are due to the intrusion into the system of a one-time or external cause of variation. One-time or external causes do not spring from the system, and so are preventable – i.e., their occurrence can be prevented. Consequently, the reason for each special cause must be investigated and steps then taken to see that it does not occur again. The presence of a special cause of variation must be determined statistically. This is done by knowing that variation in output due to common causes follows a regular pattern corresponding to the Normal curve – i.e., values fall within three standard deviations on either side of the average. Variation due to a special cause results in performance outside these statistical limits. |
Special Characteristics | Product and process characteristics designated by the customer; governmental, regulatory or safety agencies; and/or the supplier through knowledge of the product or process. |
Specifications | 1. The document that prescribes the requirements with which the product or service has to conform. 2. As used with QFD, Specifications are the particular measures or metrics to define a product requirement. Synonymous with target value in this context. 3. Specifications are boundaries, usually set by management, engineering, or customers, within which a system must operate. They are sometimes called engineering tolerances. |
SPI | Solder Paste Inspection |
Spiral Development Model | The Spiral Development Model combines the Waterfall Development Model (see Waterfall Development Model) and the prototype approach. It consists of a series of partial implementations or releases of the product. This approach is useful when the risks are significant, there is a need/opportunity to field a partial system in a short amount of time, and the requirements are not completely understood or can change over time. Key assumptions with the Spiral Development Model are a) the initial release is sufficiently satisfactory to key system stakeholders that they will continue to participate in its evolution; b) the architecture of the initial release is scalable to accommodate the full set of system life cycle requirements; and c) the users/customers are sufficiently flexible to adapt to the pace of system evolution. |
SQA | see Software Quality Assurance |
SPICE | Simulation Program with Integrated Circuit Emphasis – One of the most widely used analog circuit simulation programs. |
SRR | see System Requirements Review |
SSAD | Structured Systems Analysis and Design (See Structured Systems Analysis) |
SSM | see Soft Systems Methodology |
Stage | A portion or phase of the product development process with a clear objective or milestone that ends with a stage-gate review before authority is granted to proceed with the next stage or phase. |
Stage-Gate™ | Stage-Gates™ or phase gates refer to management reviews or decision gates that are structured at key points in the development process (typically at the end of one stage/phase or before the start of the next development stage/phase) to review the opportunity/development effort, assess it from a business perspective, and determine whether to continue development or to kill the project. |
Stage-Gate™ Process | A widely employed product development process that divides the development effort into distinct time-sequenced stages or phases separated by management decision gates. Product teams must successfully complete a prescribed set of related activities in each stage prior to obtaining management approval to proceed to the next stage of product development. The framework of the Stage-Gate™ process includes work-flow and decision-flow paths and defines the supporting systems and practices necessary to ensure the process’s ongoing smooth operation. |
Standard Cost | Standard Cost is the predetermined or planned cost of manufacturing a single unit or of providing a single unit of service. It represents a goal or baseline that is used to project cost, based on experience and/or analysis. |
Standardization | Standardization of parts, materials, modules and assemblies makes possible the interchangeability of these items among products, resulting in higher volume production and purchasing, lower investment in inventory, easier purchasing and material handling, fewer quality inspections, and fewer difficulties in production. |
Statement of Work | Statement of Work is a narrative description of products and services to be supplied under contract or as part of a project. |
STE | Special Test Equipment |
STEP | Standard for the Exchange of Product Model Data (ISO 10303) – An international product data standard to provide a complete, unambiguous, computer-interpretable definition of the physical and functional characteristics of a product throughout its life cycle. |
Stereolithography | A rapid prototyping (RP) process, introduced in 1987 by 3D Systems Inc. which launched the RP industry. A Stereolithography Apparatus (SLA) machine builds physical models in this manner: it focuses an ultraviolet (UV) light onto the surface of a vat filled with liquid photopolymer. The light beam, moving under computer control, draws each layer of an object onto the surface of the liquid. Wherever the beam strikes the surface, liquid changes to solid. 3D parts are built from the bottom up, one layer at a time; when the part is finished, it is exposed to UV light for curing. |
Structured Analysis and Structured Design | Structured Analysis and Structured Design (SASD) is composed of two parts – Structured Analysis and Structured Design. Structured Analysis is composed of an Essential Model, an Environmental Model, a Behavioural Model and an Implementation Model. The Essential Model is a model of what the system must do; the Environmental Model defines the scope and interaction between the system and the world; the Behavioural Model specifies the required behaviour of the system so that it can interact with its environment; and the Implementation Model implements the system. The Structured Design section is divided into three levels: the Processor Model assigns processes to processors, the Task Model assigns processes and data to tasks, and the Program Implementation Model is an internal definition of individual tasks. Structured Design breaks up the program into a hierarchy of modules with a computer program as the result. |
Structured Systems Analysis | Structured Systems Analysis uses process and data perspectives to analyze, develop and document the requirements of a system. Structured Systems Analysis uses dataflow diagrams, entity relationship diagrams, and data dictionaries to communicate with designers and describe the requirements. |
Substance Field Analysis | Substance Field Analysis (or Su-Field Analysis) is a TRIZ methodology used to model a system in terms of substances or objects which interact through a field, such as a force. According to the model, a problem is viewed as an incomplete or harmful interaction and can be solved by correcting the model and applying the analogous correction to the system. |
Supplier Certification | A supplier becomes “certified” when it has delivered parts with perfect quality over a pre-specified time period (say six months). At that point, inspection is no longer needed. |
Supplier Qualification | A supplier is “qualified” when a customer has determined that the supplier is capable of providing a part. |
Supplier Roadmap | Technology roadmaps of a supplier’s current and future product and process technology capabilities. These are typically represented in tabular or graphic form over time to aid in the selection of the appropriate product or process technology for a new product. |
Supply Chain Management | The procurement, stocking and distribution of components, subassemblies and products throughout the design, manufacturing, and distribution stages, ensuring that the correct components, subassemblies and products are delivered to their appropriate destination at the proper time, at the lowest overall cost, and at acceptable quality levels. |
Surface Modeling | A 3D modeling technique to describe geometry by its surfaces. This is typically used where surface shape is critical, such as in the design of auto body panels, aerostructures, and industrial design work. |
Surface Mount Technology | Surface Mount Technology (SMT) is a method of attaching electrical components directly to a board substrate rather than through a plated hole. |
SWOT Analysis | Strengths, Weaknesses, Opportunities and Threats Analysis – a process whereby a group of people determine: a) what strengths do we have? (how can we take advantage of them?); b) what weaknesses do we have? (how can we minimize them?); c) what opportunities are there? (how can we capitalize on them?); d) what threats might prevent us from getting there? (consider technical obstacles, competitive responses, values of people within the organization, etc.). For every obstacle identified, what can we do to overcome or get around it? (This helps to develop contingency plans.) |
Synthesis | Synthesis is an EDA process which reads a high-level electronic design description and implements it at a lower level of abstraction. Legacy synthesis tools produce a gate-level implementation, at which point the design netlist is handed off to the IC layout process. More recent developments have synthesis becoming more tightly integrated with the IC layout process in order to better achieve convergence of goals such as timing. |
System Design | The process of designing a system that comprises the interaction and integration of subsystems and subassemblies into a single system that performs an intended function. The sub-assemblies can consist of electrical, mechanical, optical, software, and other components to achieve overall functionality. |
Systems Engineering | Systems engineering is the process of specifying the system requirements, allocating the system requirements to the hardware and software components, specifying the interfaces between the hardware and software components, and monitoring the design and development of these components to ensure conformance with their specifications. Systems engineering transforms an operational need into a description of system performance parameters and a system configuration through the use of an iterative process (e.g., definition, synthesis, analysis, design, test and evaluation, etc.); integrates related technical parameters and assures compatibility of all physical, functional, and program interfaces in a manner which optimizes the total system definition and design; and integrates reliability, maintainability, safety, human factors, and other such considerations into the total engineering effort. |
System Integration | The successive combining and testing of hardware and software system components in a prescribed manner to prove compatibility and performance. |
System Integration Team | A System Integration Team is a higher-level IPT (see Integrated Product Team) used on a larger program which flows down requirements and workscope to individual IPTs, monitors and coordinates their activities from a technical perspective, resolves interface and integration issues, and redirects technical activities when required to assure that the development work is accomplished to meet the overall system requirements. |
System Requirements Review | System Requirements Review (SRR) is a design review at which the system requirements document is reviewed and approved. This review determines which needs of the total user requirements statement will be satisfied by the proposed project. |
Synthesis | (Digital Circuits) Translation and optimization of a hardware description language specification into a gate-level implementation. |
TAAF | Test, Analyze and Fix |
TAB | see Tape Automated Bonding |
TAP | Test Access Port |
Tape Automated Bonding | Tape Automated Bonding – component packaging technology where special lead frames are used for interconnecting and carrying an integrated circuit for later attachment to a PWB. |
Taguchi Methods | A quality engineering methodology developed by Genichi Taguchi that includes off-line quality control, on-line quality control, and a system of experimental design to reduce costs and improve quality. Taguchi methods are not just a statistical application of design of experiments; they integrate statistical design of experiments into a powerful engineering process. The goal is not just to optimize an arbitrary objective function, but also to reduce the sensitivity of engineering designs to uncontrollable factors or noise. This moves design targets toward the middle of the design space so that external variation affects the behavior of the design as little as possible. This permits large reductions in both part and assembly tolerances, which are major drivers of manufacturing cost. Also see Design of Experiments. |
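To make the robustness idea concrete, the following is a minimal Python sketch of the signal-to-noise ratios commonly associated with Taguchi methods; the response values are hypothetical and the formulas shown are the standard textbook forms rather than a prescription from this glossary's source.

```python
import math

def sn_smaller_the_better(values):
    """Taguchi S/N ratio when smaller responses are better (e.g., wear, shrinkage)."""
    return -10 * math.log10(sum(y ** 2 for y in values) / len(values))

def sn_larger_the_better(values):
    """Taguchi S/N ratio when larger responses are better (e.g., strength)."""
    return -10 * math.log10(sum(1 / y ** 2 for y in values) / len(values))

def sn_nominal_the_best(values):
    """Taguchi S/N ratio when the response should sit on a nominal target."""
    n = len(values)
    mean = sum(values) / n
    var = sum((y - mean) ** 2 for y in values) / (n - 1)
    return 10 * math.log10(mean ** 2 / var)

# Two hypothetical parameter settings, each run under several noise conditions;
# the setting with the higher S/N ratio is the more robust (less noise-sensitive) one.
setting_a = [10.1, 9.8, 10.3, 9.9]
setting_b = [10.0, 8.7, 11.4, 9.6]
print(sn_nominal_the_best(setting_a))  # higher value -> more robust
print(sn_nominal_the_best(setting_b))
```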
Target Costing | A market-driven strategy and process that begins with the price a product can sell for in the marketplace to achieve a desired sales volume. Target cost is then calculated by subtracting the desired profit margin from this target price. The target cost is treated as an independent variable that must be satisfied along with other customer requirements rather than the result of design decisions (a dependent variable). This cost would be considered the unit production cost that is expected to be achieved during a mature production stage. Depending on the definition, it may or may not include warranty costs and selling, general and administrative costs. |
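The arithmetic behind target costing is simple; the short sketch below illustrates it with purely hypothetical numbers for price and margin.

```python
# Hypothetical figures for illustration only.
target_price = 249.00    # price the market will bear per unit
desired_margin = 0.30    # desired profit margin, as a fraction of price

# Target cost is what remains after the desired margin is subtracted.
target_cost = target_price * (1 - desired_margin)
print(f"Target unit cost: {target_cost:.2f}")   # 174.30
```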
TCE | Thermal Coefficient of Expansion |
TCP/IP | Transmission Control Protocol/Internet Protocol |
TCT | see Time Compression Technologies |
TDI | Technical Data Interchange |
TDP | Technical Data Package |
Team | A Team is a small number of people with complementary skills who are committed to a common purpose, performance goals, and approach for which they hold themselves mutually accountable. Characteristics of high-performing teams include: a shared, elevating vision or goal, a sense of team identity, a results-driven structure, competent team members, a commitment to the team, mutual trust, interdependence among team members, effective communication, a sense of autonomy, a sense of empowerment, small team size, and a high level of enjoyment. |
Teambuilding | The process of influencing a group of diverse individuals, each with their own goals, needs, and perspectives, to work together effectively for the good of the project such that their team will accomplish more than the sum of their individual efforts could otherwise achieve. |
Team Charter | A Team Charter is a brief written document used to define the mission and objectives of the team. The charter typically includes a statement of mission, objectives, or statement of work; background; authority; boundary conditions (scope, constraints, resources, and schedule); membership; high-level requirements or specifications; and interface responsibilities. |
Technology Roadmap | A tabular or graphic representation of technology plans mapped against time to guide the selection and use of technology in new product development or represent the technology embodied in future products. |
Technology Transfer | Technology Transfer is the process of transferring research and technology from laboratories, government and outside organizations into the enterprise for practical application in new products. |
Testability | The characteristic of a product’s design that facilitates its testing during development/qualification, in production, and in the field. |
Test Plan | A Test Plan identifies the test objectives and details the activities required to achieve these test objectives. |
Test Requirement | The stimulus, measurement, power, loads and any special test equipment or procedure essential to validate proper operation of a device or some predetermined design control or product specification definition. |
T&E | see Test and Evaluation |
Theory of Inventive Problem Solving | Theory of Inventive Problem Solving (Russian acronym is TRIZ) is a structured methodology developed by Genrich Altshuller for problem solving and innovation based on analysis and codification of technology solutions from millions of patents. |
Time Box or Time-Boxing | Time box or time-boxing refers to a technique for setting an interim or end-date goal for a project along with the project scope (e.g., a list of features in priority order to fit the time available), approach, and plan for achieving the deadline. |
Time Compression Technologies | Time Compression Technologies – technologies to support the product development process that, when effectively integrated into the process, offer opportunities for significant reductions in cycle time. These include CAD, CAE, CAM, PDM and rapid prototyping. |
Time-to-Market | 1. Time-to-Market is the cycle time of product development from conception of a new product to initial sale of the new product. 2. Time-to-Market is the dimension of strategy focused on getting products to market quickly as the basis of competition. |
TIPS | see Theory of Inventive Problem Solving |
TL 9000 | TL 9000 is a quality management standard for the telecommunications industry built on ISO 9000. Its purpose is to define the requirements for the design, development, production, delivery, installation and maintenance of products and services. Included are cost and performance based measurements that measure reliability and quality performance of the products and services. |
TLM | Tape Layering Machine |
TM | 1. see Taguchi Methods 2. Technical Manual |
Tolerance | Tolerance is the upper and lower limits of some dimension or parameter relating to a component part, material or assembly which an actual item must comply with in order for it to be acceptable in procurement or manufacturing. The difference between the upper and lower tolerance is the tolerance spread. |
Tolerance Design | Tolerance Design is a step in the design process (following parameter design) where the determination is made of how much variation in a design parameter is acceptable while still allowing the product to function satisfactorily and meet the customer’s needs. Often tolerance design is not adequately considered, and the designer merely specifies standard tolerances which may be inadequate or overstated. |
Top-Down Design | Top-Down Design is a design methodology whereby an entire design is decomposed into its major components, and then these components are further decomposed into their major components, etc. The constraints are established early in the design flow, and then are passed on and adhered to by the back-end processes. |
Trade-off Analysis | Trade-off Analysis is the process of making decisions when each choice has both advantages and disadvantages. In a simple trade-off, it may be enough to list each alternative with its pros and cons. For more complicated decisions, list the decision criteria and weight them, determine how each option rates on each criterion, and compute a weighted total score for each option. The option with the best score is the preferred option. Decision trees may be used when options have uncertain outcomes. |
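A minimal Python sketch of the weighted-scoring step described above; the criteria, weights, and option ratings are hypothetical.

```python
# Hypothetical criteria weights (summing to 1.0) and option ratings on a 1-5 scale.
weights = {"cost": 0.4, "performance": 0.35, "schedule risk": 0.25}
ratings = {
    "Option A": {"cost": 4, "performance": 3, "schedule risk": 5},
    "Option B": {"cost": 3, "performance": 5, "schedule risk": 3},
}

# Weighted total score for each option; the highest score is the preferred option.
scores = {
    option: sum(weights[c] * rating[c] for c in weights)
    for option, rating in ratings.items()
}
best = max(scores, key=scores.get)
print(scores)                      # {'Option A': 3.9, 'Option B': 3.7}
print("Preferred option:", best)
```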
TRSL | Test Requirements Specification Language (proposed IEEE standard) |
TQM | Total Quality Management |
TRIZ | Russian acronym for Theory of Inventive Problem Solving (see Theory of Inventive Problem Solving) |
TRR | Test Readiness Review |
Test and Evaluation | Measurement and evaluation of system performance to validate that the system meets its specifications. Test and Evaluation often occurs in the field under actual operating conditions with actual users rather than in a simulated laboratory environment. |
Test Plan | A document that describes the approach and test steps for all developmental, integration, qualification/certification, factory acceptance testing, and customer acceptance testing. |
Tolerance Analysis | Tolerance Analysis – An analysis of the dimensional tolerances of manufactured parts. In general, the component tolerances are all known or specified and the resulting assembly tolerance is calculated. Tolerance analysis methods include worst case, root sum of squares, statistical Monte Carlo analysis and other techniques. Analysis can be done in one, two or three dimensions. Economic compromises can also be considered. |
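The following Python sketch contrasts the worst case, root sum of squares, and Monte Carlo methods on a hypothetical one-dimensional, three-component tolerance stack; the tolerance values and the assumption that each dimension's plus-or-minus three-sigma spread equals its stated tolerance are illustrative only.

```python
import math
import random

# Hypothetical stack of three component tolerances (+/- values, same units).
tolerances = [0.05, 0.10, 0.02]

# Worst case: component tolerances add linearly.
worst_case = sum(tolerances)                           # 0.17

# Root sum of squares (RSS): statistical combination of the tolerances.
rss = math.sqrt(sum(t ** 2 for t in tolerances))       # ~0.114

# Monte Carlo: sample each dimension's deviation from a normal distribution
# whose +/-3-sigma spread equals the stated tolerance, then examine the stack.
random.seed(0)
stack_deviations = [
    sum(random.gauss(0, t / 3) for t in tolerances) for _ in range(100_000)
]
mc_spread = 3 * (sum(d ** 2 for d in stack_deviations) / len(stack_deviations)) ** 0.5

print(worst_case, round(rss, 3), round(mc_spread, 3))
```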
TTD | Technology-Transparent Design |
TTM | see Time-to-Market |
UCD | see User Centered Design |
UCL | Upper Control Limit is the upper limit used within statistical process control that defines the bounds of common cause variation. When a parameter value falls above the upper control limit, it flags the likely occurrence of a special cause of variation. |
ULCE | Unified Life Cycle Engineering |
Unit Testing | Testing of individual hardware or software units or groups of related units. |
Universal Design | Universal design is a design approach whereby designers ensure that their products and services address the needs of the widest possible audience, including groups such as the aged or disabled for whom many products would otherwise not be suitable. Synonymous with inclusive design. See inclusive design for the seven principles of inclusive design. |
URL | Uniform Resource Locator – A series of letters or numbers that acts as an address for a world wide web (WWW) site. |
Usability | Usability is the effectiveness, efficiency and satisfaction with which a specified set of users can achieve a specified set of tasks in a particular environment. |
Usability Testing | Usability Testing focuses on understanding the user’s experience with a product or process and gathering user feedback to improve product design. The immediate result of usability testing is a list of specific and general recommendations for improving the software/hardware, documentation, training and/or other collateral materials provided to end users. The longer-term benefit is a better understanding of how to design more usable and marketable products by increasing and managing user input throughout the product development cycle. |
Use Case | A use case defines a goal-oriented set of interactions between external users and the system or product under development. Use cases capture who (users) does what (interactions) with the system, for what purpose (goal). A complete set of use cases specifies all the different ways to use the system, and thus defines all behavior required of the system without dealing with the internal structure of the system. |
User Centered Design | User Centered Design (UCD) places the user at the center of the design process. UCD incorporates a whole range of user-centered techniques, including techniques for undertaking cost benefit analyses from an organization and user perspective, requirements capture and analysis, task analysis, dialogue specification and usability evaluation. |
USL | Upper Specification Limit is the upper limit for a parameter value in order to meet specifications. |
US PRO | U. S. Product Data Association |
UUT | Unit Under Test |
VA | See Value Analysis |
Validation | Validation is the process of ensuring that the product conforms to defined user needs, requirements, and/or specifications under defined operating conditions. Design validation is performed on the final product design with parts that meet design intent. Production validation is performed on the final product design with parts that meet design intent produced with production processes intended for normal production. |
Value Analysis | Value Analysis – an effort to analyze systems and designs so that they satisfy needed user requirements (functions) at sufficient quality and at an optimum cost, thereby maximizing value. |
Value Engineering | 1. Value Engineering is a structured methodology for applying value analysis or function analysis to increase customer or user value. 2. A formal technique to eliminate, without impairing essential functions or characteristics, anything that unnecessarily increases the cost of a product. It is a disciplined system for accomplishing the functions that the customer needs and wants at the lowest cost. |
Variability Reduction | A multi-part strategy to reduce product variation and make a product more robust or fit to use through design of experiments, design within process capabilities, and process improvement. |
Variational Geometry | A capability of 2D and 3D modeling systems in which the user defines a model by dimensions and constraints, which are then solved by a series of simultaneous equations to create and modify geometry. |
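A minimal sketch of the idea using symbolic simultaneous equations (via the SymPy library, not any particular CAD kernel); the rectangle, its driving dimensions, and the constraints are hypothetical.

```python
from sympy import symbols, Eq, solve

# A rectangle defined by constraints rather than fixed coordinates: width w and
# height h, constrained so the perimeter is 60 and the height is half the width.
# Changing either constraint and re-solving regenerates the geometry.
w, h = symbols("w h", positive=True)
constraints = [Eq(2 * w + 2 * h, 60), Eq(h, w / 2)]
print(solve(constraints, [w, h]))   # {w: 20, h: 10}
```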
Vault | A product data management (PDM) system’s data storage area or database. Information stored in PDM system vaults is controlled by system rules and processes. |
VDA | Verband der Automobilindustrie (German product data exchange standard) |
VDSM | Very Deep Sub-Micron design – relates to the design of integrated circuits with feature sizes less than 0.25 µm. |
VE | see Value Engineering |
VECP | Value Engineering Change Proposal |
Verification | 1. Confirmation by examination and provision of objective evidence that specified requirements have been fulfilled. 2. Verification is the process of evaluating a system or component to determine whether the products of a given phase satisfy the conditions imposed at the start of that phase. (IEEE) 3. The process of verifying the functional and performance requirements of a design, be it a chip, board, or system. Many different kinds of verification tools are in use today, including simulation, formal verification, various types of physical analysis tools, emulation, and rapid prototyping. Most design verification strategies employ many or all of these approaches to assure the reliability of the final product prior to its manufacture. |
Verilog | Hardware description language similar to VHDL (IEEE Standard 1364) |
Version | The version of an object or product structure is used to distinguish between successive states of the object or structure as it changes during its lifecycle. |
VHDL | VHSIC Hardware Description Language (IEEE Standard 1076-1987, ANSI Standard 1076-1988) – A computer language that provides designers with the ability to model computer-simulatable descriptions of digital electronics, to communicate logical and physical interconnection between the models created, and to exchange the resulting digital electronic product data among different organizations. See Hardware Description Language. |
VHDL-A | VHSIC Hardware Description Language-Analog |
VHSIC | Very High Speed Integrated Circuit |
Virtual Customer | The term Virtual Customer refers to the use of technology and, more specifically, web-based tools to gather customer input and feedback throughout the product development process to better understand and address customer needs. |
Virtual Prototyping | Virtual Prototyping refers to the use of numerical analysis tools to analyze a design instead of building and testing a physical prototype. |
Virtual Reality | Technology that enables users to “enter” and navigate through a computer-generated 3D environment. It allows users to change their viewpoint and interact with objects created in the environment in a way that mimics the real world. |
VITAL | VHDL Initiative Toward ASIC Libraries (IEEE 1076) – standards for back annotation, timing, and high-performance primitives for the purpose of speeding the introduction of ASIC libraries. |
VLSI | Very Large Scale Integration |
VOC | see Voice of the Customer |
Voice of the Customer | Customer input and feedback, both positive and negative, including needs, wants, comparisons, relative importance, likes, dislikes, problems, and suggestions. The VOC is used to drive the product definition and support techniques such as QFD. |
VPD | Virtual Product Development |
VR | 1. See Virtual Reality 2. See Variability Reduction |
VRML | Virtual Reality Modeling Language – a language for viewing and interacting with 3D models. |
VRP | Variability Reduction Process |
Waterfall | A name given to a particular form of presentation of the project life cycle. Rather than being broken into distinct periods of controlled phases, all activities appear as one long hierarchical succession. |
Waterfall Development Model | Waterfall Development Model undertakes the development of the entire system in a series of development phases and activities. This approach assumes the following: a) the requirements are knowable in advance of implementation; b) the requirements have no unresolved, high-risk implications; c) the nature of the requirements will not change very much during development; d) the right architecture for implementing the requirements is well understood; and e) there is enough calendar time to proceed sequentially. The projects using the Waterfall Development Model are checked for proper execution and quality through validation of entry requirements and exit criteria at each phase. This model contrasts with the Spiral Development Model (see Spiral Development Model). |
WAVES | IEEE test language which provides a standard representation for stimulus and response data in support of the design and test of digital devices. |
WBS | see Work Breakdown Structure |
Weibull Distribution | A failure distribution that is very useful in reliability activities because it can be used to model many other life distributions. By adjusting the beta factor, or shape parameter, of the Weibull distribution, it can be made to model a decreasing, constant, or increasing hazard rate. The Weibull distribution provides reasonably accurate failure analysis and failure forecasts with extremely small samples. |
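A short Python sketch of the Weibull hazard rate, h(t) = (beta/eta) * (t/eta)^(beta - 1), showing how the shape parameter beta produces a decreasing, constant, or increasing hazard rate; the characteristic life and time points are hypothetical.

```python
import math

def weibull_hazard(t, beta, eta):
    """Weibull hazard rate h(t) = (beta/eta) * (t/eta)**(beta - 1)."""
    return (beta / eta) * (t / eta) ** (beta - 1)

# Hypothetical characteristic life eta = 1000 hours.
for beta in (0.5, 1.0, 3.0):   # <1 decreasing, =1 constant, >1 increasing hazard
    rates = [round(weibull_hazard(t, beta, 1000), 6) for t in (100, 500, 1000)]
    print(beta, rates)
```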
Work Breakdown Structure | Work Breakdown Structure is a hierarchical tree structure decomposing a project into activities and sub-activities to help define and control the project and its elements of work. |
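A minimal sketch of a WBS represented as a nested mapping, with leaf-level work rolled up through the hierarchy; the activity names and hours are hypothetical.

```python
# Hypothetical two-level WBS; leaf values are estimated hours of work.
wbs = {
    "1 New Product X": {
        "1.1 Design": {"1.1.1 Concept design": 120, "1.1.2 Detail design": 300},
        "1.2 Prototype": {"1.2.1 Build": 80, "1.2.2 Test": 60},
    }
}

def total_hours(node):
    """Roll leaf-level work up through the WBS hierarchy."""
    if isinstance(node, dict):
        return sum(total_hours(child) for child in node.values())
    return node

print(total_hours(wbs))   # 560
```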
Workflow Systems / Workflow Management Systems | Workflow Systems are systems to support the coordination, communication and control of business processes by means of information technology for the purpose of improving and better managing these processes. Workflow Systems automate a business process, in whole or part, during which documents, information or tasks are passed from one participant to another for action, according to a set of procedural rules. |
Wireframe | A geometric model that describes 3D geometry by outlining its edges, similar to a “stick figure”. |
Worst Case Tolerance Analysis | Worst Case Tolerance Analysis – The assembly tolerance is determined by summing the component tolerances linearly. Each component dimension is assumed to be at its maximum or minimum limit, resulting in the worst possible assembly limits. It is a very conservative approach to tolerance analysis and is generally not the best approach to tolerancing, since it caters to combinations that are extremely unlikely rather than focusing on a more probabilistic approach. |
X-Bar Chart | X-Bar Chart – A quality control chart that monitors the mean of the process. A sample of n parts is collected from the process every so many parts or time periods. The mean of the sample is plotted on the control chart and a determination is made if the process is “under control” or not. |
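A minimal Python sketch of the x-bar check described above, assuming a known baseline process mean and standard deviation and the conventional three-sigma control limits (mean plus or minus 3*sigma/sqrt(n)); all numbers are hypothetical.

```python
import statistics

# Hypothetical baseline process parameters and a sample of n parts from the line.
process_mean, process_sigma, n = 10.00, 0.06, 5
sample = [10.02, 9.97, 10.05, 10.01, 9.99]

x_bar = statistics.mean(sample)
ucl = process_mean + 3 * process_sigma / n ** 0.5   # upper control limit
lcl = process_mean - 3 * process_sigma / n ** 0.5   # lower control limit

# The process is flagged when the sample mean falls outside the control limits.
in_control = lcl <= x_bar <= ucl
print(round(x_bar, 3), round(lcl, 3), round(ucl, 3), in_control)
```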
XP | see Extreme Programming |
2D | Two Dimensional |
3D | Three Dimensional |