
Judicial Watch • 2453_Resp Recs 1 Full Production


Category: FOIA Response

Number of Pages: 96

Date Created: August 5, 2014

Date Uploaded to the Library: September 11, 2014

Tags: Healthcare.gov, 2453, obamacare, HHS, IRS


 

CENTERS FOR MEDICARE & MEDICAID SERVICES
OFFICE OF INFORMATION SERVICES
7500 Security Boulevard, Baltimore, MD 21244-1850

Federal Data Services Hub (DSH) Security Controls Assessment Test Plan

August 20, 2013
FINAL
 
CMS SENSITIVE INFORMATION - REQUIRES SPECIAL HANDLING
 
Federal Data Services Hub (DSH) Security Controls Assessment Test Plan - August 20, 2013
Table of Contents

1 Introduction
  1.1 Purpose
  1.2 Security Controls Assessment Background
  1.3 Assessment Process and Methodology
    1.3.1 Phase 1: Planning
    1.3.2 Phase 2: Assessment
    1.3.3 Phase 3: Reporting
2 Planning
  2.1 Federal Data Services Hub Background
    2.1.1 Overview of the Marketplace Information Technology (IT) Systems
    2.1.2 Federal Data Services Hub
    2.1.3 Description of the Business Process
  2.2 Assessment Scope
  2.3 Assessment Assumptions/Limitations
  2.4 Data Use Agreement
  2.5 Roles and Responsibilities
    2.5.1 Application Developer/Maintainer
    2.5.2 Business Owner
    2.5.3 CMS Facilitator
    2.5.4 CMS Government Task Lead
    2.5.5 Configuration Manager
    2.5.6 Contingency Planning Manager
    2.5.7 Database Administrator
    2.5.8 Information System Security Officer / System Security Officer
    2.5.9 Lead Evaluator
    2.5.10 Program Manager
    2.5.11 System Administrator
    2.5.12 System Owner
  2.6 Assessment Responsibility Assignment
  2.7 Physical Access and Work Area Requirements
3 Assessment
  3.1 Information Collection
    3.1.1 CMS FISMA Controls Tracking System (CFACTS) Name
    3.1.2 Documentation Requirements
    3.1.3 Script Output and Device Running Configuration Requirements
    3.1.4 Application Testing Requirements
  3.2 Enumeration
    3.2.1 Documentation Review
    3.2.2 Vulnerability Assessment Tools
  3.3 Testing and Review
    3.3.1 Interviews
    3.3.2 Observances
    3.3.3 Configuration Review
    3.3.4 Application Testing
    3.3.5 Database Server/Instance Testing
4 Reporting
  4.1 Security Controls Assessment Findings Spreadsheet
    4.1.1 Row Number
    4.1.2 Weakness
    4.1.3 Risk Level
    4.1.4 CMSR Security Control Family and Reference
    4.1.5 Affected Systems
    4.1.6 Ease-of-Fix
    4.1.7 Estimated Work Effort
    4.1.8 Finding
    4.1.9 Failed Test Description
    4.1.10 Actual Test Results
    4.1.11 Recommended Corrective Actions
    4.1.12 Status
  4.2 Reassignment of Findings
  4.3 Reporting Observations
  4.4 Reporting of SQL Injection and Cross-Site Scripting Vulnerabilities
  4.5 Test Reporting
5 Logistics
  5.1 Points of Contact
  5.2 Technical Staff Requirements
  5.3 Onsite Schedule
  5.4 Assessment Estimated Timeline
 
List of Tables

Table 1. Assessment Responsibilities
Table 2. Mandatory Pre-Assessment Documentation
Table 3. Documentation Required by Policy
Table 4. Expected/Supporting Documentation
Table 5. Additional Documentation
Table 6. Application Roles
Table 7. Findings Spreadsheet
Table 8. Risk Definitions
Table 9. Definition of Ease-of-Fix Rating
Table 10. Definition of Estimated Work Effort Rating
Table 11. MITRE Evaluation Team Points of Contact
Table 12. CMS Points of Contact
Table 13. Vendor Points of Contact
Table 14. MITRE Onsite Schedule
Table 15. Estimated Timeline for Assessment Actions and Milestones
 
List of Figures

Figure 1. Federal Data Services Hub Concept

1 INTRODUCTION
 
1.1 PURPOSE
 
This document describes the security controls assessment (SCA) methodology, schedule, and requirements that The MITRE Corporation (MITRE) will use to evaluate the Data Services Hub (DSH) major application. The goal of the SCA test plan is to explain clearly the information MITRE expects to obtain prior to the assessment, the areas that will be examined, and the proposed scheduled activities MITRE expects to perform during the assessment. This document is meant to be used by the Centers for Medicare & Medicaid Services (CMS) and Quality Software Services, Inc. (QSSI) technical managers, engineers, and system administrators responsible for system operations.

1.2 SECURITY CONTROLS ASSESSMENT BACKGROUND 
MITRE operates a federally funded research and development center (FFRDC) providing services to the government in accordance with the provisions and limitations defined in the Federal Acquisition Regulation (FAR) part 35.017. According to this regulation, in order for an FFRDC to discharge its responsibilities to the sponsoring agency, it must have access to government and supplier data (e.g., sensitive and proprietary data) and to employees and facilities beyond that which is common to the normal contractual relationship. As an FFRDC agent, MITRE is required to conduct its business in a manner befitting its special relationship with the government, to operate in the public interest with objectivity and independence, to be free from organizational conflicts of interest, and to have full disclosure of its affairs to the sponsoring agency.
MITRE is tasked by CMS to perform a comprehensive scope SCA in accordance with the CMS Information Security (IS) Authorization to Operate Package Guide, v2.0,1 for the DSH Major Application.

The SCA complies with federal standards, policies, and procedures including the Federal Information Security Management Act of 2002 (FISMA) and the security-related areas established and specified by the National Institute of Standards and Technology (NIST) Special Publication (SP) 800-53 Rev. 3, Recommended Security Controls for Federal Information Systems and Organizations,2 and the mandatory, non-waiverable Federal Information Processing Standards (FIPS) 200, Minimum Security Requirements for Federal Information and Information Systems.3 To comply with the federal standards, agencies must first determine the security category of their information system in accordance with the provisions of FIPS 199, Standards for Security Categorization of Federal Information and Information Systems,4 and then apply the appropriate set of minimum (baseline) security controls in compliance with NIST SP 800-53.
Furthermore, CMS developed and published the Information Security (IS) Acceptable Risk Safeguards (ARS) including CMS Minimum Security Requirements (CMSR) Version 1.5,5 CMS Policy for the Information Security Program (PISP),6 Business Partners Systems Security Manual Version 10.0 (BPSSM),7 and CMS Technical Reference Architecture (TRA) Version 2.1.8 The CMS ARS CMSR contains a broad set of required security standards based upon NIST SP 800-53 and NIST SP 800-63, Electronic Authentication Guideline,9 as well as additional standards based on CMS policies, procedures and guidance, other federal and non-federal guidance resources, and industry best practices. To protect CMS information and CMS information systems, the controls outlined in these policies must be implemented.

Footnotes:
1. http://www.cms.gov/Research-Statistics-Data-and-Systems/CMS-Information-Technology/InformationSecurity/Downloads/ATO_Package_Guide.pdf
2. http://csrc.nist.gov/publications/nistpubs/800-53-Rev3/sp800-53-rev3-final_updated-errata_05-01-2010.pdf
3. http://csrc.nist.gov/publications/fips/fips200/FIPS-200-final-march.pdf
4. http://csrc.nist.gov/publications/fips/fips199/FIPS-PUB-199-final.pdf
5. ARS CMSR Version 1.5 (July 31, 2012), https://www.cms.gov/Research-Statistics-Data-and-Systems/CMS-Information-Technology/InformationSecurity/Information-Security-Library.html
6. http://www.cms.hhs.gov/informationsecurity/downloads/PISP.pdf
7. http://www.cms.gov/manuals/downloads/117_systems_security.pdf (July 17, 2009)
8. TRA and Supplements can be found on the CMS internal website: http://cmsnet.cms.hhs.gov/hpages/oisnew/foffice/m/TRA.html (November 19, 2010)
9. http://csrc.nist.gov/publications/nistpubs/800-63-1/SP-800-63-1.pdf

1.3 ASSESSMENT PROCESS AND METHODOLOGY 
This section outlines MITRE's assessment methodology to verify and validate that the management, operational, and technical controls are appropriately implemented.
1.3.1 Phase 1: Planning

The first phase, Planning, defines the assessment's scope, identifies goals, sets boundaries, and identifies assessment activities. This phase, as well as subsequent phases, requires the coordination of all involved parties, including CMS, MITRE, and QSSI. During this phase, the MITRE Evaluation Team will review all security policies and procedures in accordance with the CMS security requirements previously noted. The team will then create assessment scenarios and premises and define agreeable assessment terms as approved by CMS.

1.3.2 Phase 2: Assessment

Phase 2 may have several steps depending on the assessment's objectives, scope, and goals set forth in the Planning Phase. These steps can be grouped by the nature of the activities involved. These activity groups are as follows:

• Information Collection: thorough research that must be performed against the target system/application before any meaningful assessment can be conducted. Data gathered is analyzed as the assessment proceeds and when the assessment is complete.

• Enumeration: activities that provide specific information about assessment targets. This information is often collected using appropriate software tools.

• Testing and Review: activities that typically involve the automated testing of security vulnerabilities via software tools, manual analysis, and the evaluation of particular aspects of the organization's security policies and practices by the MITRE Evaluation Team members. MITRE's evaluation goal is to apply experience and insight in order to determine whether the system adequately implements security controls defined by CMS policies and standards.

1.3.3 Phase 3: Reporting

Phase 3, Reporting, documents the soundness of the implemented security controls and consolidates all findings into the final output. This output includes reports that provide a summary of key findings and actionable recommendations, as well as provisions for all information derived from the assessment.

Depending on the results of these activities, it may be necessary to repeat appropriate phases. Throughout the entire process, the MITRE Evaluation Team will keep all involved parties informed of the progress and findings, as well as provide briefings on findings to CMS and QSSI staff. Evidence to support any weaknesses discovered will consist primarily of screen prints, script output, and session data. MITRE will immediately notify CMS and QSSI staff if significant or immediately exploitable vulnerabilities are discovered during the assessment.

2 PLANNING

This section contains information describing the application and environment that will be assessed, the scope of the assessment, any limitations, and the roles and responsibilities of staff who will participate in the assessment.
2.1 FEDERAL DATA SERVICES HUB BACKGROUND 
2.1.1 Overview of the Marketplace Information Technology (IT) Systems

The Affordable Care Act directs states to establish State-based Marketplaces by January 2014. In states electing not to establish and operate such a Marketplace, the Affordable Care Act requires the Federal government to establish and operate a Marketplace in the state, referred to as a Federally-facilitated Marketplace. The Marketplaces will provide consumers access to health care coverage through private, qualified health plans, and consumers seeking financial assistance may qualify for insurance affordability programs made available through the Marketplace.

The insurance affordability programs include the advance payment of the premium tax credits, cost-sharing reductions, Medicaid, and the Children's Health Insurance Program (CHIP). The advance payment of the premium tax credit may be applied automatically to the purchase of a qualified health plan through the Marketplace, reducing upfront the premiums paid by consumers. Cost-sharing reductions may also lower the amount a consumer has to pay out-of-pocket for deductibles, coinsurance, and copayments for a qualified health plan purchased through the Marketplace. In order to enroll in an insurance affordability program offered through a Marketplace, individuals must complete an application1 and meet certain eligibility requirements.2 Before we get further into this discussion, it is important to note that while the Marketplace application asks for personal information such as date of birth, name, and address, the Marketplace application never asks for personal health information, and the Marketplace systems will never access or store personal health information beyond what is normally asked for in Medicaid eligibility applications.

2.1.2 Federal Data Services Hub 
CMS has developed a tool, known as the Federal data services hub (the Hub), that provides an electronic connection between the eligibility systems of the Marketplaces and already existing, secure Federal and state databases to verify the information a consumer provides in their Marketplace application. Data transmitted through the Hub will help state agencies determine applicants' eligibility to enroll in Medicaid or CHIP, and help the Federally-facilitated and State-based Marketplace eligibility systems determine applicants' eligibility to seek health insurance coverage through a Marketplace, and their eligibility for advance premium tax credits and cost-sharing reductions. It is important to understand that the Hub is not a database; it does not retain or store information. It is a routing tool that can validate applicant information from various trusted government databases through secure networks. It allows the Marketplace, Medicaid, and CHIP systems to query the government databases used today in the eligibility processes for many state and Federal programs. The Hub would query only the databases necessary to determine eligibility for specific applicants. The Hub increases efficiency and security by eliminating the need for each Marketplace, Medicaid agency, and CHIP agency to set up separate data connections to each database.
CMS has already completed development and the majority of the testing of the Hub services required to support open enrollment in October 2013. CMS and the Internal Revenue Service (IRS) are currently testing the integration of the Hub with their systems, and this testing was … percent complete at the end of June. CMS started testing the Hub with the other Federal partners, including the Social Security Administration (SSA) and the Department of Homeland Security (DHS), earlier this summer, and that testing will be completed by the end of August. CMS is currently testing the Hub with states, and during the remainder of July and August, it will finish testing the Hub with the remaining states and territories.

2.1.3 Description of the Business Process

CMS's Center for Consumer Information and Insurance Oversight (CCIIO) Private Cloud

The DSH, or the Hub, supports business functions of the State-Based Exchanges (SBEs), Federally Facilitated Exchanges (FFEs), and Federal agencies. The Hub business functions are as follows:

• Facilitating the exchange of data between SBEs, FFEs, and Federal agencies

• Enabling verification of coverage eligibility

• Providing an aggregation point for the Internal Revenue Service (IRS) when querying for coverage information

• Providing data for oversight of the Exchanges

• Providing data for paying insurers

• Providing data for use in portals for consumers

As such, the Hub sits between SBEs, FFEs, and Federal agencies from a business process standpoint. Figure 1 depicts the basic Federal DSH concept.

Figure 1. Federal Data Services Hub Concept

To execute these functions, the Hub is dependent on data services provided by SBEs, FFEs, and Federal agencies. Each entity provides Web services available to the Hub for exchanging data, verifying coverage data, and determining eligibility. The Hub uses these Web services to answer requests from entities. The Hub selects the data sources to use when answering a request based on business rules. This may mean that the Hub uses multiple data sources to provide a single answer to a request, which the Hub then returns in a standard format to the requestor. By acting as a central exchange and translation point, the Hub enables the consolidation of security requirements, eliminating the need for each entity to negotiate trusted connections with each other entity. To provide these services to the requestors, the Hub needs to query different data sources for information. Listed below are the business input functions the Hub uses to answer these requests.
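The "multiple sources, one standard-format answer" behavior described above can be sketched as follows. The rule table, envelope fields, and function names are illustrative assumptions, not the Hub's actual interfaces.

```python
# Illustrative only: a request fans out to the data sources selected by
# business rule, and the partial answers are consolidated into a single
# standard response envelope for the requestor.

BUSINESS_RULES = {
    "verify_income": ["IRS", "SSA"],        # one answer may need several sources
    "verify_lawful_presence": ["DHS"],
}

def answer_request(request, query_fn):
    sources = BUSINESS_RULES[request["type"]]       # business-rule source selection
    partials = {s: query_fn(s, request) for s in sources}
    # Consolidate into one standard envelope, whatever the sources were.
    return {
        "request_id": request["id"],
        "sources_used": sources,
        "verified": all(partials.values()),
    }

resp = answer_request(
    {"id": "req-001", "type": "verify_income"},
    query_fn=lambda source, req: True,   # stub for the agency Web services
)
```

The requestor never sees the individual agency connections, which is the consolidation-of-security-requirements point made in the text.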
Business Input Function                     Function Source(s)
Provide individual coverage data           SBE, FFE
Provide income data                        IRS, Social Security Administration (SSA)
Provide immigration and citizenship data   Department of Homeland Security (DHS)
Provide incarceration data                 DHS
Provide current coverage data              United States Department of Veterans Affairs (VA), TRICARE, Medicaid, Medicare

The Hub provides Web services that requestors may use to take actions or request data from various data sources. Each endpoint acts as a business process. The table below lists the business output functions the Hub provides.
Business Output Function      Supporting Business Process(es)
Processing/Calculation        Account Transfer; Advance Payment Computation (APC); Communicate Eligibility Determination
Verification of eligibility   Verify Annual Household Income (HHI) and Family Size; Verify Current HHI; Verify Incarceration Status; Verify Lawful Presence (VLP); Verify Non Employer-Sponsored Insurance (ESI) Minimum Essential Coverage (MEC)

The table below provides a description of each of the supporting business processes.
 
Account Transfer: The Account Transfer Business Service facilitates the transfer of accounts from the requestor to Medicaid/CHIP, or from Medicaid/CHIP to the requestor, for eligibility determination. This service supports Exchange-determined Medicaid eligibility based on modified adjusted gross income (MAGI). The Exchange assesses potential Medicaid eligibility based on MAGI and then assesses non-eligibility for Medicaid/CHIP based on MAGI. However, when the individual requests a full Medicaid/CHIP determination, the Exchange assesses potential eligibility for Medicaid based on factors other than MAGI. Additionally, Medicaid/CHIP determines non-eligibility for Medicaid/CHIP. For each of these scenarios, the Exchange or Medicaid/CHIP initiates the same Account Transfer Business Service request to the Hub, which forwards the account to the appropriate agency. The receiving agency performs an eligibility determination for each scenario and returns the eligibility response, as necessary, to the initiator.
Advance Payment Computation: The APC Business Service performs Advance Payment of the Premium Tax Credit (APTC) calculations, determining the maximum amount of monthly APTC for which a household is eligible. The service communicates the applicant's household income, percentage of the Federal Poverty Level (FPL), coverage year, adjusted monthly premium for the Second Lowest Cost Silver Plan (SLCSP), and request identifier (ID) to IRS. In the event that the IRS system is down or offline, the Hub performs the APTC calculation for a new application or update during the benefit year. The Hub maintains the applicable percentage table for each coverage year and updates the table for each year after 2014. CMS staff manually triggers updates. The Hub returns a flag to the requesting party indicating whether IRS or the Hub performed the calculation.
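The IRS-offline fallback described for the APC service can be sketched as follows. Function names, the request layout, and the stand-in calculations are assumptions for illustration; only the fallback-plus-flag behavior comes from the document.

```python
# Illustrative only: when IRS is unreachable, the Hub computes the APTC
# itself (from its applicable percentage table) and flags who performed
# the calculation for the requesting party.

def advance_payment_computation(request, irs_available, irs_calc, hub_calc):
    if irs_available:
        return {"aptc": irs_calc(request), "calculated_by": "IRS"}
    # IRS down/offline: fall back to the Hub's own calculation.
    return {"aptc": hub_calc(request), "calculated_by": "Hub"}

result = advance_payment_computation(
    {"household_income": 40000, "coverage_year": 2014},
    irs_available=False,
    irs_calc=lambda r: 250.0,   # stand-in for the IRS calculation
    hub_calc=lambda r: 245.0,   # stand-in for the Hub's table lookup
)
```

The returned flag is what lets the requestor know which party produced the number, matching the description above.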
Communicate Eligibility Determination: The Communicate Eligibility Determination Business Service facilitates the storing/writing of individuals' eligibility determination information from the various exchanges (FFE or SBEs, Medicaid/CHIP) to the CMS common data store (Federal Exchange Program System (FEPS)). Requestors initiate the same service request to the Hub, which stores/writes the individuals' eligibility determination information to the CMS common data store. These requests, with multiple individual records, generally involve the generation and processing of batch (asynchronous) requests by the requestors.
Verify Annual Household Income and Family Size: The Verify Annual HHI and Family Size Business Service retrieves tax return information from IRS for use in evaluating taxpayer eligibility and enrollee continued eligibility for insurance affordability programs. The Exchange initiates the service request to the Hub, which forwards the request to IRS. The request communicates the applicant's full name, Social Security Number (SSN) or Adoption Taxpayer Identification Number (ATIN), and Date of Birth (DOB) to IRS. The Hub adds a name control number before submitting the request to the IRS. IRS provides the Hub with the most recent tax return information on file. For example, if the eligibility determination occurs in late 2013 for coverage in 2014, IRS looks first for a 2012 tax return. If no such return is available, IRS may provide information from a 2011 tax return, if a 2011 return is on file. Upon response receipt, the Hub forwards the information back to the requesting party.
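The tax-year fallback in the description above (prefer the prior-year return, else the year before that) can be sketched as follows; the data layout and function name are assumptions for illustration.

```python
# Illustrative only: IRS returns the most recent return on file, preferring
# determination_year - 1 (e.g. 2012 for a late-2013 determination) and
# falling back one more year if that return is not available.

def most_recent_return(returns_on_file, determination_year):
    for year in (determination_year - 1, determination_year - 2):
        if year in returns_on_file:
            return year, returns_on_file[year]
    return None, None  # no usable return on file

# Late-2013 determination with only a 2011 return on file:
year, info = most_recent_return({2011: {"agi": 35000}}, 2013)
```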

Verify Current Household Income: The Verify Current HHI Business Service retrieves the Social Security benefit amount from SSA, quarterly wage information from the trusted data source (TDS), and unemployment insurance income from the TDS. The service uses this information to evaluate applicant eligibility and enrollee continued eligibility for insurance affordability programs by communicating the individual's full name, SSN, DOB, gender, and State to the TDS(s), which provide the Hub with the most recent income information on file at the time of the request.
Verify Incarceration Status: The Verify Incarceration Status Business Service assists in determining eligibility by communicating an individual's full name, DOB, and SSN to SSA to verify the applicant's incarceration status. The requestor calls the Verify Incarceration Status Business Service when an applicant attests that he/she is not currently incarcerated and inputs an SSN. The Hub then translates the information disclosed by SSA into an incarceration status of Yes, No, or Undisclosed, depending on the combination of information received from SSA by the Hub.
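The three-way translation described above can be sketched as a small function. The shape of the SSA response is a hypothetical stand-in; only the Yes/No/Undisclosed outcome comes from the document.

```python
# Illustrative only: map whatever SSA discloses onto the three-way
# incarceration status the Hub returns to the requestor.

def translate_incarceration(ssa_response):
    if not ssa_response or "incarcerated" not in ssa_response:
        return "Undisclosed"   # SSA disclosed nothing usable
    return "Yes" if ssa_response["incarcerated"] else "No"
```

For example, an empty or missing SSA response translates to "Undisclosed" rather than an error, so the requestor always receives one of the three statuses.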
Verify Lawful Presence: The VLP Business Service retrieves immigration status from DHS for use in evaluating eligibility determinations made by the Exchange, and verification of information for participation in Medicaid, the Children's Health Insurance Program, and the Basic Health Program (BHP). Requestors use this transaction to perform an initial alien status verification using a combination of Alien Number, I-94 Number, Student and Exchange Visitor Information System (SEVIS) ID, Visa Number, Passport Number, Receipt Number, Naturalization Number, and Citizenship Number. DHS processes these requests and responds to the Hub using Agency3InitVerifResp responses. This results in the creation of a DHS case number. The Hub passes this response to the requestor and includes a translation for the LawfulPresenceVerified and FiveYearBarIndicator responses. Additionally, the system can use Portable Document Format (PDF) Binary Files with this service to exchange forms between DHS and the requestor. The requestor is also able to make a separate call to close an open case, even if there has not been a resolution.
Verify Non-Employer Sponsored Insurance Minimal Essential Coverage  The Verify Non-ESI MEC Business Service determines whether the individual is already eligible for MEC through public health plans, including Medicaid, CHIP, BHP, Medicare, the Veterans Health Program (VHP), TRICARE, and the Peace Corps. An eligibility determination for any one of these programs deems the individual ineligible for the Exchange, APTC, and Cost-Sharing Reductions (CSRs). The Exchange accepts the request for verification, triggered by an individual seeking eligibility to enroll in a Qualified Health Plan (QHP), requesting financial assistance, and attesting to not being eligible for any of the public health plans: Medicaid, CHIP, BHP, Medicare, TRICARE, VHP, or the Peace Corps. A change in eligibility for other public health plans can also initiate a trigger, if the eligibility determination for any MEC plan changes due to (for example) loss of Medicare coverage. This service then verifies that the person is not eligible for that particular plan.  
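To make the translation step in the Verify Incarceration Status service concrete, the sketch below shows one way such a mapping could look. It is purely illustrative: the response field names are hypothetical, since this plan does not describe the actual SSA response schema.

```python
# Hypothetical sketch of the Hub translating an SSA disclosure into an
# incarceration status of "Yes", "No", or "Undisclosed". The field names
# below are invented for illustration; the real SSA response schema is
# not described in this test plan.

def translate_incarceration_status(ssa_response):
    data = ssa_response.get("incarceration_data")
    if data is None:
        # SSA disclosed no usable incarceration information
        return "Undisclosed"
    return "Yes" if data.get("currently_incarcerated") else "No"

print(translate_incarceration_status({}))  # Undisclosed
print(translate_incarceration_status(
    {"incarceration_data": {"currently_incarcerated": False}}))  # No
```

The essential point, per the service description above, is that the requestor never sees the raw SSA disclosure, only the Hub's three-valued translation.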

2.2 ASSESSMENT SCOPE 
MITRE is tasked with providing a comprehensive SCA to determine if the Federal Data Services Hub (DSH) major application has properly implemented CMS security standards. According to 
Federal Data Services Hub (DSH) Security Controls Assessment Test Plan, August 20, 2013

The SCA will examine the management, operational, and technical controls that support the DSH to ensure adherence to the security level specifications of the CMS ARS CMSR, PISP, BPSSM, and TRA. To adequately perform the SCA, MITRE anticipates that the MITRE Evaluation Team will be onsite from Monday, August 19, 2013 through Friday, August 30, 2013.

The scope of the SCA will be the [Not Responsive] environment that is located at the [Not Responsive] and shall be the following:

 Documentation and interviews to encompass the Management and Operations of the DSH

 A sampling of the VMs will be examined. There is a base assumption that all of the VMs are configured the same and therefore sampling is sufficient. [Not Responsive] VMs are running [Not Responsive], and [Not Responsive] are running [Not Responsive]

 [Not Responsive] configuration and interaction [Not Responsive]
MITRE will also determine if DSH management and support personnel have an understanding of the CMS Information Security (IS) ARS including CMSR Version 1.5, CMS Technical Reference Architecture, Version 2.1 (TRA), United States Government Configuration Baselines (USGCB) and the National Checklist Program (NCP), CMS PISP, and BPSSM, as appropriate. 
Application testing will be performed in the [Not Responsive] environment in adherence to the CMS Information Security (IS) Assessment Procedure, Version 2.0, which establishes a uniform approach for the conduct of testing of the CMS Information Systems for major applications and their underlying component application systems. The following CMS ARS CMSR security control families will be the focus for testing: 
Comprehensive Scope Application SCA: 
	 Access Control (AC) except AC-4, AC-16, AC-17, AC-18, AC-19, AC-20, and AC-CMS1 

Footnotes: http://usgcb.nist.gov/ and http://web.nvd.nist.gov/view/ncp/repository; http://www.cms.hhs.gov/informationsecurity/downloads/Assessment_Procedure.pdf (March 19, 2009). 

 Access Control (AC) 
 Awareness and Training (AT) 
 Audit and Accountability (AU) 
 Security Assessment and Authorization (CA) 
 Configuration Management (CM) 
 Contingency Planning (CP) 
 Identification and Authentication (IA) 
 Incident Response (IR) 
 Maintenance (MA) 
 Media Protection (MP) 
 Physical and Environmental Protection (PE) 
 Planning (PL) 
 Personnel Security (PS) 
 Risk Assessment (RA) 
 System and Services Acquisition (SA) 
 System and Communications Protection (SC) 
 System and Information Integrity (SI) 

2.3 ASSESSMENT ASSUMPTIONS/LIMITATIONS 
MITRE has identified the following limitations to the planned assessment: 

 The application being tested in the PreProd environment is functionally equivalent to the application deployed in the production environment. 

 QSSI and IDL staff will provide timely responses to MITRE requests for information, access to systems to perform scans and testing, and subject matter experts as documented in the SCA test plan. 

 The MARS-E v1.0 is a subset of the CMSR, so the CMSR will cover the MARS-E v1.0. 

 MITRE will not specifically evaluate the DSH against IRS Publication 1075 for Federal Tax Information (FTI). 

 MITRE will collaborate with Booz Allen Hamilton (BAH) as needed and as directed by the GTL. 

 
Out of Scope: 

 [Not Responsive] monitors for system availability and performance 

 [Not Responsive] tool for build automation 

 [Not Responsive] code quality scanning tool 

 Jump Servers, covered by the Platform as a Service (PaaS) 

2.4 DATA USE AGREEMENT 
The Data Use Agreement (DUA), form CMS-R-0235, must be executed prior to the disclosure of data from the CMS Systems of Records to ensure that the disclosure will comply with the requirements of the Privacy Act, Privacy Rule, and CMS data release policies. It must be completed prior to the release of, or access to, specified data files containing protected health information (PHI) and individual identifiers. MITRE has completed and signed this agreement with CMS; Reference DUA number 19317; expiration date July 31, 2013. 

2.5 ROLES AND RESPONSIBILITIES 
To prepare for the assessment, the organization(s) and MITRE will identify personnel associated with specific responsibilities. Individuals may have responsibilities that span multiple roles or have knowledge pertaining to the implementation of more than one security control area. This section provides a description of the roles and responsibilities to assist the organization(s) and MITRE in determining the appropriate personnel who should be available for the assessment. 
2.5.1 Application Developer/Maintainer 
The Application Developer/Maintainer shall have thorough knowledge of the application security control requirements for the system and their implementation to protect the software application and its data in transit and at rest, as well as the implementation and configuration standards utilized by the organization. These controls may include access control, audit and accountability, user identification and authentication, software code configuration control, application integrity, and communications protection. During the SCA process and onsite assessment, the Application Developer/Maintainer shall be available for planning sessions, interviews, application discussions, providing assistance for using the application, providing documentation under their control, and remediating any weaknesses. 

2.5.2 Business Owner 
The Business Owner is responsible for the successful operation of the system and is ultimately accountable for system security. The Business Owner defines the system's functional requirements, ensures that Security Accreditation (previously referred to as Certification and Accreditation [C&A]) activities are completed, maintains and reports the Plan of Action and Milestones (POA&M), and ensures that resources necessary for a smooth assessment are made available to the MITRE Evaluation Team (Assessment Contractor). During the SCA process and onsite assessment, the Business Owner shall be available for planning sessions, interviews, system discussions, providing documentation, and providing assistance when necessary (access, contacts, decisions, etc.). In some cases the Business Owner may be the System Owner. 

2.5.3 CMS Facilitator 
The CMS Facilitator is a member of the CMS SCA Team staff responsible for scheduling and communicating information on all planning and coordinating meetings as well as out-briefs associated with the SCA. The CMS Facilitator reserves work space for testing when the tests are conducted at CMS facilities. In addition, the CMS Facilitator coordinates the logistics between the CMS SCA Team and SCA Stakeholders (application developers, maintainers, technical support, business owners, etc.). The CMS Facilitator is responsible for initiating application and system access for the test accounts used during the assessment. At the conclusion of the assessment, the CMS Facilitator accepts the Security Controls Assessment Report, distributes the final report to SCA Stakeholders, and generates the cover letter associated with it. 

2.5.4 CMS Government Task Lead 
The CMS Government Task Lead (GTL) is the CMS representative for the Application Developer/Maintainer and is responsible for providing technical information to the SCA Team. During the SCA process and onsite assessment, the GTL shall be available for planning sessions and interviews with their Application Developer/Maintainer, assisting the Application Developer during application discussions, providing assistance for using the application, and directing the Application Developer/Maintainer to remediate any weaknesses. 

2.5.5 Configuration Manager 
The Configuration Manager shall be able to describe the policy, processes, procedures, standards, and technical measures utilized for configuration management and change control in order to maintain a secure system baseline. The Configuration Manager shall be able to provide details of the application specific or system/enterprise configuration/change control processes and documentation, including identification, the configuration/change management plan, status accounting, and audit procedures. The baseline could include, but is not limited to, software configuration, network infrastructure configuration, and application design and development resources. During the SCA process and onsite assessment, the Configuration Manager shall be available for interviews and to provide documentation under the Configuration Manager's responsibility. 

2.5.6 Contingency Planning Manager 
The Contingency Planning Manager develops the Contingency Plan for system recovery and works with the Business Owner and System Owner to determine the critical components and appropriate system recovery strategy based on the business impact analysis, system recovery time objective (RTO), and recovery point objectives (RPO). The Contingency Planning Manager develops and maintains the Contingency Plan for the system, ensuring that testing of the plan is completed based on the organizational and business requirements. During the SCA process and onsite assessment, the Contingency Planning Manager shall be available for interviews and to provide the System Contingency Plan documentation and update process, system contingency testing schedule, and system contingency plan test reports. 

2.5.7 Database Administrator 
The Database Administrator(s) shall have thorough knowledge of the database software and the databases that support the system, as well as the implementation and configuration standards utilized by the organization for the software and databases. The Database Administrator shall be able to describe the processes and procedures for installing, supporting, and maintaining the database software and databases, including secure baseline installation, access control, identification and authentication, backup and restoration, and flaw remediation. During the SCA process and onsite assessment, the Database Administrator shall be available for interviews, database discussions, execution of scripts to collect configuration details, providing documentation when necessary, and remediation of any weaknesses. 

2.5.8 Information System Security Officer or System Security Officer 
The Information System Security Officer (ISSO) or System Security Officer (SSO) is responsible for ensuring that the management, operational, and technical controls to secure the system are in place and effective. The ISSO shall have knowledge of the following: 

	 All controls implemented or planned for the system 

	 Security audit controls and evidence that audit reviews occur 

	 System Security Plan (SSP) and any authorized exceptions to security control implementations 

The ISSO shall be responsible for all security aspects of the system from its inception until disposal. During the SCA process and onsite assessment, the ISSO plays an active role and partners with the CMS Facilitator to ensure a successful SCA. The ISSO shall be available for interview; provide or coordinate the timely delivery of all required SCA documentation; and coordinate and schedule interviews between the SCA Team and SCA Stakeholders. The ISSO is designated in writing and must be a CMS employee. 

2.5.9 Lead Evaluator 
The Lead Evaluator is a member of the MITRE Evaluation Team and is responsible for understanding CMS policies, standards, procedures, system architecture, and structures. The Lead Evaluator limits activities to within the SCA scope; reports all vulnerabilities that may impact the overall security posture of the system; refrains from conducting any assessment activities that she/he is not competent to carry out or from performing in a manner which may compromise the information system being assessed; and coordinates getting information, documentation, and/or issues addressed between the MITRE Evaluation Team, the CMS Facilitator, and the SCA Stakeholders. The Lead Evaluator must develop the Assessment Plan; modify the testing approach, when necessary, according to the scope of the assessment; prepare the daily agenda and preliminary findings worksheets and conduct the Onsite Assessment briefings; and prepare the Security Controls Assessment Report (e.g., Findings Report) to communicate how the CMS business mission will be impacted if an identified vulnerability is exploited. 

2.5.10 Program Manager 
The Program Manager shall have a high-level understanding of the assessed system, as well as the ability to describe organizational and system policies from an enterprise perspective, with which the system shall be in compliance. The Program Manager shall be familiar with access controls, both physical and logical, contingency plans (i.e., alternate sites/storage, system restoration and reconstitution), user identification and authentication, system authorization to operate, incident response, resource planning, system and software acquisition, flaw remediation, and system interconnections and monitoring. During the SCA process and onsite assessment, the Program Manager shall be available for interview and to provide documentation that falls under the Program Manager's responsibility. 

2.5.11 System Administrator 
The System Administrator(s) should have thorough knowledge of the operating systems for which they are responsible, as well as the implementation and configuration standards utilized by the organization for those operating systems. The System Administrator(s) should be able to describe the processes and procedures for installing, supporting, and maintaining the operating systems, including secure baseline installation, access control, identification and authentication, backup and restoration, flaw remediation, and use of antivirus products. During the assessment, the System Administrator(s) should be available to establish access to the system and for interviews, system discussions, execution of scripts to collect configuration details, and remediation of any weaknesses found that could be corrected within the assessment timeframe. 

2.5.12 System Owner 
The System Owner is responsible for the successful operation of the system and is accountable for system security. The System Owner is also responsible for executing crucial steps to implement management and operational controls and to ensure that effective technical controls are implemented to protect the system and its data. The System Owner formally designates the ISSO. In conjunction with the Business Owner, the System Owner is responsible for ensuring that Security Accreditation activities are completed and the POA&M is maintained and reported. During the SCA process and onsite assessment, the System Owner shall be available for interview and, with the assistance of the system's support staff, ensure that all documentation required for the assessment is available to the SCA Evaluator. The System Owner may be the Business Owner. 

2.6 ASSESSMENT RESPONSIBILITY ASSIGNMENT 
For this assessment, MITRE, CMS, QSSI and IDL staff names have been associated with specific roles and corresponding responsibilities. The Business Owner may delegate their responsibilities during the engagement, but the name of the delegated individual should be updated in Table 1, which provides details of the responsibilities for the assessment based on the identified roles and responsibilities provided in the preceding Section 2.5, Roles and Responsibilities. 
Table 1. Assessment Responsibilities 
Name  Organization  Role  
Kirk Grothe  CMS/OIS/CIISG  Application Developer  
Monique Outerbridge  CMS/OIS/CIISG  Business Owner  
Darrin Lyles  CMS/OIS/CIISG  CMS Facilitator (Lead)  
Hung Van  CMS/OIS/CIISG  CMS Government Task Leader  
Denis Mirskiy/Murali Kotnana (Oracle); Denis Mirskiy/Rupinder Singh (MarkLogic)  QSSI  Database Administrator  
Tom Schankweiler  CMS/OIS  SSO  
Darrin Lyles  CMS/OIS  ISSO  
Jim Bielski  MITRE  Lead Evaluator  
Karlton Kim  QSSI  Project Manager  
Jagadish Gangahanumaiah  QSSI  Deputy Project Manager  
Balaji Gudi  QSSI  System Linux Administrator  

Sid Telang  QSSI  Release Manager - Development  
Priya Aluru  QSSI  Release Manager - Production  
David Holyoke  QSSI  Chief Architect  
Kamesh Thota  QSSI  Security Architect  
Dan McGuire  QSSI  Infrastructure Manager  
Roy Mardis  QSSI  Infrastructure Configuration Manager  
Mike Battles/Chris Mason  QSSI  Infrastructure Engineer (RH SME)  

2.7 PHYSICAL ACCESS AND WORK AREA REQUIREMENTS 
[Not Responsive] can be data provided by the infrastructure support contractor URS, and MITRE will perform independent testing as required. In addition, data [Not Responsive]. MITRE requires access to various systems, networks, infrastructure, and facilities. The MITRE Evaluation Team may require network access to the Internet. For network scans, MITRE will [Not Responsive] review MITRE Scripts and data [Not Responsive] available for review as part of the continuous monitoring program. To expedite and facilitate testing, each MITRE staff member performing testing will utilize two laptops. A work area for these individuals needs to be established and must include power, a table, and chairs. In addition, MITRE staff will require a work area and telecommunication services for conducting interviews and analyzing data. 

3 ASSESSMENT 
This section contains information describing the activities performed during the assessment for information collection, enumeration, testing and review. 
3.1 INFORMATION COLLECTION 
MITRE will require access to documentation, operating system and network configuration data, and application information in order to begin the assessment. 
3.1.1 CMS FISMA Controls Tracking System (CFACTS) Name 
To ensure that the final security controls/findings worksheet can be properly loaded into the CMS FISMA Controls Tracking System (CFACTS) at the end of the assessment, MITRE must have the correct system name as contained within CFACTS. This system name will be used to correctly populate the System Name field in the Final Management Worksheet delivered with the Final Report. 

Acronym: DSH
 

3.1.2 Documentation Requirements 
MITRE must obtain the documentation requested one week prior to the onsite Assessment Kick-off meeting. In order to effectively perform the assessment and prevent delays during the SCA, MITRE must receive the following information that pertains to the application and/or system under evaluation prior to arriving onsite. Failure to receive this information in a timely manner will impact the assessment's quality and MITRE's ability to determine whether management, operational, and technical controls have been implemented properly. To assist MITRE in determining the completeness of this information and to serve as a checklist, CMS, QSSI and IDL should use Tables 2–5 as guides and include any comments that may be applicable (e.g., new system being accredited, no SSP Accreditation Form provided, Configuration Management Plan included in the SSP, server Internet Protocol (IP) addresses, and network diagram included in the System Design Document [SDD]). The documentation is broken into four categories: 
	 Mandatory Pre-Assessment Documentation 

	 Documentation Required by Policy (e.g., PISP or Integrated Investment and System Life Cycle Framework [Integrated Life Cycle (ILC) Framework]) 

	 Expected/Supporting Documentation 

	 Additional Documentation 

Mandatory Pre-Assessment Documentation: The documents in Table 2, Mandatory Pre-Assessment Documentation, should be provided within a week after the preliminary call (or within the agreed upon timeframes noted in the preliminary call meeting minutes) for use in the development of the draft test plan. These can be draft documents if necessary, but final versions must be provided at least one week prior to the on-site assessment. Failure to receive these documents could affect the quality of the assessment and it would be an ineffective and inefficient use of funds for the assessment to continue. Starting in August 2012, there may also be additional funding required before the onsite testing can proceed if all requirements are not addressed prior to the scheduled testing date. However, there may be special cases in which CMS wants the evaluator to proceed without all the documentation, such as a FISMA one-third SCA or if CMS believes a project/system/application is placing CMS at such great risk that funding may be pulled. For the latter, CMS will request the evaluator's advice on the risk that is posed. 
Table 2. Mandatory Pre-Assessment Documentation 
Document Element  Document/Information Requested  ARS CMSR  Policy  Comments  
D01  Information System Risk Assessment (IS RA)  RA-3 Risk Assessment  ILC Framework CMS PISP CMSR  
D02  System Security Plan (SSP) SSP Workbook  PL-2 System Security Plan CA-4 Security Certification  ILC Framework CMS PISP FISMA CMSR  
D03  Privacy Impact Assessment (PIA)  PL-5 Privacy Impact Assessment  ILC Framework CMSR  
D04  Contingency Plan  CP-2 Contingency Plan  ILC Framework CMSR  
D05  Uniform Resource Locators (URL) to all Web application interfaces within scope of assessment (if not documented in the SDD, VDD, or SSP)  SA-5 Information System Documentation  CMSR   

Documentation Required by Policy: CMS Policy requires that a system or application have the following documents listed in Table 3. The absence of these documents is handled in a uniform manner. For example, policy requires document D12, Baseline Security Configurations, to be completed, and if it does not exist, the absence of the document will result in a finding, assuming the security control is in scope for the assessment. 
Table 3. Documentation Required by Policy 
Document Element  Document/Information Requested  ARS CMSR  Policy  Comments  
D06  System Design Document (SDD)  SA-3 Life Cycle Support  ILC Framework CMSR  
D07  Version Description Document (VDD)  SA-3 Life Cycle Support  ILC Framework CMSR  
D08  Interconnection agreements, Memorandum of Understanding (MOU) and/or Interconnection Security Agreement (ISA)  CA-3 Information System Connections SA-9 External Information System Services  CMSR   
D09  Rules of Behavior (RoB). Include evidence that RoBs have been acknowledged/signed by users  PL-4 Rules of Behavior  CMSR  
D10  Contingency Plan Test  CP-4 Contingency Plan Testing and Exercises  ILC Framework CMSR  
D11  Configuration and change management process. Include examples of change requests (CR) from request to implementation in production  CM-3 Configuration Change Control CM-4 Monitoring Configuration Changes CM-5 Access Restrictions for Change  CMSR  May be documented in the SSP; verify that the level of detail is acceptable.  
D12  Baseline security configurations for each platform and the application within scope and baseline network configurations  CM-2 Baseline Configuration CM-6 Configuration Settings  CMSR   
D13  Security awareness and training (AT) material including evidence staff who have completed training  AT-1 Security Awareness and Training Policy and Procedures AT-2 Security Awareness AT-3 Security Training AT-4 Security Training Records AT-5 Contacts with Security Groups and Associations  CMSR   
D14  Incident response (IR) procedures. Include evidence of simulations or actual execution of procedures  IR-1 Incident Response Policy and Procedures IR-2 Incident Response Training IR-3 Incident Response Testing and Exercises IR-4 Incident Handling IR-5 Incident Monitoring IR-6 Incident Reporting IR-7 Incident Response Assistance  CMSR  Not Applicable, inherited control from the PaaS  
D15  Documentation describing the types audit logging that enabled and the established rules for log review and reporting  AU-6 Audit Monitoring, Analysis, and Reporting  CMSR   
D16  Open Corrective Action Plans (CAP) items from previous security controls assessments  CA-5 Plan Action and Milestones (POAM)  CMSR  When applicable  

Document Element  Document/Information Requested  ARS CMSR  Policy  Comments  
D17  System of Records Notice (SORN)  PL-5  ILC Framework CMSR  See the Master Health Insurance Exchange SORN 09-70-0560  

Expected/Supporting Documentation: Table 4 provides a list of other supporting documents that are applicable to an application or system. Although these documents are not specifically required by security policy, the documents should exist based on the CMS ILC and should be provided to MITRE during the assessment, as they may be helpful in performing the assessment, determining any special circumstances or permissions that vary from the CMS standards, and also used as substantiating artifacts. 
Table 4. Expected/Supporting Documentation 
Document Element  Document/Information Requested  ARS CMSR  Policy  Comments  
D18  Operations & Maintenance (O&M) Manual  SA-5 Information System Documentation  ILC Framework CMSR  
D19  Application or system (depending on assessment's scope) backup and storage requirements and procedures. In addition, include data retention and media handling/sanitization procedures  CP-6 Alternate Storage Site CP-9 Information System Backup MP-4 Media Storage MP-6 Media Sanitization and Disposal  CMSR  May be documented in the SSP  
D20  Detailed system/network architecture diagrams with IP addresses of devices that will be within scope of assessment (if not documented in the SDD, VDD, or SSP)  SA-5 Information System Documentation  CMSR  May be documented in the SSP  
D21  Security processes, including application account creation and account review policy, password policy, and malicious code, mobile code, and antivirus policy. For password management, ensure policies cover both end user access as well as user accounts used for production operations  AC-1 Access Control Policy and Procedures IA-1 Identification and Authentication Policy and Procedures  CMSR   
D22  CMS Security Certification Form (if system was previously authorized)  CA-6 Security Authorization  CMSR  When applicable  
D23  Technical Review Board (TRB) and TRA letters, to include all PDR, DDR and ORR documentation. Primarily for major updates and new applications  CM-3 Configuration Change Control  CMSR  Required to determine variances from the CMS Policies and Standards  

Additional Documentation: Additional documentation in Table 5 may be requested during the assessment, depending on the system/application being assessed. 
Table 5. Additional Documentation 
Document Element  Document/Information Requested  ARS CMSR  Policy  Comments  
D24  Administrator/Operator and User manuals or training materials (if not documented in the SDD, VDD, or SSP)  SA-5 Information System Documentation  ILC Framework CMSR  

3.1.3 Script Output and Device Running Configuration Requirements 
MITRE must obtain the database and operating system script output one week prior to the onsite assessment Kick-off meeting. Having the script output prior to the onsite assessment enables MITRE to immediately begin reviewing configuration settings and identifying areas that may require further analysis. Failure to receive the output prior to the MITRE Evaluation Team arriving onsite will impact the assessment's quality and MITRE's ability to determine whether management, operational, and/or technical controls have been implemented properly. As-Is system implementation documentation, including build documents and configuration scripts for servers, will be collected and analyzed. 
3.1.4 Application Testing Requirements 
For the list of available Hub Web Services (WSDLs), MITRE will refer to [Not Responsive]. 

Any further testing that can be performed or demonstrated will be performed against the [Not Responsive] environment. Based on the defined assessment scope, the application roles and responsibilities/privileges are listed in Table 6. 
Table 6. Application Roles 

Role  Description  Privileges  
Administrator  Administers access control and security functions for the application  Read, write, and execute for all application data  

[Not Responsive] 

The following URL, [Not Responsive], is required to access the Messaging. 

The MITRE Team Lead will inform the Business Owner, CMS contractors, and CMS Facilitator when application testing is complete. Following testing, the Business Owner is expected to initiate the process to de-allocate the security access provided to the MITRE test accounts. 

3.2 ENUMERATION 
MITRE will use various methods and tools to enumerate the system and security policies. 
3.2.1 Documentation Review 
Prior to and during the assessment, the MITRE Evaluation Team will review documents provided by CMS, QSSI and IDL. The review will assess whether appropriate management and operational controls have been implemented; however, it will also be used to augment technical controls. For example, if the ARS CMSR stipulates that the password length for the information system is required to be eight characters and the SSP documents that the length of passwords is eight characters, the technical assessment will confirm whether passwords are configured to be eight characters in length. As part of the assessment and when feasible, MITRE will evaluate the adequacy and completeness of the SSP, Information Systems Risk Assessment (IS RA), and Contingency Plan in accordance with CMS guidelines and provide feedback. In general, the MITRE Evaluation Team will review, but is not limited to, the following sample set of documentation: SSP, IS RA, and Contingency Plan. For the complete documentation list, refer to Section 3.1.2. During the onsite assessment, MITRE will provide written evaluations of the IS RA, SSP, and Contingency Plan and use these evaluation documents as a basis for interview, discussion, and clarification. 
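The documented-versus-configured comparison described above can be sketched in code. The function and parameter names below are hypothetical; only the eight-character ARS CMSR requirement comes from the example in this section.

```python
# Illustrative sketch of the cross-check described above: the ARS CMSR
# requirement, the SSP-documented value, and the configured value for a
# setting (here, minimum password length) are compared, and any mismatch
# becomes a candidate finding. Function and parameter names are hypothetical.

def check_password_length(ssp_documented_min, configured_min, ars_required_min=8):
    """Compare policy, documentation, and configuration; return findings."""
    findings = []
    if ssp_documented_min < ars_required_min:
        findings.append("SSP documents a minimum below the ARS CMSR requirement")
    if configured_min < ars_required_min:
        findings.append("System is configured below the ARS CMSR requirement")
    if configured_min != ssp_documented_min:
        findings.append("Configuration does not match the SSP documentation")
    return findings

print(check_password_length(ssp_documented_min=8, configured_min=8))  # []
print(check_password_length(ssp_documented_min=8, configured_min=6))
```

A compliant system yields no findings; a configuration weaker than both the policy and the SSP yields two, mirroring how documentation review augments the technical assessment.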

3.2.2 Vulnerability Assessment Tools 
MITRE will work with CMS, QSSI and IDL staff to verify and determine that industry standard best practices are reflected in the CMS system architecture design. To the extent possible, the work performed in this task will be accomplished using MITRE-furnished auditing equipment. The MITRE Evaluation Team may use the following tools during the assessment: 


The list above is not all-inclusive. MITRE may use other tools and scripts, as needed, and provide test scripts to CMS to share with necessary support staff. As much as possible, MITRE will avoid affecting out-of-bounds systems; however, tools may send non-standard network traffic, which could affect non-targeted (out-of-bounds) hosts located on the same network. The effects of network-based tools will be contained within the in-bounds portions of the target environment to the greatest extent possible. 
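The containment described above can be illustrated with a small sketch using Python's standard ipaddress module. The subnet and excluded host below are invented examples, not the actual DSH address space.

```python
# Illustrative sketch: build the in-bounds target list for a network scan by
# enumerating an assessed subnet and excluding out-of-bounds hosts.
# The addresses below are invented examples, not the DSH environment.
import ipaddress

in_scope = ipaddress.ip_network("10.0.1.0/29")        # assessed subnet (example)
out_of_bounds = {ipaddress.ip_address("10.0.1.3")}    # shared host to avoid (example)

targets = [str(host) for host in in_scope.hosts() if host not in out_of_bounds]
print(targets)  # the five remaining usable addresses in the /29
```

Feeding a scanner an explicit target list built this way, rather than a whole network range, is one way to keep tool traffic off non-targeted hosts on the same network.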

3.3 TESTING AND REVIEW 
MITRE will perform activities that typically involve both the automated testing of security vulnerabilities via software tools and manual analysis, as well as the evaluation of particular aspects of the organization's security policies and practices. 
MITRE will perform the following assessment activities: 
	 
• Conduct vulnerability testing with full knowledge of the system, applications, products, configurations, and topology
• Provide MITRE Evaluation Team members who have specific knowledge of operating systems, the architecture of transactional Web systems, and Web programming technologies (e.g., Hypertext Markup Language [HTML], Java, JavaScript, Active Server Pages [ASP], cookies, Perl, Common Gateway Interface [CGI], Siebel, WebSphere, and Visual Basic scripting)
• Attempt to gain unauthorized user access or unauthorized access to system resources
• Identify system vulnerabilities based on the following items:
  • Architecture design and implementation
  • Improper, weak, or vulnerable configurations
  • Non-standard configurations
  • Published known weaknesses, bugs, advisories, and security alerts about the specific hardware, software, and networking products used in the system
  • Common known attacks against the specific hardware, software, and networking products used in the system
• Evaluation of buffer overflow attacks
• Evaluation of Trojan horse attacks
• Evaluation of Web application buffer overflow and password vulnerabilities by performing tests that include brute force password attacks and buffer overflows
• Perform network reconnaissance scanning to identify services (e.g., Telnet, file transfer protocol [FTP]) that are available from targeted servers
• Conduct interviews with key staff to examine management, operational, and technical controls
• Examine documentation to ensure adherence to CMS policies and standards
• Perform application testing to determine if adequate security controls are implemented
• Examine database configuration settings 
3.3.1 Interviews 
Interviews will focus on a review of the management, operational, and technical controls associated with the CMSR and CMS TRA security policies, procedures, and standards. Interviews will also help gain a better understanding of the system environment's security posture and will supplement findings identified during the technical testing. When available and applicable, electronic copies of additional written documentation will be collected for review. Subject matter experts (SME) in the following areas will be interviewed: 
	 
• System architecture and development methodologies
• System security policies
• processes
• Patch management
• Audits and log analysis
• Contingency planning and backup and recovery 

3.3.2 Observances 
During the course of the assessment, the MITRE Evaluation Team will also scrutinize personnel and the physical environment, as applicable, to determine if security policies and procedures are being followed. Examples of areas that may be included are: 
• If MITRE staff are issued visitor badges
• If any form of identification is requested prior to visitor badge issuance
• How employees label and discard output materials
• Whether monitors are positioned to prevent shoulder surfing or viewing from windows and open spaces
• Whether telecommunication and wiring closets are locked 

While onsite and as appropriate, the MITRE Evaluation Team will also conduct a data center tour to determine whether the physical controls securing CMS systems and data are adequate. 

3.3.3 Configuration Review 
During the assessment, the MITRE Evaluation Team will review switch, router, firewall, server and software configurations, and network and application architecture diagrams to determine if the controls delineated in the CMS ARS CMSR policies, the CMS Minimum Security Configuration Standards for Operating Systems, and industry best practices (e.g., those outlined in the Router Security Configuration Guide published by the National Security Agency [NSA] and the Defense Information Systems Agency [DISA] Security Technical Implementation Guides [STIG]) are being followed. 
)HGHUDO#3;'DWD#3;6HUYLFHV#3;+XE#3;#11;'6+#12;#3;6HFXULW#3;RQWUROV#3;$VVHVVPHQW#3;7HVW#3;3ODQ#3;	 $XJXVW#3;#21;#19;#15;#3;#21;#19;#20;#22;#3; 

3.3.4 Application Testing 
MITRE will test the DSH to ensure that proper software development techniques are followed, that supported software is used, and that the confidentiality, integrity and availability (CIA) of data processed by the application adhere to CMS policies, procedures and standards. The following is a list of activities MITRE will perform: 
	 
• Assess if input parameters passed to the application are checked and validated
• Determine if application administrators can remotely access the application via CMS-approved standards
• Examine implemented access control and identification and authentication techniques
• Test to determine if the application is susceptible to cross-site scripting (XSS), structured query language (SQL) injection, or other vulnerabilities
• Examine confidential information to determine if it is encrypted before being passed between the application and the browser
• Determine if the application architecture conforms to the TRA 

CMS or QSSI will provide the appropriate user accounts and logins to access the application to be tested in the targeted environment. The user account logins and application access must be available to MITRE for tests two weeks prior to application testing. At least one account must have administrative access with the ability to adjust the application roles of another login. 
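As a sketch of what the SQL injection tests above probe for, the following self-contained example (using Python's sqlite3 in place of the DSH's actual data store, with made-up table and user names) contrasts a vulnerable string-built query with a parameterized one:

```python
import sqlite3

# Illustrative only: sqlite3 and this schema stand in for the real system.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, role TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'admin')")

malicious = "x' OR '1'='1"

# Vulnerable pattern: attacker input concatenated directly into the SQL text.
leaked = conn.execute(
    "SELECT role FROM users WHERE name = '%s'" % malicious).fetchall()
assert leaked == [("admin",)]  # the injected OR clause matches every row

# Safe pattern: the input is bound as a parameter and never parsed as SQL.
bound = conn.execute(
    "SELECT role FROM users WHERE name = ?", (malicious,)).fetchall()
assert bound == []  # no user is literally named "x' OR '1'='1"
```

The input-validation bullet above is the complementary defense: rejecting or sanitizing such input before it ever reaches the query layer.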

3.3.5 Database Server/Instance Testing 
MITRE will evaluate database server and software configurations with the help of the appropriate system administrators. MITRE technical staff will work with the system administrators and DBAs to view essential, security-relevant configurations and settings. The following is a list of activities that will be performed: 

• Review the results of [Not Responsive] vulnerability scans to identify known flaws in the server version and settings
• Review database security configuration settings to determine if adequate system protections are implemented
• Interview the system and database administrators concerning database server configurations and security relevant mechanisms 
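A configuration review of this kind can be approximated as a baseline comparison. The parameter names below follow MySQL conventions (the onsite schedule includes a DSH MySQL interview), but the baseline values are illustrative examples, not the CMS Minimum Security Configuration Standards.

```python
# Illustrative baseline only; not the CMS standard.
BASELINE = {
    "local_infile": "OFF",            # limit server-side file reads
    "skip_name_resolve": "ON",        # avoid DNS-dependent grants
    "require_secure_transport": "ON", # force encrypted client connections
}

def config_findings(observed):
    """List each setting that deviates from (or is missing from) the baseline."""
    return [
        "%s: expected %s, found %s" % (key, want, observed.get(key, "<unset>"))
        for key, want in BASELINE.items()
        if observed.get(key) != want
    ]

observed = {"local_infile": "ON", "skip_name_resolve": "ON"}
assert len(config_findings(observed)) == 2  # one wrong value, one unset
```

Each deviation found this way would be written up as a finding with the fields described in Section 4.1.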

4 REPORTING 
This section outlines how MITRE will report vulnerabilities during the assessment. 
4.1 SECURITY CONTROLS ASSESSMENT FINDINGS SPREADSHEET 
The SCA findings spreadsheet (Table 7) is a running tabulation of possible findings identified during the assessment that is reviewed during daily out-briefs (DOB). Findings are broken out by day and then sorted according to risk level. For updates to a previous day's findings, the updated cell is highlighted in yellow. Although high and moderate risk-level findings are discussed during the DOBs, questions pertaining to low risk-level findings may be raised for clarification. Further details about the spreadsheet columns are listed in the following sections. 
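The day-then-risk ordering described above can be sketched as a two-key sort; the finding records here are invented examples.

```python
# Findings are grouped by assessment day, then ordered High > Moderate > Low.
RISK_ORDER = {"High": 0, "Moderate": 1, "Low": 2}

findings = [
    {"day": 2, "risk": "Low",      "weakness": "Banner disclosure"},
    {"day": 1, "risk": "Moderate", "weakness": "Weak TLS ciphers"},
    {"day": 1, "risk": "High",     "weakness": "SQL injection"},
]
findings.sort(key=lambda f: (f["day"], RISK_ORDER[f["risk"]]))
assert [f["weakness"] for f in findings] == [
    "SQL injection", "Weak TLS ciphers", "Banner disclosure"]
```

This ordering puts the items most likely to be discussed at the daily out-brief at the top of each day's block.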
Table 7. Findings Spreadsheet
 

4.1.1 Row Number 
Each finding has a row number included to provide an easy reference when the spreadsheet is printed and reviewed during DOBs. This row number is also included in the test reports for easy cross reference. 

4.1.2 Weakness 
A brief description of the security vulnerability is provided in the Weakness column. 

4.1.3 Risk Level 
Each finding is categorized as a business risk and assigned a risk level rating described as high, moderate, or low risk. The rating is, in actuality, an assessment of the priority with which each vulnerability should be addressed. Based on the CMS current implementation of the underlying technology and the assessment guidelines contained within the CMS Reporting Procedure for Information System (IS) Assessments document,12 MITRE will assign these values to each Business Risk. The risk ratings are described in Table 8. 
12 http://www.cms.hhs.gov/informationsecurity/downloads/Assessment_Rpting_Procedure.pdf. 
Table 8. Risk Definitions
 
Rating  Definition of Risk Rating  
High  Exploitation of the technical or procedural vulnerability will cause substantial harm to CMS business processes. Significant political, financial, and legal damage is likely to result  
Moderate  Exploitation of the technical or procedural vulnerability will significantly impact the confidentiality, integrity and/or availability of the system or data. Exploitation of the vulnerability may cause moderate financial loss or public embarrassment to CMS  
Low  Exploitation of the technical or procedural vulnerability will cause minimal impact to CMS operations. The confidentiality, integrity and availability of sensitive information are not at risk of compromise. Exploitation of the vulnerability may cause slight financial loss or public embarrassment  

4.1.4 CMSR Security Control Family and Reference 
The CMSR security control family and the control number affected by the vulnerability are identified in the CMSR Security Control Family and Reference columns. 

4.1.5 Affected Systems 
The systems, URLs, IP addresses, etc., affected by the weakness are identified in the Affected Systems column. 

4.1.6 Ease-of-Fix 
Each finding is assigned an Ease-of-Fix rating described as Easy, Moderately Difficult, Very Difficult, or No Known Fix. The ease with which the Business Risk can be reduced or eliminated is described using the guidelines in Table 9. 
Table 9. Definition of Ease-of-Fix Rating 
Rating  Definition of Ease-of-Fix Rating  
Easy  The corrective action(s) can be completed quickly with minimal resources and without causing disruption to the system or data  
Moderately Difficult  Remediation efforts will likely cause a noticeable service disruption: A vendor patch or major configuration change may be required to close the vulnerability; an upgrade to a different version of the software may be required to address the impact severity; the system may require reconfiguration to mitigate the threat exposure; corrective action may require construction or significant alterations to the manner in which business is undertaken  
Very Difficult  The high risk of substantial service disruption makes it impractical to complete the corrective action for mission critical systems without careful scheduling: An obscure, hard-to-find vendor patch may be required to close the vulnerability; significant, time-consuming configuration changes may be required to address the threat exposure or impact severity; corrective action requires major construction or redesign of an entire business process  
No Known Fix  No known solution to the problem currently exists. The Risk may require the Business Owner to discontinue use of the software or protocol, or isolate the information system within the enterprise, thereby eliminating reliance on the system. In some cases, the vulnerability is due to a design-level flaw that cannot be resolved through the application of vendor patches or the reconfiguration of the system. If the system is critical and must be used to support on-going business functions, no less than quarterly monitoring shall be conducted by the Business Owner, and reviewed by CMS IS Management to validate that security incidents have not occurred  

4.1.7 Estimated Work Effort 
Each finding has been assigned an Estimated Work Effort rating described as Minimal, Moderate, Substantial, or Unknown. The estimated time commitment required for CMS or contractor personnel to implement a fix for the Business Risk is categorized in Table 10. 
Table 10. Definition of Estimated Work Effort Rating 

Rating  Definition of Estimated Work Effort Rating  
Minimal  A limited investment of time (i.e., roughly three days or less) is required of a single individual to complete the corrective action(s)  
Moderate  A moderate time commitment, up to several weeks, is required of multiple personnel to complete all corrective actions  
Substantial  A significant time commitment, up to several months, is required of multiple personnel to complete all corrective actions. Substantial work efforts include the redesign and implementation of CMS network architecture and the implementation of new software, with associated documentation, testing, and training, across multiple CMS organizational units  
Unknown  The time necessary to reduce or eliminate the vulnerability is currently unknown  

4.1.8 Finding 
A detailed description of how the finding did not meet the test description is provided. This provides information on how the actual results fail to meet the security requirement noted in the CMS security policy, CMS security requirements, CMS guidance, or industry best practices published in the Defense Information Systems Agency (DISA) Security Technical Implementation Guides (STIG), by the Center for Internet Security (CIS), or by database vendors. The finding should have the paragraph from the original report and the date of the final report included in the description on the first line for easy reference in the POA&Ms. 

4.1.9 Failed Test Description 
The expected results that the finding did not meet are documented. This description provides the specific information from the CMS security policy, requirements, guidance, test objective, or published industry best practices. 

4.1.10 Actual Test Results 
This provides specific information on the observed failure of the test objective, policy, or guidance. 

4.1.11 Recommended Corrective Actions 
The recommended actions to resolve the vulnerability are explained in the Recommended Corrective Actions column. 

4.1.12 Status 
The Status column provides status information, such as when the vulnerability was identified or resolved. 

4.2 REASSIGNMENT OF FINDINGS 
If, during the SCA onsite testing period, a finding is determined to be outside the scope of the system or the responsibility of the CMS System Business Owner and ISSO, the finding will be reported and steps should be taken to reassign the finding to the rightful owner. The CMS SCA Facilitator will attempt to contact the rightful owner, provide them with the appropriate information, and invite them to the balance of the SCA proceedings. During the onsite week, the CMS facilitator may assist the CMS System Business Owner and ISSO in obtaining the rightful owner's concurrence and responsibility for the finding.  
However, it is ultimately the responsibility of the CMS System Business Owner and ISSO to obtain concurrence on the potential finding from the rightful owner and follow through with the necessary reassignment steps prior to the Draft Report Review. If the finding has already been reported in CFACTS, the System Business Owner and ISSO must obtain the CFACTS identifier from the rightful owner, and the finding will be closed in the report, noting the re-assignment and CFACTS information in the status field. If the ownership of the finding has not yet been successfully re-assigned by the time of the Draft Report Review, the report will be finalized with the finding assigned to the system. It is then the responsibility of the CMS System Business Owner and ISSO to address it at a later time and update CFACTS accordingly with the proper information. 
Once a finding is reassigned, it should be documented in the system's Information System Risk Assessment (ISRA). The CMS System Business Owner and ISSO should review it periodically, as the finding may directly impact the system. 

4.3 REPORTING OBSERVATIONS 
MITRE will include in the findings spreadsheet items that are considered observations instead of actual findings. An observation may arise as a result of a number of situations: 
• A security policy document may be changing, and the observation serves to inform the system owner. This gives ample time to prepare for and make appropriate changes;
• A security policy document has changed and CMS has granted a grace period for completion. The observation provides a mechanism to alert the business owner/ISSO that the item requires attention before the end of that grace period;
• A possible finding that the Security Assessment Contractor may have observed but cannot verify by testing as part of the existing tasking;
• Issues related to industry best practices that are not identified in the CMS Acceptable Risk Safeguards (ARS) or other guidelines referenced by the ARS. These items are considered Opportunities for Improvement (OFI). 

The observations will also be included in the SCA report in a separate section. Observations may or may not require additional action on the part of the CMS Business Owner, ISSO, or QSSI. 

4.4	 REPORTING SQL INJECTION AND CROSS SITE SCRIPTING VULNERABILITIES 
Since the first quarter of 2012, SQL Injection and Cross Site Scripting (XSS) attacks have increased by almost 70%. SQL Injection and XSS vulnerabilities are frequent issues identified in CMS System Security Controls Assessments. The Chief Information Security Officer (CISO) and the Enterprise Information Security Group (EISG) consider all SQL Injection and XSS vulnerabilities discovered on CMS systems to be rated as HIGH risk findings, whether or not the system is Internet facing. 

4.5	 TEST REPORTING 
MITRE will also conduct a final out-brief, as needed, after the onsite assessment is completed. Typically, MITRE does not have the opportunity to review all of the documentation, configurations, and script outputs while onsite and will need additional days to finish identifying potential vulnerabilities. If this is the case, CMS will schedule a final out-brief within one week after the onsite assessment is completed. 
MITRE will discuss and review all informational evidence of remediated findings that is supplied by CMS, QSSI and IDL. The MITRE Evaluation Team will diligently respond to inquiries made by CMS, QSSI and IDL concerning the validity of findings and acknowledge any areas of concern that may occur. The substance of evidence will contain any mitigation proof reflective of, and as close to, the source of the impacted system as possible. The manner of evidence exchange will be tracked and protected by the MITRE Team Lead, GTL, CMS Facilitator and authorized Points of Contact (POC) for the system(s) tested. If CMS authorizes the submission of remediation evidence after the onsite dates, the focus should be on addressing High and Moderate risk findings. In order to promptly meet schedules, MITRE requests that all evidence of remediated findings be submitted to MITRE by the due date established by CMS. This is typically one week after the final out-brief. 
Approximately three weeks following the final out-brief, MITRE will provide a draft test report. The test report takes the vulnerabilities identified in the findings spreadsheet and reformats and sorts the information to conform to CMS guidelines contained within the CMS Reporting Procedure for IS Assessments document. CMS, QSSI and IDL will be provided approximately one week to review the test report. Following a draft test report review conference call that will be scheduled by CMS, MITRE will generate a final test report and data worksheet. The data worksheet will contain all findings not closed during the onsite or the remediation period following the assessment. 
5 LOGISTICS 
5.1 POINTS OF CONTACT 
The MITRE POCs for the SCA are listed in Table 11. 
Table 11. MITRE Evaluation Team Points of Contact 

Name  Position  Phone Number  Email Address  
Jim Bielski  Lead Evaluator  (410) 402-2717  jbielski@mitre.org  
Yi-Fang Koh  Database Evaluator  (703) 983-3995  kohy@mitre.org  
Not Identified  Application Evaluator  
Paul Klein  Firewall Evaluator  (703) 983-1062  pklein@mitre.org  
Eugene Aronne  Linux Evaluator  (301) 429-2246  earonne@mitre.org  
Barbara Stamps  Hadoop Evaluator  (703) 983-4556  bstamps@mitre.org  
Carmella Thompson  Interviewer  (443) 934-0411  cthompson@mitre.org  

During assessments, testing problems may be encountered outside normal working hours and require that staff be contacted. The CMS POCs for the SCA are listed in Table 12. 
Table 12. CMS Points of Contact 

Name  Position  Phone Number  Email Address  
Jessica Hoffman  CMS/OIS GTL  (410) 786-4458 (O)  jessica.hoffman@cms.hhs.gov  
Jason King  CMS/OIS GTL  (410) 786-7578 (O)  jason.king@cms.hhs.gov  
Jane Kim  CMS/OIS/GTL  (443) 721-4064  jane.kim@cms.hhs.gov  
Kirk Grothe  CMS Maintainer  (202) 407-3015  kirk.grothe@cms.hhs.gov  
Hung Van  CMS Program Manager  (202) 510-6898  hung.van@cms.hhs.gov  
Monique Outerbridge  Business Owner  (202) 465-5075  monique.outerbridge@cms.hhs.gov  
Thomas Schankweiler  ISSO  301-875-1536  thomas.schankweiler@cms.hhs.gov  

The QSSI POCs for the SCA are listed in Table 13. 
Table 13. Vendor Points of Contact 

Name  Position  Phone Number  Email Address  
Karlton Kim  Program Manager  410-274-5835  kkim@qssinc.com  
Kamesh Thota  Security Officer  571-294-9781  kthota@qssinc.com  
Thomas Swoboda  HUB Architecture Team  (work through Kamesh)  tswoboda@qssinc.com  
David Holyoke  HUB Architecture Team  (work through Kamesh)  dholyoke@qssinc.com  
Dan McQuire  Operational Lead  (work through Kamesh)  dmcguire@qssinc.com  

Thomas Peters-Hall  CM/Infrastructure team, Red Hat support  (work through Kamesh)  tpetersh@redhat.com  
Chris Mason  CM/Infrastructure team, Red Hat support  (work through Kamesh)  cmason@redhat.com  
Mike Battles  HUB Architecture Team, Red Hat support  (work through Kamesh)  mbattles@redhat.com  

5.2 TECHNICAL STAFF REQUIREMENTS 
CMS, QSSI and IDL technical staff will need to be available to improve the assessment's efficiency and accuracy. The interactions with MITRE may include technical consultation, supervised access to systems, networks, infrastructures, and facilities, and monitoring of assessment activities. 

5.3 ONSITE SCHEDULE 
MITRE's onsite schedule can be found in Table 14. "Joint" refers to the close coordination of testing with the Federally Facilitated Marketplace (FFM). Joint briefings will be conducted to save time. 
Table 14. MITRE Onsite Schedule 

Day Date  Time  Meeting  
Mon 8/19  10:00 - 11:00  Joint Kick-off Meeting  
  3:30 - 4:30  FFM DSH Joint SCA Daily Outbrief  
Tue 8/20  9:30 - 11:00  DSH ISSO/Business Owner Interview  
  11:00 - Noon  DSH Walk-through Demo  
  1:00 - 2:00  DSH Contingency Planning/Disaster Recovery Interview  
  2:00 - 3:00  DSH Configuration Manager Interview  
  3:30 - 4:30  Joint SCA Daily Outbrief  
Wed 8/21  9:30 - 11:00  DSH Database Administrator Interview  
  3:30 - 4:00  Joint SCA Daily Outbrief  
Thu 8/22  10:00 - 11:30  DSH Documentation Interview  
  1:00 - 2:30  DSH Application Developer Interview  
  3:30 - 4:00  Joint SCA Daily Outbrief  
Fri 8/23  3:30 - 4:00  Joint SCA Daily Outbrief  
Mon 8/26  10:30 - Noon  (DSH) MIDAS Hadoop SCA Interview  
  3:30 - 4:00  FFM DSH Joint SCA Daily Outbrief  
Tue 8/27  9:30 - 11:00  DSH MySQL Interview  
  3:30 - 4:00  Joint SCA Daily Outbrief  
Wed 8/28  3:30 - 4:00  Joint SCA Daily Outbrief  
Thu 8/29  3:30 - 4:00  Joint SCA Daily Outbrief  
Fri 8/30  3:30 - 4:30  Joint SCA Daily Outbrief  

Note that, where appropriate, the Business Owner or CMS ISSO is responsible for establishing interview appointments and teleconference bridges. The CMS Facilitator establishes DOB appointments and teleconference bridges. 

5.4 ASSESSMENT ESTIMATED TIMELINE 
Table 15 describes the estimated timeline for assessment actions and milestones. 
Table 15. Estimated Timeline for Assessment Actions and Milestones 

Action/Milestone  Description  Date(s)  
Provide scripts, data calls, or other requests to CMS  The lead evaluator provides the scripts, data calls or other requests for the Mainframe, Server O/S and Databases, when applicable  Monday July 22, 2013  

Perform readiness review  Discuss assessment preparations and ensure tasks (e.g., account creation and providing documentation to MITRE) are on target for completion  Tuesday August 13, 2013  
Establish and test accounts  Set up and test all test accounts for the assessment  Monday August 12, 2013  
Finalize and deliver Final Test Plan  Update the final test plan to include all action items, decisions, interview schedules, and other information from the Draft Test Plan Discussion  August 16, 2013  
Deliver documentation, script output, and configuration output to MITRE  Deliver all documentation, script output, and configuration data to the MITRE Evaluation Team prior to the onsite assessment  Monday August 12, 2013  
Perform onsite assessment  Conduct technical testing and management and operations interviews based on the assessment's scope  August 19-30, 2013  
Conduct final out-brief  Review and summarize security vulnerabilities from the assessment  Week of August 30, 2013  
Last date to provide remediation evidence (if authorized by the CMS Facilitator)  The CMS Division of Information Security & Privacy Management strongly advises that the focus of remediation efforts be on addressing High risk findings, followed by Moderate risk findings. No application testing will be performed subsequent to the onsite.  Friday September 2013 (est.)  
Remove security access  Remove security access established for MITRE test accounts  Friday September 2013 (est.)  
Deliver draft report to CMS  Put security vulnerabilities identified during the assessment into report format  Monday September 23, 2013  
Review draft report  Answer questions and provide clarification. Only security vulnerabilities reported during the assessment and included in the final out-brief are included in the report  Friday September 27, 2013  
Deliver final report and data worksheet to CMS  Edit and clarify the draft report and generate a data worksheet  Friday October 2013  
Deliver final book package to CMS  Produce and provide hardcopies of test scripts, test data, out-briefs, the final report, and the data worksheet(s), with a CD containing this information, to the CMS SCA's GTL  Friday October 11, 2013  

Office of Information Services 
Division of Information Security & Privacy Management 
Centers for Medicare & Medicaid Services 

Federally Facilitated Marketplace (FFM) 
Security Control Assessment Test Plan 

December 2013 
Final 
Federally Facilitated Marketplace (FFM) Security Control Assessment Test Plan December 2013 
Table Contents 
Introduction ................................................................................................................................1
 
1.1 
Purpose..............................................................................................................................1
 
1.2 
Security Control Assessment Background........................................................................1
 
1.3 
Assessment Process and Methodology .............................................................................2
 
1.3.1 
Phase Planning...............................................................................................2
 
1.3.2 
Phase Assessment ..........................................................................................2
 
1.3.3 
Phase Reporting .............................................................................................3 
Planning .....................................................................................................................................4
 
2.1 
FFM Application Background ..........................................................................................4
 
2.2 
Assessment Scope.............................................................................................................5
 
2.3 
Assessment Assumptions/Limitations ..............................................................................8
 
2.4 
Data Use Agreement.........................................................................................................9
 
2.5 
Roles and Responsibilities ................................................................................................9
 
2.5.1 
Application Developer/Maintainer ..................................................................10
 
2.5.2 
Business Owner ...............................................................................................10
 
2.5.3 
CMS Facilitator ................................................................................................10
 
2.5.4 
CMS Government Task Lead ..........................................................................10
 
2.5.5 
Database Administrator ...................................................................................11
 
2.5.6 
Information System Security Officer System Security Officer...................11
 
2.5.7 
Lead Evaluator.................................................................................................11
 
2.5.8 
Program Manager.............................................................................................12
 
2.6 
Assessment Responsibility Assignment .........................................................................12
 
2.7 
Physical Access and Work Area Requirements..............................................................13 
Assessment...............................................................................................................................14
 
3.1 
Information Collection....................................................................................................14
 
3.1.1 
CMS FISMA Controls Tracking System (CFACTS) ......................................14
 
3.1.2 
Documentation Requirements..........................................................................14
 
3.1.3 
Script Output Requirements.............................................................................17
 
3.1.4 
Application Testing Requirements ..................................................................17
 
3.2 
Enumeration ....................................................................................................................21
 
3.2.1 
Documentation Review....................................................................................21
 
3.2.2 
Vulnerability Assessment Tools ......................................................................21
 
3.3 
Testing and Review.........................................................................................................23
 
3.3.1 
Interviews .........................................................................................................24
 
3.3.2 
Application Testing ..........................................................................................24
 
3.3.3 
Database Instance Testing................................................................................24
 
Federally Facilitated Marketplace (FFM) Security Control Assessment Test Plan December 2013 
4 Reporting..................................................................................................................................25
4.1 Security Control Assessment Findings Spreadsheet.......................................................25
4.1.1 Row Number....................................................................................................27
4.1.2 Weakness .........................................................................................................27
4.1.3 Risk Level ........................................................................................................27
4.1.4 CMSR Security Control Family and Reference...............................................28
4.1.5 Affected Systems .............................................................................................28
4.1.6 Ease-of-Fix .......................................................................................................28
4.1.7 Estimated Work Effort .....................................................................................29
4.1.8 Finding .............................................................................................................29
4.1.9 Failed Test Description....................................................................................29
4.1.10 Actual Test Results ..........................................................................................29
4.1.11 Recommended Corrective Actions ..................................................................29
4.1.12 Status................................................................................................................30
4.2 Reassignment of Findings...............................................................................................30
4.3 Reporting SQL Injection and Cross Site Scripting Vulnerabilities................................30
4.4 Reporting Observations ..................................................................................................30
4.5 Test Reporting.................................................................................................................31
5 Logistics ...................................................................................................................................32
5.1 Points of Contact.............................................................................................................32
5.2 Technical Staff Requirements.........................................................................................34
5.3 Onsite Schedule ..............................................................................................................35
5.4 Assessment Estimated Timeline .....................................................................................37
List of Tables

Table 1. Assessment Responsibilities ...........................................................................................
Table 2. Tier Documentation  Mandatory Pre-Assessment .....................................................
Table 3. Tier Documentation Required Two Weeks Prior to Onsite ......................................
Table 4. Eligibility and Enrollment User Roles............................................................................
Table 5. Plan Management User Roles .........................................................................................
Table 6. Findings Spreadsheet ......................................................................................................
Table 7. Risk Definitions ..............................................................................................................
Table 8. Definition of Ease-of-Fix Rating ....................................................................................
Table 9. Definition of Estimated Work Effort Rating ..................................................................
Table 10. SCA Evaluation Team Points of Contact .....................................................................
Table 11. CMS Points of Contact .................................................................................................
Table 12. Vendor Points of Contact..............................................................................................
Table 13. Anticipated Staff Requirements for the week of 12/9/2013 .........................................
Table 14. Anticipated Staff Requirements for the week of 12/16/2013.......................................
Table 15. MITRE Evaluation Team Onsite Schedule for the week December 2013...........
Table 16. MITRE Evaluation Team Onsite Schedule for the week December 2013 ..........
Table 17. Estimated Timeline for Assessment Actions and Milestones.......................................
List of Figures

Figure 1. FFM Application Accreditation Boundary......................................................................
 

1 Introduction
1.1 Purpose 
This document describes the security control assessment (SCA) methodology, schedule, and requirements that The MITRE Corporation (MITRE) will use to evaluate the Federally Facilitated Marketplace (FFM) application. The goal of the SCA Test Plan is to clearly explain the information MITRE expects to obtain prior to the assessment, the areas that will be examined, and the proposed scheduled activities MITRE expects to perform during the assessment. This document is meant to be used by the Centers for Medicare & Medicaid Services (CMS) and CGI Federal technical managers, network engineers, and system administrators responsible for system operations.

1.2 Security Control Assessment Background 
MITRE operates a federally funded research and development center (FFRDC) providing services to the government in accordance with the provisions and limitations defined in the Federal Acquisition Regulation (FAR) part 35.017. According to this regulation, in order for an FFRDC to discharge its responsibilities to the sponsoring agency, it must have access to government and supplier data (e.g., sensitive and proprietary data) and to employees and facilities beyond that which is common to the normal contractual relationship. As an FFRDC agent, MITRE is required to conduct its business in a manner befitting its special relationship with the government, to operate in the public int