
    STATE OF OHIO, DEPARTMENT OF HUMAN SERVICES, Plaintiff, v. Louis W. SULLIVAN, M.D., et al., Defendants.
    No. C2-91-212.
    United States District Court, S.D. Ohio, E.D.
    March 31, 1992.
    
      Lee Fisher, Atty. Gen., Karen Lorzorishak, Asst. Atty. Gen., Columbus, Ohio, for plaintiff.
    Joseph Kane, Asst. U.S. Atty., Columbus, Ohio, Sheila Hegy Swanson, Asst. Regional Counsel, U.S. Dept. of Health and Human Services, Chicago, Ill., for defendants.
   MEMORANDUM OPINION AND ORDER

GRAHAM, District Judge.

In this action, the State of Ohio (“the State”) seeks review of a final decision of the Secretary of Health and Human Services Departmental Appeals Board (“the Board”), Decision No. 1202, rendered on November 1, 1990, sustaining a penalty disallowance imposed by the Office of Child Support Enforcement (“OCSE”). During a program results audit for fiscal year (“FY”) 1984 and a follow up review, OCSE found that the State was not operating its Child Support Enforcement Program in substantial compliance with the requirements of Title IV-D of the Social Security Act, 42 U.S.C. § 651 et seq. Pursuant to 42 U.S.C. § 603(h)(1), OCSE reduced the State’s federal funds under the Aid to Families With Dependent Children Program (“AFDC”) by $6,672,393.

Plaintiff asserts jurisdiction under 28 U.S.C. § 1331 and seeks review of the Board’s decision under 5 U.S.C. §§ 701-706. Defendant does not dispute this Court’s jurisdiction to review the decisions of the Departmental Appeals Board in cases of this kind. While the question has not been squarely addressed by the Sixth Circuit, its decision in Ohio Department of Human Services v. United States Department of Health and Human Services, 924 F.2d 1059 (6th Cir. 1991), suggests that it would hold that the district courts do have jurisdiction in such cases. See also Bowen v. Massachusetts, 487 U.S. 879, 108 S.Ct. 2722, 101 L.Ed.2d 749 (1988); Michigan Department of Human Services v. Secretary of HHS, 744 F.2d 32 (6th Cir.1984). This Court concludes that it does have jurisdiction and that plaintiff is entitled to judicial review of the decision of the Board under 5 U.S.C. §§ 701-706.

The Ohio Department of Human Services (“ODHS”) is an agency of the State of Ohio authorized to administer Ohio’s AFDC Program under Title IV-A of the Social Security Act, 42 U.S.C. § 601 et seq. The AFDC program provides financial assistance to dependent children whose caretakers are unable to provide adequate care and support for their children without public assistance. Since 1975, Congress has required, as a condition of participation in the AFDC Program and the receipt of federal AFDC funds, that states operate effective child support enforcement programs. 42 U.S.C. §§ 602(a)(27) and 654(13). Title IV-D is a cooperative federal-state program created to ensure that child support enforcement and paternity establishment services are available in any jurisdiction that participates in the AFDC Program. A state must submit a plan which conforms to the requirements of Title IV-D and its implementing regulations. Such a plan must provide for enforcing the child support obligations of absent parents, locating absent parents, establishing paternity, obtaining child and spousal support and assuring the availability of assistance in obtaining support for all children needing such assistance. 42 U.S.C. §§ 651, 654; 45 C.F.R. § 302.0 et seq. OCSE is responsible for performing program audits to ascertain compliance with these requirements. 42 U.S.C. § 652(a)(4).

In the Child Support Enforcement Amendments of 1984 (§ 9 Public Law 98-378), Congress mandated the Secretary to impose graduated penalties of between 1 and 5 percent of a state’s federal AFDC allotment if the Secretary determined, pursuant to periodic audits conducted by OCSE, that the State was not operating its Child Support Enforcement Program in substantial compliance with requirements of Title IV-D and its implementing regulations “for any quarter beginning after September 30, 1983.” 42 U.S.C. §§ 603(h)(1) and 652(a)(4). Prior to the 1984 amendments, the statute had provided for a flat 5 percent penalty and recognized nothing less than “full compliance.” 50 Fed.Reg. 40120 (1985).

OCSE audited Ohio’s Child Support Enforcement Program for FY 1984 and found that the State was not in substantial compliance with the Title IV-D requirements and proposed a 1 percent penalty. Following a one year opportunity to correct, OCSE conducted a follow up review and found that the State still had not achieved substantial compliance and on January 27, 1989 notified the State that it was imposing a 1 percent reduction for the period January 1, 1988 through December 31, 1988. The State appealed the penalty disallowance to the Board on February 23, 1989 and on November 1, 1990 the Board issued its opinion upholding the penalty. On December 14, 1990 the State moved the Board to reconsider its decision and on February 21, 1991 the Board denied the State’s request for reconsideration. This action was filed on March 18, 1991 and is now before the Court on the parties’ cross motions for summary judgment.

Before addressing the merits, there are two preliminary matters which require the Court’s decision. Plaintiff has filed a motion to supplement the administrative record by including a copy of the transcript of proceedings in the case of Office of Child Support Enforcement vs. Mississippi, (Departmental Appeals Board Doc. No. 89-3). Plaintiff claims that the Mississippi case involves many of the same legal and factual issues which are before the Board in the present case and that the record in the Mississippi case contains evidence which would assist the Court in evaluating plaintiff’s claim herein that the Board relied on faulty statistical methodologies in upholding the OCSE penalty assessment. Plaintiff acknowledges that as a general rule a court’s inquiry in administrative review cases is confined to the administrative record before the agency at the time the decision was made. Plaintiff argues, however, that there are two exceptions to the general rule which would warrant the granting of the present motion. Plaintiff contends that there is an exception permitting supplementation of the record to show factors the agency should have considered but did not and that there is a further exception permitting supplementation where such would be helpful to explain a “highly technical” administrative record.

It appears that plaintiff’s goal in seeking to have the transcript of proceedings in the Mississippi case included in the record herein is to bring into the record the testimony of two expert witnesses who testified in the Mississippi hearing, Dr. Benjamin Mandell, who was the expert witness for OCSE in the present case, and Dr. Benjamin Tepping, an additional expert for OCSE in the Mississippi case. Plaintiff claims that Dr. Mandell made concessions which support plaintiff’s argument here that the Board relied upon a flawed statistical methodology in upholding the disallowance penalty against Ohio and that Dr. Tepping’s testimony in the Mississippi case likewise supports that position.

Plaintiff could have cross-examined Dr. Mandell in the instant case but instead waived its right to an adversarial evidentiary hearing under 45 C.F.R. §§ 16.8 and 16.11 and withdrew its request for such a hearing (R. 19 n. 10). Plaintiff rested its case on written submissions, which included affidavits from its statistical expert, Dr. Moeschberger, who criticized the statistical methodologies advanced by Dr. Mandell. The Mississippi testimony concerned the same issues before the Board in the Ohio case. The Mississippi testimony appears to involve, to some extent, the results of reflection upon and further analysis of the evidence presented in the Ohio case but the underlying issues are the same. The Mississippi testimony is cumulative. It is not in any sense newly discovered. Plaintiff presented its position in the present case through the affidavits of Dr. Moeschberger. Plaintiff could have developed here additional evidence of the kind contained in the Mississippi transcript but chose not to. Indeed, while plaintiff now seeks to have Dr. Tepping’s testimony in the Mississippi case made a part of this record, it specifically disavows the statistical methodology he advocated. See Memorandum in Support of Plaintiff’s Motion to Supplement Administrative Record, page 8, footnote 7.

The goal of plaintiff’s motion is simply to bring into the record additional testimony questioning Dr. Mandell’s statistical methodology, an issue which plaintiff had the opportunity to fully address in the present proceeding and which it did address through the use of affidavits from its own expert.

In reviewing the Board’s decision under the arbitrary and capricious standard of the Administrative Procedure Act, 5 U.S.C. § 706(2)(A), the Court’s review must be “confined to the full administrative record before the agency at the time the decision was made.” Camp v. Pitts, 411 U.S. 138, 142, 93 S.Ct. 1241, 1244, 36 L.Ed.2d 106 (1973). Plaintiff has not demonstrated that this additional evidence shows that there are additional factors the Board should have considered in arriving at its decision. The additional testimony is merely cumulative. The Board itself made this clear in its decision in the Mississippi case:

As a result, we do not address the validity of OCSE Methodology # 3 here. This does not, however, undercut our decisions in Ohio and New Mexico where OCSE had relied on Methodology #3. In those cases, we found that the States’ evidence was insufficient to rebut OCSE’s evidence about the validity of the method and that, in any event, the States had not shown that any further refinements would change the ultimate results. We also found that further refinements were unlikely to change the ultimate results given the raw sample data.

Mississippi Dep’t of Human Services, Decision No. 1267 at 26 (July 26, 1991), State’s Exh. JJ. Nor has plaintiff shown that the Mississippi transcript should be made a part of the present record in order to explain highly technical matters contained in the administrative record. Indeed, it appears that the issue of statistical methodology was fully and adequately addressed in the evidence submitted to the Board and was analyzed and explained in cogent fashion by the Board in its decision. Plaintiff’s motion to supplement the record is denied.

Defendant argues that the Court lacks jurisdiction to entertain the third and eighth causes of action in the plaintiff’s complaint and seeks to exclude from the Court’s consideration the allegations contained in paragraphs 54, 57, 58 and 66 of the complaint, as well as Exhibits V and EE. On May 15, 1991 defendant filed a motion to dismiss and to strike, advancing these arguments. However, consistent with an earlier agreed scheduling order, the Court required the parties to address these issues in their cross motions for summary judgment. Thus, defendant’s motion for summary judgment incorporates by reference its previously filed motion to dismiss. The issue raised is whether the Court has jurisdiction to review the Board’s post-decisional ruling denying plaintiff’s request for reconsideration of Decision No. 1202. Decision No. 1202 was issued on November 1, 1990. On December 14, 1990 plaintiff moved for reconsideration pursuant to 45 C.F.R. § 16.13. The Board denied the motion on February 21, 1991. In the motion for reconsideration, plaintiff contended that Decision No. 1202 was based on a clear error of fact when the Board found that no further modifications in the audit methodology calculations would show Ohio to be in substantial compliance with the child support regulations. Plaintiff urged also that reconsideration should be granted to consider an argument made by the State of Arizona in a different case that OCSE audits are invalid because the statistical methodology used was adopted without notice and comment under the Administrative Procedure Act, 5 U.S.C. § 553. The latter argument had not previously been raised by Ohio in the proceedings before the Board. In support of its first ground for reconsideration, the State offered an additional affidavit from its expert, Dr. Moeschberger, purporting to outline the alleged errors of fact made by the Board in Decision No. 1202.

Defendant argues that this Court has no jurisdiction to review the Board’s refusal to reconsider Decision No. 1202 because that refusal is “agency action ... committed to agency discretion by law” under the Administrative Procedure Act, 5 U.S.C. § 701(a)(2) and thus, not subject to judicial review. Defendant further argues that the Court lacks jurisdiction for the additional reason that it must limit its review to the record on which Decision No. 1202 is based and may not entertain issues which plaintiff did not properly raise on that record.

Decision No. 1202 was, on its face, a final disposition of the State’s appeal. The State had been given ample opportunity to submit all of its evidence and arguments prior to the issuance of Decision No. 1202 under the Board’s rules of practice. See 45 C.F.R. part 16.

Simple fairness to those who are engaged in the tasks of administration, and to litigants, requires as a general rule that courts should not topple over administrative decisions unless the administrative body not only has erred but has erred against objection made at the time appropriate under its practice.

United States v. L.A. Tucker Truck Lines, 344 U.S. 33, 37, 73 S.Ct. 67, 69, 97 L.Ed. 54 (1952). The appropriate time for plaintiff to have raised its APA rulemaking claim relating to audit methodology was during the extensive proceedings leading to the Board’s issuance of Decision No. 1202. The State failed to do so and it has offered no excuse for the untimeliness of its argument.

In ICC v. Brotherhood of Locomotive Engineers, 482 U.S. 270, 107 S.Ct. 2360, 96 L.Ed.2d 222 (1987), the Supreme Court found that the judicial review provisions of the Administrative Procedure Act preserved a tradition of nonreviewability with regard to an agency order which denies rehearing of a prior order where rehearing is sought on the grounds of “material error.” The Board’s rules specifically empower it to reconsider a prior decision “where a party promptly alleges a clear error of fact or law,” 45 C.F.R. § 16.13. However, after considering plaintiff’s motion and carefully examining the affidavit filed in support of the motion, the Board concluded that the State had not alleged any clear error of fact or law material to the Board’s decision. Ruling on request for reconsideration, page 2.

In Brotherhood of Locomotive Engineers, the Supreme Court held that

Where ... the Commission refuses to reopen a proceeding [upon a motion for reconsideration] what is reviewable is merely the lawfulness of the refusal. Absent some provision of law requiring a reopening (which is not asserted to exist here), the basis for challenge must be that the refusal to reopen was “arbitrary, capricious, [or] an abuse of discretion.” 5 U.S.C. § 706(2)(A).... More importantly for present purposes, all of our cases entertaining review of a refusal to reopen appear to have involved petitions alleging “new evidence” or “changed circumstances” that rendered the agency’s original order inappropriate. We know of no case in which we have reviewed the denial of a petition to reopen based upon no more than “material error” in the original agency decision. There is good reason for distinguishing between the two. If review of denial to reopen for new evidence or changed circumstances is unavailable, the petitioner will have been deprived of all opportunity for judicial consideration — even on a “clearest abuse of discretion” basis — of facts which, through no fault of his own, the original proceeding did not contain. By contrast, where no new data but only “material error” has been put forward as the basis for reopening, an appeal places before the courts precisely the same substance that could have been brought there by appeal from the original order.... For these reasons, we agree with the conclusion reached in an earlier case by the Court of Appeals that, where a party petitions an agency for reconsideration on the ground of “material error,” i.e., on the same record that was before the agency when it rendered its original decision, “an order which merely denies rehearing of ... [the prior] order is not itself reviewable.” Microwave Communications, Inc. v. FCC, 169 U.S.App.D.C. 154, 156, n. 7, 515 F.2d 385, 387 n. 7 (1974).

482 U.S. at 278-80, 107 S.Ct. at 2365-67 (citations omitted). Here, there was no claim of newly discovered evidence. The affidavit submitted in support of the motion for reconsideration was not newly discovered evidence and was considered by the Board for the limited purpose of determining whether plaintiff had alleged a clear error of fact or law.

The Court agrees with defendant’s position that it should not consider plaintiff’s argument that the OCSE’s statistical sampling methodology was invalid for failure to comply with the rulemaking requirements of the APA, because plaintiff failed to raise this issue before the Board prior to its final decision on plaintiff’s appeal. The Court likewise agrees with defendant’s position that the Court does not have jurisdiction to review the Board’s denial of plaintiff’s request for reconsideration.

Plaintiff argues in the alternative that even if the Court should find that it does not have jurisdiction to review the Board’s denial of its request for reconsideration of Decision No. 1202 that the Court should supplement the administrative record in this case by including State’s Exhibit V, the affidavit of its expert, Dr. Moeschberger, submitted in support of the motion for reconsideration. This alternative motion seems to reflect the true motive of the State in attaching Dr. Moeschberger’s affidavit to the motion for reconsideration, namely, to attempt to bring into the record additional evidence from its expert which it could have offered but did not offer in the original proceeding before the Board. The Board itself rejected this attempt to supplement the record:

To the extent that the State is improperly trying to supplement the record for our decision, however, we agree with OCSE that this is inappropriate. The purpose of a reconsideration proceeding is not to allow a party an opportunity to submit evidence which it could have offered before ...
The State did not claim that its affidavit (by a statistical sampling expert) is newly discovered evidence. Indeed much of the affidavit simply reiterates the expert’s opinions from his previous affidavits or expresses his opinion on our legal conclusions, which does not constitute evidence on any disputed fact. The affidavit also attempts in part to correct insufficiencies we noted in the expert’s earlier affidavits. These statements could have been submitted before, but were not, and therefore are not properly considered as part of the record for our decision.

Ruling on request for reconsideration, page 2.

This Court agrees with the Board’s characterization of the affidavit and denies plaintiff’s request to supplement the record. Defendant’s motion to dismiss the third and eighth causes of action is well taken; likewise, defendant’s motion to strike Exhibits V and EE is well taken, and those motions are hereby granted.

STANDARD OF REVIEW

The standard for judicial review of a final agency decision is whether the decision is arbitrary, capricious, an abuse of discretion, or otherwise not in accordance with law. 5 U.S.C. § 706(2)(A). The Court must defer to the Secretary’s construction of the statute and regulations and uphold the decision so long as the Board stated the factual and legal bases for its decision and made no clearly erroneous judgment. Citizens to Preserve Overton Park v. Volpe, 401 U.S. 402, 91 S.Ct. 814, 28 L.Ed.2d 136 (1971); State of Michigan, Department of Social Services v. Schweiker, 563 F.Supp. 797 (W.D.Mich.1983), aff'd, 744 F.2d 32 (6th Cir.1984). “The ultimate standard of review is a narrow one and the court is not empowered to substitute its judgment for that of the agency.” Volpe, 401 U.S. at 415-16, 91 S.Ct. at 823-24. It is not the function of the Court to review the agency’s proceedings de novo; instead, it must confine its factual inquiry to the administrative record. Camp v. Pitts, 411 U.S. 138, 142, 93 S.Ct. 1241, 1244, 36 L.Ed.2d 106 (1973); United States v. Carlo Bianchi & Co., 373 U.S. 709, 715, 83 S.Ct. 1409, 1413-14, 10 L.Ed.2d 652 (1963).

Plaintiff asserts seven grounds for reversing the Board’s decision. The Court will consider them seriatim.

1. Retroactive Application of Regulations.

For both the program results audit of FY 1984 and the follow up audit of 1987, Ohio was found to be out of compliance with the audit criteria set forth in 45 C.F.R. § 305.20(a)(1) and (2). The audit criteria contained in these regulations were promulgated as final regulations on October 1, 1985 but made effective by HHS retroactively to apply to federal FY 1984.

The Child Support Enforcement and Paternity Establishment Program under Title IV-D has been in existence since July 1975. Under § 403(h) of the Act as enacted in 1975, a state was subject to a 5 percent reduction of its AFDC funds if an audit found that the state was not in full compliance with statutory and regulatory requirements. OCSE began performing compliance audits after December 31, 1976 but Congress enacted a moratorium on the 5 percent penalty and extended the moratorium several times. Thus, during the first eight years of the program’s operation, no state actually had its AFDC funding reduced.

On August 16, 1984 Congress adopted the Child Support Enforcement Amendments of 1984 which changed the standard of compliance from full compliance to substantial compliance and provided for graduated reductions starting with a reduction of not less than 1, nor more than 2 percent and increasing to a maximum of 5 percent with each consecutive finding that a state is not complying substantially with Title IV-D requirements. Section 9C of the 1984 amendments provides that they “shall be effective on and after October 1, 1983.” OCSE proposed regulations implementing the amendments on October 5, 1984 and issued final regulations on October 1, 1985. The 1985 regulations provided that for the FY 1984 audit period certain listed audit criteria related primarily to administrative or fiscal matters “must be met” and that the procedures required by nine audit criteria related to basic services provided under a Title IV-D plan “must be used in 75 percent of the cases reviewed for each criterion ...” All of these audit criteria were based on sections of 45 C.F.R. part 305 which were originally published in 1976, with minor amendments in 1982.

Thus, under the 1985 regulations, substantial compliance for FY 1984 audits was measured by audit criteria from the existing regulations, but a state had to be providing the required services in 75 percent of the cases requiring them. In follow-up review after a corrective action period, OCSE would examine only the audit criteria that the state had previously failed or had complied with only marginally (that is, in 75 to 80 percent of the cases reviewed for that criterion). 45 C.F.R. 305.10(b) and 305.99, as amended.

Decision, pp. 2, 3.

The audit of Ohio’s IV-D program for FY 1984 resulted in a finding that the State had failed to comply substantially with the requirements of Title IV-D. OCSE found that the State had failed to meet the 75 percent standard for three audit criteria: “establishing paternity,” “support obligations,” and “state parent locator service.” OCSE further found that the State had only marginally met two additional criteria: “enforcement of support obligations” and “individuals not otherwise eligible.” The State proposed a corrective action plan which OCSE approved. After the one year corrective action period, OCSE conducted a follow up review for the period July 1, 1987 through June 30, 1988 and found that the State had failed to achieve substantial compliance with any of the previously unmet audit criteria. This finding led to the reduction notice which is the subject of this appeal. Decision, p. 4.

Congress expressly made the substantial compliance standard applicable to FY 1984 audits by requiring reductions for states not found to be in substantial compliance in audits “for any quarter beginning after September 30, 1983” and by explicitly making the 1984 amendments effective on October 1, 1983, the first quarter of federal FY 1984. 42 U.S.C. § 603(h)(1). The contested regulation provides that in order to be found in “substantial compliance” for FY 1984, a state must be found to be using the procedures required by each of the audit criteria in at least 75 percent of the cases reviewed by the OCSE auditors. The change from full compliance to substantial compliance is more lenient than the former law, thus, the amendment was favorable to the State. Congress did not define substantial compliance, thus, requiring the Secretary to define it by regulation. Clearly, Congress intended that the Secretary’s regulations defining the new standard would be retroactive so as to give the states the benefit of the new, more lenient standard which Congress chose to apply retroactively. This is tantamount to an express grant of retroactive rulemaking authority. See Bowen v. Georgetown University Hospital, 488 U.S. 204, 109 S.Ct. 468, 102 L.Ed.2d 493 (1988).

The audit criteria themselves had been in existence without substantial change since 1976. Furthermore, the corrective action period of the 1985 regulations operated to give the State over two years to bring its administrative practices within the 75 percent standard. The Secretary’s interpretation of substantial compliance to require adherence to the audit criteria in 75 percent of the cases reviewed is not clearly erroneous. Neither is the Secretary’s decision to apply the new regulation to FY 1984.

2. Failure to Follow Standardized Audit Procedures.

Plaintiff contends that OCSE auditors did not follow sampling procedures set forth in published audit guides and that the Board’s failure to reverse the disallowance penalty on this basis was arbitrary, capricious and an abuse of discretion. Neither Title IV-D nor its implementing regulations requires OCSE to employ any particular procedures or methodologies when conducting program results audits or follow up reviews. Nor does it require the Secretary to adopt any standardized audit procedures. Although OCSE has provided its auditors with audit guidelines, they are not binding rules but only guidelines.

The State asserts two specific instances of failure to follow the audit guidelines: first, the review of only fifty-five cases for the “support” criterion when the program results audit guide, according to the State, requires a minimum of one hundred cases; and second, the use of a multi-stage sampling process in the follow up audit instead of the single stage sampling process referred to in the follow up review guide. Each of these departures from the audit guide involves an exercise of professional judgment on the part of the auditors which was entirely consistent with the audit guides and reasonable in light of the circumstances.

The audit guide did not require the review of one hundred cases. Instead, it listed that as one of two alternate courses of action to be followed when a probe sample showed that a state had failed to meet the substantial compliance standard for one or more of the nine functional criteria. Under these circumstances, an auditor could expand the probe sample to one hundred cases or terminate further review of the criteria involved. The latter course of action “should be considered when the state’s performance for any criterion could not be raised to a passing score by expanding the sample for that criterion to a minimum of one hundred cases” (R5-53). The Board found that the circumstances indicated that the State’s performance for this criterion could not be raised to a passing score by expanding the sample to a minimum of one hundred cases. The State could pass the support obligations criterion only if it had taken action in all forty-five of the additional cases, a highly improbable outcome since it had taken action in only thirty-one of the fifty-five cases already reviewed. The Board also found that the failure to expand the probe sample for the support criterion was justified in view of the fact that the State had failed to meet the 75 percent standard for two other audit criteria and thus, could not be found in substantial compliance regardless of whether it passed the support obligations criterion. The Board’s conclusions in this regard were not arbitrary, capricious or an abuse of discretion.
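The Board's conclusion that a passing score was "highly improbable" reduces to simple arithmetic on the figures recited above. As an illustrative sketch only, using the numbers stated in the opinion:

```python
# Probe-sample arithmetic for the "support obligations" criterion,
# using the figures recited in the opinion (illustrative only).
cases_reviewed = 55      # probe sample actually reviewed
action_cases = 31        # cases in which the required action was taken
expanded_sample = 100    # minimum expanded sample under the audit guide
threshold = 0.75         # substantial-compliance standard

additional_cases = expanded_sample - cases_reviewed   # 45 cases not yet reviewed
needed_to_pass = threshold * expanded_sample          # 75 action cases required
best_possible = action_cases + additional_cases       # even if every one passed

# The State could pass only if all 45 additional cases were action cases
# (yielding 76 of 100, barely above 75), while its observed rate was
# 31/55, roughly 56 percent.
print(additional_cases, needed_to_pass, best_possible)
```

On these numbers, passing required a perfect result in every unreviewed case, which is the improbability the Board relied on in declining to expand the sample.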

OCSE’s departure from the follow up audit guide in using a multi-stage sampling methodology was necessitated by the State’s failure to provide OCSE with a complete and accurate listing of its Title IV-D case load; without this list, the auditors could not draw the statewide sample necessary for single stage sampling. Again, the Board concluded that the auditors’ exercise of professional judgment in resorting to a multi-stage sampling procedure was reasonable under the circumstances and its finding in this regard was neither arbitrary, capricious nor an abuse of discretion.

3. Failure to Adopt Statistical Methodology Through APA Rulemaking.

The Court has held that plaintiff is precluded from arguing that the disallowance penalty must be reversed on the ground that the statistical methodologies used by OCSE in its audits were not adopted pursuant to APA rulemaking procedures, because plaintiff did not raise this issue before the Board. Assuming arguendo that this issue is properly before the Court, the Court finds that OCSE’s statistical sampling methods are not subject to APA rulemaking because they are not legislative or substantive rules and are thus exempt from notice and comment rulemaking requirements. 5 U.S.C. § 553(b)(A) exempts “interpretive rules, general statements of policy, or rules of agency organization, procedure or practice.” These audit methodologies are simply a means of gathering and analyzing evidence on the issue of whether a state has achieved substantial compliance; they do not establish substantive rights or responsibilities, define a standard of conduct or level of care or create or alter any basic program compliance responsibilities. This distinguishes OCSE’s statistical methodologies from the nursing home survey procedures that were held to be subject to APA rulemaking in Estate of Smith v. Heckler, 747 F.2d 583 (10th Cir.1984). Furthermore, the sampling methodologies are unlike the inflexible method for computing employment statistics which the Department of Labor was required to adopt by statute in the case of Batterton v. Marshall, 648 F.2d 694 (D.C.Cir.1980).

4. Error in Permitting OCSE to Recalculate Audit Findings.

In both the program results audit for FY 1984 and the follow up review, OCSE used statistical sampling techniques to determine whether the State had met the 75 percent standard for those audit criteria to which the standard applied. OCSE drew a random sample of the State’s Title IV-D cases for each relevant time period. The process is described in detail in the Board’s decision beginning at page 9 and is discussed in greater length in the next section of this opinion. The State did not attack the use of sampling in general as a basis for determining whether the State had met the audit criteria, nor did the State challenge OCSE’s findings in specific sample cases but raised a number of issues regarding the sampling methods initially used by OCSE. The State supported its position with an affidavit and report from a statistical sampling expert. In responding to the State’s criticism of its statistical methodology OCSE recalculated its results. In reply, the State submitted a second affidavit and report from its expert. The Board then required OCSE to comment on the State’s expert’s opinion. In response, OCSE presented another affidavit and report from its expert in which he submitted additional calculations which supported OCSE’s findings. The State then asked for and received an opportunity to respond and thereafter submitted a third affidavit from its expert, together with another detailed report. The Board carefully reviewed all of this expert testimony and found that OCSE’s expert was more credible and concluded

Based on our examination of the record as a whole, we conclude that OCSE has shown, with the requisite degree of certainty, that the State did not meet the audit criteria at issue here.

Decision, page 17.

The State claims that the Board’s inquiry should have been limited to OCSE’s original calculations and that OCSE should not have been permitted to recalculate the audit findings in the midst of the appeal. The Court finds this argument to be without merit. The Board was reviewing OCSE’s finding that the State was not in substantial compliance with the requirements of Title IV-D and the evidence supporting that finding. The evidence included audit data, which was not challenged by the State. The audit data consisted of the results of random sampling which had to be interpreted by a statistical formula or methodology. The State challenged the methodology originally used by OCSE, and during the proceedings before the Board OCSE refined its statistical analysis. The raw data never changed; only the method of interpreting it did. The State was given a full and adequate opportunity to challenge OCSE’s alternate methodologies and to present its own methodologies and calculations. After analyzing the various methodologies, the Board concluded that the evidence supported the finding that the State was not in substantial compliance with the requirements of Title IV-D. The Board’s conclusion is supported by the evidence and is not clearly erroneous.

5. Error in Finding Ohio’s Child Support Program to be Not in Substantial Compliance With Federal Regulations.

The State claims that there was not sufficient evidence to support the Board’s finding that Ohio was not in substantial compliance with the requirements of the Act. The State challenges the quality of the statistical evidence relied on by the Board in arriving at its finding that Ohio was not in substantial compliance. As noted, OCSE used statistical sampling techniques to determine whether the State met the 75 percent standard for those audit criteria to which that standard applied. OCSE then used the sample findings to calculate an “efficiency rate” and an “efficiency range” for each audit criterion. The “efficiency rate” is the estimate of the percentage of cases requiring review under an audit criterion which are “action” cases. (An action case is one in which the agency took the action required by the criterion.) The “efficiency range” is a measure of the reliability of the statistical calculation, otherwise known as the “confidence interval.” The confidence interval is the range of values within which the statistician can say, with a 95 percent degree of certainty, that the true value lies. The Board held that the 95 percent confidence level was the appropriate standard. Under OCSE’s audit procedures, a criterion was considered unmet only if the upper limit of the confidence interval was less than 75 percent.
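The efficiency-rate and efficiency-range calculation described above can be sketched briefly. The sketch below uses the standard large-sample formula for a proportion under simple random sampling and is offered only as an illustration with invented sample counts; OCSE’s actual audits used two-stage stratified sampling, whose variance computation differs.

```python
import math

def confidence_interval(action_cases, sample_size, z=1.96):
    """Large-sample 95% confidence interval for an 'efficiency rate'.

    Simple-random-sampling formula, for illustration only; OCSE's audits
    used two-stage stratified sampling with a different variance estimate.
    """
    rate = action_cases / sample_size                # point estimate
    se = math.sqrt(rate * (1 - rate) / sample_size)  # standard error
    return rate - z * se, rate + z * se              # efficiency range

# Hypothetical sample: 60 action cases out of 100 cases requiring review.
low, high = confidence_interval(60, 100)
print(f"efficiency range: {low:.3f} to {high:.3f}")

# A criterion was considered unmet only if the UPPER limit fell below
# 0.75, giving the state the benefit of the doubt within the interval.
print("criterion unmet" if high < 0.75 else "criterion not shown unmet")
```

On these invented figures the upper limit is below 75 percent, so the criterion would be treated as unmet even under the benefit-of-the-doubt rule.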

As the Board pointed out at page 10 of its decision, this approach erred on the side of passing a state where a complete review might well have identified a failure. In other words, it gave the state the benefit of the doubt within the range of values included in the 95 percent confidence interval.

The State raised issues concerning the method OCSE used to calculate the efficiency rates and efficiency ranges for the audit criteria. The State initially contended that OCSE had erred by using a method for determining efficiency rates and ranges which was appropriate for simple random sampling, but not appropriate for two-stage sampling with stratification. The State’s expert cited a treatise on statistical sampling regarding this type of sampling method. OCSE responded by recalculating its audit results using two formulae based on the treatise cited by the State’s expert. These were referred to as Methodology #1 and Methodology #2. The State responded with a second affidavit from its expert contending that Methodology #1 was completely inappropriate and that Methodology #2, while appropriate, used values which had not been properly calculated. He explained how those values should be replaced and offered alternative formulae for calculating efficiency rates. The Board then issued an order requiring OCSE to respond to the State’s expert’s opinion. OCSE then presented another affidavit and report from its expert which did not deny that Methodology #1 was inappropriate, but defended Methodology #2. The OCSE expert nevertheless again recalculated the efficiency rates and efficiency ranges using a procedure known as “ratio estimation.” OCSE’s expert made alternative calculations (Methodologies #2a and #3) and prepared scatter diagrams or graphs supporting the assumptions underlying his correlation analysis. The State then asked for and received an opportunity to respond and submitted a third affidavit from its expert, together with a detailed report. While conceding that OCSE’s new methodologies partially corrected for the previous errors, the State’s expert asserted that there were still problems which had not been corrected.
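The “ratio estimation” procedure the experts debated can be illustrated with a textbook sketch. The cluster counts below are invented for illustration and are not drawn from the audit record; the estimator shown is the classical one, not necessarily OCSE’s exact implementation.

```python
def ratio_estimate(y, x, X_total):
    """Classical ratio estimator (textbook sketch, not OCSE's exact method).

    y[i]: 'action' cases found in sampled cluster i
    x[i]: cases requiring review in sampled cluster i
    X_total: known total of cases requiring review in the population
    """
    r = sum(y) / sum(x)       # estimated efficiency rate
    return r, r * X_total     # rate, and estimated total action cases

# Hypothetical three-cluster sample from a population of 1,200 cases.
rate, total = ratio_estimate(y=[12, 30, 9], x=[40, 60, 20], X_total=1200)
print(f"estimated rate: {rate:.3f}, estimated action cases: {total:.0f}")
```

The estimator exploits the correlation between a cluster’s total caseload and its count of action cases, which is why the Board attached significance to the high correlation coefficients in OCSE’s scatter diagrams.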

Thus, through a series of exchanges in which the statistical experts for each side presented their positions and critiqued their counterparts, the Board developed an ample record upon which to judge the credibility of OCSE’s statistical evidence. The Board’s analysis of the evidence is cogent and persuasive:
The issue here is properly viewed as an evidentiary question: whether the sample findings are reliable evidence that the State did not meet the 75 percent standard for any of the three criteria. The State did not dispute OCSE’s findings about the sample cases. To evaluate this evidence, we must determine what inferences can validly be drawn from the sample case findings, in accordance with principles of statistical sampling....
The issue here focused on the method used for calculating the 95 percent confidence interval, since OCSE had chosen to adopt that degree of certainty for its findings. Based on our examination of the record as a whole, we conclude that OCSE has shown, with the requisite degree of certainty, that the State did not meet the audit criteria at issue here. First, we find that OCSE Methodologies # 2 and # 3 are valid methods, of a type which would ordinarily be relied on by statisticians. More important, as we discuss in detail later, we find that the limited modifications ultimately proposed by the State’s expert would not result in a finding of substantial compliance.
OCSE’s expert was well-qualified and persuasively attested to the validity of the methods OCSE used, providing supporting analyses. While the State’s expert was also well-qualified, we find his third affidavit to be inadequate to rebut OCSE’s expert’s opinion for the following reasons:
The State’s expert described the assumptions underlying OCSE Methodology #2 as requiring a constant relationship between total number of cases requiring review for each criterion and the total caseload (which he called a “deterministic” relationship). See note 5 above. OCSE’s expert, however, had said the underlying assumption which rendered the method valid was merely that there was a high positive correlation between the two numbers. The State’s expert did not even acknowledge this difference of opinion; thus, he provided no supporting analysis, or citation to a statistical treatise, which would give greater credence to his opinion about what was required. Contrary to what the State’s expert implied, the particular assumption described by OCSE’s expert is not discredited by the correlation analysis. While the correlation is not constant, OCSE’s expert attested that only a high positive correlation is required. The analysis shows that the correlation coefficient for the criteria at issue here was from +.9275 to +.9869 (where +1 would represent an exact positive correlation). OCSE SAF, pp. 433, 504-509. In other words, the correlation does show that it is logical to assume that the larger the total number of cases, the larger the number of cases requiring review for each criterion.
The State’s expert did not deny the assertion by OCSE’s expert that the ratio estimation technique used in OCSE Methodology # 3 is substantially the same as the estimation procedure proposed by the State’s expert in his second affidavit. OCSE SAF, p. 431; State SAF, pp. 333-34. This undercuts the statement in the State’s expert’s third affidavit that OCSE should have used a more sophisticated formula, even though that formula cannot readily be found in statistical sampling texts.
Even if we were to give more weight to the opinion of the State’s expert than to the opinion of OCSE’s expert, however, we would not accept his opinion stating the legal conclusion that “[u]ntil the correct analysis is put forward, ... no one can conclude that Ohio is not in compliance.” State SAF, p. 396. We find that the further modifications he proposed would not show that the State was in substantial compliance. The State’s expert did not himself perform any calculations using what he called the “correct analysis.” Thus, his affidavit at most establishes that such an analysis would “widen” the confidence interval for each criterion; he did not express the opinion that any such widening would be substantial or would have a reasonable possibility of resulting in findings that the 75 percent standard was met for any criterion, nor can we infer from his affidavit that he held this opinion.
Moreover, other evidence in the record establishes to the contrary that such widening would not be sufficient to result in a finding that the State met the audit criteria in the follow-up review—
The State’s expert acknowledged that the efficiency rates were correctly calculated under OCSE Methodology # 3; thus, we can focus on the high ranges of the efficiency ranges (the upper limits of the 95 percent confidence intervals) calculated using that method. OCSE’s calculations showed the following as the upper limits based on sample results of the follow-up review: 45.6 percent of the cases requiring review for “establishing paternity;” 56.1 percent for “support obligations;” and 59.0 percent for “state parent locator service.” OCSE SAF, p. 512. This means that, for the State to achieve the 75 percent standard for each criterion, the confidence intervals would have to increase from the amounts calculated using OCSE Methodology # 3 by 29.4 for “establishing paternity,” 18.9 for “support obligations,” and 16.0 for “state parent locator service.”
The increases required to widen the confidence intervals sufficiently would be many times the standard errors OCSE calculated using its Methodology # 3 (admittedly based on a commonly used ratio estimation technique). For example, the standard error for “establishing paternity” using Methodology # 3 is .0221 and would have to increase by .15 to widen the confidence interval enough to make a difference here. (As explained above, to determine the upper limit you add to the efficiency rate — here 41.3 percent — 1.96 times the standard error.) OCSE SAF, p. 512. Since the State’s expert acknowledged that OCSE had moved in the right direction in its Methodology # 3, it is logical to conclude that the further modifications to the ratio estimator the State’s expert said were necessary would not result in increasing a standard error by over six times the amount of the standard error calculated using Methodology # 3.
Based on our analysis, we conclude that the record supports a finding, with the 95 percent degree of confidence, that the State did not meet the 75 percent standard, for the three criteria at issue here, in the follow-up review.

The Board’s analysis of the evidence and its conclusions are supported by the record and are not clearly erroneous.
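The arithmetic in the passage quoted above can be checked directly. The sketch below uses only the figures stated in the Board’s decision for the “establishing paternity” criterion (efficiency rate 41.3 percent, standard error .0221 under Methodology #3, and the 1.96 multiplier for a 95 percent confidence interval):

```python
# Checking the Board's arithmetic for "establishing paternity,"
# using the figures quoted from the decision (Methodology #3).
Z = 1.96        # multiplier for a 95 percent confidence interval
rate = 0.413    # efficiency rate (41.3 percent)
se = 0.0221     # standard error under Methodology #3
target = 0.75   # the substantial-compliance standard

# Upper limit of the confidence interval: rate + 1.96 * standard error.
upper = rate + Z * se
print(f"upper limit: {upper:.1%}")  # matches the 45.6 percent the Board cites

# Standard error needed for the upper limit to reach 75 percent.
se_needed = (target - rate) / Z
print(f"standard error needed: {se_needed:.4f}")
print(f"increase required: {se_needed - se:.2f}")        # the Board's .15
print(f"multiple of Methodology #3 SE: {se_needed / se:.1f}x")
```

The required standard error is more than six times the one actually calculated, which is the gap the Board found implausible for the State’s proposed refinements to close.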

6. Error in Placing the Burden of Proof on the State

The State contends that the Board’s decision was “rested upon an underlying assumption that the State had the burden of proving itself in compliance with the child support regulations.” The Court concludes, however, that the Board placed the burden of proof on the OCSE. In the summary of its findings appearing on page 1 of its decision, the Board states:

The statistical sampling evidence submitted here reliably shows that the State failed to achieve “substantial compliance;”

This statement indicates that the Board reviewed the evidence to determine whether or not it did “reliably show” that the State failed to achieve the requirements of the statute. This indicates that the burden of proof was placed on the OCSE. The manner in which the Board analyzed the evidence in its lengthy decision reinforces the conclusion that it placed the burden of proof on the OCSE. At the bottom of page 10 of its decision, the Board states:

We discuss each of these issues below, and explain why we conclude that OCSE has established here with the requisite statistical validity and reliability that the State failed to achieve substantial compliance.

(emphasis added). It is reasonable to conclude from this statement that the Board felt that OCSE had the burden of establishing that the State had failed to achieve substantial compliance. If the Board had placed the burden of proof on the State, it would instead have said that the State had failed to establish the unreliability of OCSE’s determination. The manner in which the Board required OCSE to respond to the criticisms raised by the State’s expert likewise indicates that the Board placed the burden of proof on OCSE. At page 15 of its decision, the Board framed the issue regarding the statistical evidence as follows:

The issue here is properly viewed as an evidentiary question: whether the sample findings are reliable evidence that the State did not meet the 75 percent standard for any of the three criteria.

This again indicates that the focus of the Board’s inquiry was whether the evidence supported OCSE’s findings. At page 17 of its decision, the Board states:

Based on our examination of the record as a whole, we conclude that OCSE has shown, with the requisite degree of certainty, that the State did not meet the audit criteria at issue here.

Again, the Board clearly placed the burden on OCSE to show that the State did not meet the audit criteria.

In asserting that the Board placed the burden of proof on the State, the State relies on a statement contained in footnote 10 appearing on pages 18 and 19 of the Board’s decision. In this footnote, the Board commented on the failure of the State’s expert to perform any calculations using his version of the “correct analysis.” The footnote reads as follows:

10. The State’s expert attempted to justify this by asserting that he lacked sufficient time. However, the original problem had been presented to him at least nine months before his last affidavit. Moreover, the State did not ask for any extension of time for its last submission although the Board had liberally granted earlier extensions based on the State’s need to consult its expert. Although the State had originally requested a hearing, the State withdrew this request and did not renew it.
Finally, we note that the Board had previously pointed out that the burden of performing recalculations could be minimized by focusing on the criterion (“establishing paternity”) for which the raw sample data was least favorable to the State. Yet, the State’s expert did not even go as far as expressing what the likelihood would be that a correct analysis would make any difference with respect to this criterion. OCSE’s expert had expressed his professional opinion that a statistician has an obligation, when he raises a question about a technique, to address the question of whether any alleged incorrect calculation is material to the decision to be based on it. OCSE SAF, p. 430. The State’s expert did not dispute this professional opinion, but the State responded by arguing that OCSE had the burden to do any required recalculations because of the earlier mistakes it had made and because it had the statutory duty to determine whether a state was substantially complying. In our view, the State at the very least had the burden to show that recalculations were required and might make a difference in the ultimate conclusion, especially given the raw data here.

This footnote does not support the State’s contention that the Board placed the burden of proof on the State. Instead, it appears that the Board was simply demonstrating how it viewed the evidence, specifically the inferences it drew from the failure of the State’s expert to provide calculations supporting his theories.

The Court concludes that the Board did not place the burden of proof on the State and in any event, the Court concludes that the record as a whole supports a finding that the OCSE satisfied its burden of showing that the State had failed to achieve substantial compliance.

7. Error in Finding That the 75 Percent Standard Reasonably Interprets the Statutory Term “Substantial Compliance”

The State argues that 45 C.F.R. § 305.20(a), which interprets the statutory term “substantial compliance,” is arbitrary and capricious. The regulation states that in order to be found to have an effective program in substantial compliance with the requirements of Title IV-D of the Act, the procedures required by the audit criteria must be used in 75 percent of the cases reviewed for each criterion. The State contends that the 75 percent standard is invalid because it is not supported by a sufficient administrative record, has no empirical basis, and is arbitrary and capricious. The Secretary’s interpretation of the term “substantial compliance” is entitled to substantial deference. Indeed, since Congress itself did not expressly define the term, the Secretary’s interpretation must be upheld so long as it is “rational and consistent with the statute.” Sullivan v. Everhart, 494 U.S. 83, 89, 110 S.Ct. 960, 964-65, 108 L.Ed.2d 72 (1990) (citation omitted). The Child Support Enforcement Amendments of 1984 incorporated a change from a “full compliance” to a “substantial compliance” standard. As the Board aptly noted in its decision, the term substantial compliance must be read in light of § 403(h)(3) of the Act, which permits a finding of substantial compliance only when any noncompliance is of a technical nature. Clearly, Congress intended that substantial compliance be something more than a minimal standard. In its usual and customary meaning, “substantial” means being largely, but not wholly, that which is specified. Webster’s Ninth New Collegiate Dictionary (1986). The Court concludes that the Secretary’s interpretation is reasonable and consistent with statutory intent.

The Board found that there was an empirical basis for the Secretary’s interpretation of substantial compliance in past performance levels measured through OCSE’s audits:

While audit results from FY’s 1980 and 1981 showed that some states were not yet achieving 75 percent levels, other states were achieving 100 percent levels at that time, and OCSE could reasonably expect all states to be achieving 75 percent levels by FY 1984.

(Decision, p. 8). The Board further noted that the report on audits for FY’s 1984 and 1985 (State’s Exhibit H) shows that 21 states or territories met all the criteria initially and at least 15 other states met them after a corrective action period. Thus, the Court concludes that if an empirical basis was needed for the Secretary’s interpretation of substantial compliance, the evidence shows that such a basis existed.

CONCLUSION

Defendants’ motion to dismiss the third and eighth causes of action of the complaint is granted; defendants’ motion to strike from the complaint the allegations contained in Paragraphs 54, 57, 58, and 66 and to strike Exhibits Y and EE is granted; plaintiff’s motion to supplement the administrative record is denied; plaintiff’s motion for preliminary injunction is denied; plaintiff’s motion for summary judgment is denied; defendants’ motion for summary judgment is granted and the Clerk shall enter final judgment in favor of the defendants dismissing plaintiff’s complaint with prejudice at plaintiff’s costs.

It is so ORDERED.  