Q1.1: Do you have any concerns on the proposed collection of data on conservatism in the PD and LGD estimates? In particular as regards the breakdown into MoC A, B and C?
NA
Q1.2: What is, in your view, the appropriate level for assessing the risk exposure or RWA add-ons imposed due to deficiencies in the IRB approach?
NA
Q1.3: Do you agree to the voluntary collection of the information for LDP portfolios?
NA
Q1.4: What are the main challenges for institutions in this regard?
NA
Q2.1: For which kind of portfolios would you expect that outdated ratings (or other missing information hindering the annual re-rating) are a material driver of variability when comparing institutions' RWA on homogeneous benchmarking portfolios?
NA
Q2.2: Assuming the aspect is a material driver of variability when comparing institutions' RWA, do you have suggestions or preferences for the data collection on conservatism in application?
NA
Q2.3: Do you see any major technical restrictions in providing these data points? If yes, which?
NA
Q3: Do you agree that the added BM portfolios will serve the purpose of providing a full breakdown of COREP exposure classes into FINREP sectors?
NA
Q4: Which obstacles hinder the reporting of homogeneous portfolios in terms of annual turnover as specified in Annex I? Does this lead to exclusion of a material share of the IRB portfolio?
NA
Q5: Would you be able to report the hypothetical LGDs as described above?
NA
Q6: Would you be able to report the hypothetical LGD IRB without conservative adjustments unsecured as described above?
NA
Q7: Do you see the need to collect weights of economic scenario per time horizon?
NA
Q8: Do you see any issues or lack of clarity in the definition of the data points of templates C 106.01 and C 120.01? Do you see any issues in the format of the templates C 106.01 and C 120.01 to report all relevant risk factors and sensitivities for the SBM in an appropriate way?
The definitions of the data points in the templates are clear.
Q9: Do you agree with the proposed format for the collection of OFR data for the SBM in templates C 120.02 and C 120.03?
The proposed format is suitable for the collection of own funds requirements data for the sensitivities-based method.
Q10: Do you agree with the two proposed points in time for the collection of sensitivity data in relation to the ASA? Do you agree with the proposed point in time for the collection of OFR data? How significant do you deem the additional reporting burden if the collection was extended to additional days in the risk measurement period?
We agree with the proposals. For institutions that have invested in the technology to build the appropriate infrastructure, extending the collection to additional days in the risk measurement period should not impose a significant additional reporting burden.
Q11: Do you agree with the proposed collection of ASA sensitivity data and own funds requirements data in both the instrument / portfolio base currency specified in the ITS and the institution’s own reporting currency?
Yes, we believe this approach is reasonable for institutions that operate across multiple currency zones.
Q12: Do you see any issues or lack of clarity in the definition of the changes and updates introduced in the list of instruments and portfolios of Annex 5?
No
Q13: Which types of instruments, specific risks, etc. play a particularly important role in your portfolio but are misrepresented / underrepresented in the EBA portfolio?
None
Q14: Which instruments, risk factors and portfolio constellations are considered particularly relevant for benchmarking the ASA and should be included in the benchmarking portfolio (distinguishing by SBM, DRC and RRAO)?
Delta and curvature risks for foreign exchange risk factors when determining own funds requirements under the base currency approach.
Q15: Do you currently make use of any industry standards to exchange instrument specifications in a standardised way? If yes, which standard or standards are most relevant?
We make use of the FIX, FpML and SWIFT standards.
Q16: Would you deem additional instrument specifications using industry standards beyond the current ITS instructions useful? If yes, how would you use them in the benchmarking exercise?
We would welcome steps towards a common understanding of the information exchanged between staff and systems across different institutions, for example by minimising the use of differing syntaxes and of differing semantics or interpretations of terms. In addition, efforts to unify the many existing standards would reduce economic friction. We believe the incorporation of term sheets, such as those in Section 5 of Annex 5, is reasonable. An illustrative sketch of the kind of standardised instrument specification we have in mind is given below.
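By way of illustration only, the following minimal sketch shows how a standardised, FpML-style instrument specification could be mapped onto a single internal representation so that different institutions derive the same terms from the same file. The instrument identifier, field names and values below are hypothetical assumptions for this example and are not taken from the ITS, Annex 5 or any existing standard.

```python
# Illustrative only: a hypothetical, FpML-style instrument specification.
# Identifiers, field names and values are assumptions, not ITS content.
import xml.etree.ElementTree as ET

SPEC = """
<instrument id="HYPO-0001">
  <productType>InterestRateSwap</productType>
  <notional currency="EUR">10000000</notional>
  <effectiveDate>2024-06-28</effectiveDate>
  <terminationDate>2029-06-28</terminationDate>
  <fixedRate>0.025</fixedRate>
</instrument>
"""

def parse_instrument(xml_text: str) -> dict:
    """Map an agreed XML vocabulary onto one internal representation,
    so every institution reads the same terms from the same file."""
    root = ET.fromstring(xml_text)
    notional = root.find("notional")
    return {
        "id": root.get("id"),
        "product_type": root.findtext("productType"),
        "notional": float(notional.text),
        "currency": notional.get("currency"),
        "effective_date": root.findtext("effectiveDate"),
        "termination_date": root.findtext("terminationDate"),
        "fixed_rate": float(root.findtext("fixedRate")),
    }

if __name__ == "__main__":
    print(parse_instrument(SPEC))
```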
Q17: In your view, which would be the ideal process to integrate such instrument specifications in the benchmarking exercise (e.g. submission of instrument specification to CA for validation, publication of instrument specifications)?
We believe the publication of instrument specifications, together with the ability to submit them to the Competent Authority for validation, would be helpful. Over time, as sufficient data on these approaches becomes available, the industry can form an informed view on whether any fine-tuning is needed.
Q18: Concerning instrument parameters depending on the level of risk factors on the booking date (e.g. strike prices), how helpful would you find additional information on these and which process would you envisage?
Provided the additional information is consistent with existing EBA guidance and helps clarify interpretation and measurement, it would be well received.