Appendix 9 - FAQ & Troubleshooting
Agreement
Is it possible to proceed with the integration process without an Agreement?
Yes, but only in the UAT environment.
Integration with PagoPA Environments
Is it possible to use the same certificate used for the cashback?
No, you need to request a new Certificate.
Is it possible to use the same API Key used for the cashback?
No, you need to request a new API key.
Where can the instructions regarding the generation of the SSL certificates be found?
All the necessary information can be found on the dedicated page.
Where can the instructions regarding the generation of the subscription key be found?
All the necessary information can be found on the dedicated page.
Is a VPN required for integration with PagoPA?
No, every interaction between the Batch Service and PagoPA happens over REST.
How does the Batch Service interface with PagoPA?
It uses HTTPS exclusively.
Batch service configuration
Is the Batch Service code publicly available?
Yes, click here for more details.
How can the Batch Service be customized?
You have to use environment variables.
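As a minimal sketch, the service is configured by exporting the relevant variables before launch. The paths and jar name below are illustrative; the variable names are the ones listed later in this appendix.

```shell
# Minimal sketch: configure the Batch Service via environment variables
# before launching it (paths are illustrative).
export ACQ_BATCH_TRX_INPUT_PATH=/tmp/batch/input
export ACQ_BATCH_OUTPUT_PATH=/tmp/batch/output
echo "input=${ACQ_BATCH_TRX_INPUT_PATH} output=${ACQ_BATCH_OUTPUT_PATH}"
# java -jar batch-transaction-filter.jar   # launch (jar name illustrative)
```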
Is a new version of the Batch Service required, or is it possible to use the one used for Cashback?
A new version of the Batch Service is required; the Cashback version cannot be reused.
What's the required Java version to execute the Batch Acquirer?
The Batch Acquirer is built with Java 1.8 compatibility. However, we test and actively support only Java 11, so we encourage adopting a version 11 JRE.
What are the requirements to build the Batch Service executable from source?
To build the Batch Acquirer from source, the only requirements are a JDK 1.8+ and Maven. However, we strongly encourage using JDK 11, since we test and actively support only Java 11.
Is it possible to run the Batch Acquirer behind an HTTP proxy?
Yes, see the relevant HPAN_SERVICE_PROXY_* options in the section.
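As a sketch, the proxy is configured by exporting the relevant variables before launch. The exact variable names below are assumptions based on the HPAN_SERVICE_PROXY_* prefix; verify them in the configuration section. Host and port are made up.

```shell
# Assumed variable names based on the HPAN_SERVICE_PROXY_* prefix;
# verify the exact names in the configuration section. Host/port are made up.
export HPAN_SERVICE_PROXY_HOST=proxy.internal.example
export HPAN_SERVICE_PROXY_PORT=3128
echo "proxy=${HPAN_SERVICE_PROXY_HOST}:${HPAN_SERVICE_PROXY_PORT}"
```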
Are wire transfers included in the flow?
No
Is it possible to provide encrypted PANs?
No. In the input file the PANs must be provided unencrypted; in the output file they will be hashed.
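For intuition only, the hashing in the output file can be pictured as below. The actual algorithm and any salt are defined by the Batch Service configuration, so SHA-256 here is an assumption and the PAN is made up.

```shell
# Illustrative only: hashing a fake PAN with SHA-256. The actual algorithm
# and any salt are defined by the Batch Service configuration.
pan="4444933143217902"
hash=$(printf '%s' "$pan" | sha256sum | awk '{print $1}')
echo "$hash"
```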
What is the maximum file size supported by the Batch Service?
UAT: ~20MB of output file (~200MB of input file)
PROD: ~2GB of output file
Cases of input file discard by the batch service:
Input file with wrong naming convention
Mismatch between the sender code in the file’s name and the sender code present in the file’s records
Cases of record discard by the batch service:
Parsing error (e.g. unsupported date format)
"Type" and "Mandatory" fields not respected. For more details click here
Cases of output file discard by the Agenzia delle Entrate:
None
Cases of record discard by the Agenzia delle Entrate:
The transaction date is after the transmission date
Incorrect data (e.g. incorrect date format)
Obligation not respected
Tax code does not exist
Do I need to send the already transmitted data again after fixing the fiscal code (error 1206)?
No, it is not necessary to resubmit the data.
How can I verify whether a fiscal code is valid?
You can use the online service by clicking here.
Possible reports on records by the Agenzia delle Entrate:
Tax code not valid on the date of the transaction
VAT number non-existent
VAT number ceased
VAT number and fiscal code inconsistent
In this case the records are not rejected by the Agenzia delle Entrate.
Are there any example files?
Under ops_resources/example_files there are examples of input, PAN list, and output files, all without encryption, matching the configuration expected when calling the remote services (transactions with clear PANs, and a PAN list with the hashing already applied). The output is also defined without encryption. Please refer to the guideline for further details on how to manage the input/output possibilities.
What format is the error return file?
It is a CSV. For more details click here.
What must be done in the event of an error?
It is necessary to correct the merchant's data.
In case of rejection of the record by the Agenzia delle Entrate, is it necessary to proceed with a new submission?
No
Permission errors while reading/writing a file
These errors occur when the Batch Acquirer attempts to read from or write to a file located in a path for which the process lacks permissions.
Double-check that the paths defined by the following variables exist and are writable by the process:
ACQ_BATCH_TRX_INPUT_PATH
ACQ_BATCH_OUTPUT_PATH
ACQ_BATCH_TRX_LOGS_PATH
ACQ_BATCH_HPAN_INPUT_PATH
ACQ_BATCH_SENDER_ADEACK_OUTPUT_PATH
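A quick way to verify this is to check each configured directory from the same account that runs the process. The paths below are placeholders; substitute the values of your ACQ_BATCH_* variables.

```shell
# Quick check that the configured directories exist and are writable.
# Paths are placeholders: substitute the values of the ACQ_BATCH_* variables.
for dir in /tmp/batch/input /tmp/batch/output /tmp/batch/logs; do
  mkdir -p "$dir"
  if [ -w "$dir" ]; then
    echo "OK: $dir is writable"
  else
    echo "ERROR: $dir is not writable" >&2
  fi
done
```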
What does the SkipLimitExceededException mean?
org.springframework.batch.core.step.skip.SkipLimitExceededException: Skip limit of 'N' exceeded
This error occurs when, while processing the transaction records, the number of record errors exceeds the fault-tolerance threshold defined by the following property:
batchConfiguration.TransactionFilterBatch.transactionFilter.skipLimit
The stack-trace following the exception indicates the cause of the error that exceeded the defined limit.
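If a higher tolerance is genuinely acceptable, the property can be overridden at launch; this sketch passes it as a JVM system property (the value 10 and the jar name are illustrative).

```shell
# Sketch: raise the fault-tolerance threshold at launch (value illustrative).
java -DbatchConfiguration.TransactionFilterBatch.transactionFilter.skipLimit=10 \
  -jar batch-transaction-filter.jar
```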
BIN validation error
bin: must match "([0-9]{6}|[0-9]{8})"
This occurs when inserting a Bank Identification Number that does not match the expected length (the standard value is 8 digits, but 6 is currently also allowed). If the value has fewer characters, it must be left-padded with zeros. See the page.
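A zero-padding step can be sketched as follows, assuming the target length is 8 digits (the BIN value is made up):

```shell
# Zero-pad a BIN to the expected length (8 assumed here; value is made up).
bin="123456"
while [ "${#bin}" -lt 8 ]; do bin="0${bin}"; done
echo "$bin"   # 00123456
# Validate against the expected pattern from the error message:
printf '%s\n' "$bin" | grep -Eq '^([0-9]{6}|[0-9]{8})$' && echo "valid"
```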
Missing datasource
Error creating bean with name 'scopedTarget.dataSource' defined in class path resource [org/springframework/boot/autoconfigure/jdbc/DataSourceConfiguration$Hikari.class]: Bean instantiation via factory method failed; nested exception is org.springframework.beans.BeanInstantiationException: Failed to instantiate [com.zaxxer.hikari.HikariDataSource]: Factory method 'dataSource' threw exception; nested exception is java.lang.IllegalStateException: Cannot load driver class: org.postgresql.Driver
This error occurs when a datasource is configured for a database other than the default in-memory one (HSQLDB). In the standard release, the JDBC driver for the specific vendor has to be provided when starting the Batch Acquirer process.
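One common way to supply an external driver to a Spring Boot fat jar is shown below. This is a sketch under the assumption that the jar supports Spring Boot's PropertiesLauncher; the jar names and paths are illustrative.

```shell
# Sketch: provide the PostgreSQL JDBC driver at launch. Assumes the fat jar
# supports Spring Boot's PropertiesLauncher; jar names are illustrative.
java -cp batch-transaction-filter.jar \
  -Dloader.path=/opt/libs/postgresql-jdbc.jar \
  org.springframework.boot.loader.PropertiesLauncher
```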
Logback DBAppender error on Oracle
at java.sql.SQLException: Invalid argument(s) in call at oracle.jdbc.driver.AutoKeyInfo.getNewSql(AutoKeyInfo.java:187) at oracle.jdbc.driver.PhysicalConnection.prepareStatement(PhysicalConnection.java:4342) at ch.qos.logback.core.db.DBAppenderBase.append(DBAppenderBase.java:97) at ch.qos.logback.core.UnsynchronizedAppenderBase.doAppend(UnsynchronizedAppenderBase.java:84) at ch.qos.logback.core.spi.AppenderAttachableImpl.appendLoopOnAppenders(AppenderAttachableImpl.java:51) at ch.qos.logback.classic.Logger.appendLoopOnAppenders(Logger.java:270) at ch.qos.logback.classic.Logger.callAppenders(Logger.java:257) at ch.qos.logback.classic.Logger.buildLoggingEventAndAppend(Logger.java:421) at ch.qos.logback.classic.Logger.filterAndLog_0_Or3Plus(Logger.java:383) at ch.qos.logback.classic.Logger.log(Logger.java:765) at org.apache.commons.logging.LogAdapter$Slf4jLocationAwareLog.info(LogAdapter.java:454)
This error occurs when using an Oracle Database and is caused by a driver version that does not match the database and Java versions in use. Check the list of available drivers and refer to the Logback Appenders documentation.
Oracle error ORA-08177
java.sql.SQLException: ORA-08177: can't serialize access for this transaction
This is a known issue for the Spring Batch Framework. Refer to the note in the Database connection paragraph, and the official Issue Thread.
Cron expression rule error
org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'transactionFilterBatch' defined in URL [jar:file:/app/spe/CAE/semafori/batch-transaction-filter.jar!/BOOT-INF/lib/rtd-ms-transaction-filter-api-batch-1.0-SNAPSHOT.jar!/it/gov/pagopa/rtd/transaction_filter/batch/TransactionFilterBatch.class]: Initialization of bean failed;nested exception is java.lang.IllegalStateException: Encountered invalid @Scheduled method 'launchJob': Cron expression must consist of 6 fields (found 1 in "0")
This error occurs when a scheduled execution is configured with an invalid cron expression. Refer to the related Oracle Guidelines to create a valid expression.
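As the exception states, Spring's @Scheduled cron expressions need 6 fields, not the 5 used by classic Unix cron. A quick sanity check on the field count:

```shell
# Spring cron expressions have 6 fields:
# second minute hour day-of-month month day-of-week.
# Example: run at second 0, minute 0 of every hour.
cron="0 0 * * * *"
fields=$(echo "$cron" | awk '{print NF}')
echo "$fields"   # 6
```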
Validation error
APPLICATION FAILED TO START
Description:
The bean 'jobLauncher', defined in class path resource [it/gov/pagopa/rtd/transaction_filter/batch/TransactionFilterBatch.class], could not be registered. A bean with that name has already been defined in class path resource [org/springframework/batch/core/configuration/annotation/SimpleBatchConfiguration.class] and overriding is disabled.
Action:
Consider renaming one of the beans or enabling overriding by setting spring.main.allow-bean-definition-overriding=true
This error occurs when the default configuration is overridden without including some of the properties required to execute the process. The following properties are required:
spring.batch.job.enabled: false
spring.main.web-application-type: none
spring.main.allow-bean-definition-overriding: true
Please refer to the example in ops_resources/example_config
PAN List validation error
Recovered PAN list exceeding a day
When encountering the message “java.lang.Exception: Recovered PAN list exceeding a day”, the validation of the downloaded PAN list has failed.
This is not a blocking error; to suppress it, set the environment variable ACQ_BATCH_HPAN_LIST_DATE_VALIDATION to false.
Feign error: read timeout
This error generally occurs during the download phase, when calling the endpoint to recover the HPAN list. Due to the file size, a substantial wait is expected, and both the application and any intermediate proxies need to tolerate it. For the batch application, the read timeout can be configured with the property feign.client.config.hpan-service.readTimeout.
Out of memory errors
This error occurs when trying to process the HPAN files without enough memory reserved for the Java process. Try raising the allocated heap space via the variable JAVA_TOOL_OPTIONS (see link).
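For example, the heap limits can be raised by exporting JAVA_TOOL_OPTIONS before launching the process; the sizes below are illustrative and should be tuned to your file volumes.

```shell
# Raise the heap available to the Java process (sizes are illustrative).
export JAVA_TOOL_OPTIONS="-Xms512m -Xmx4g"
echo "$JAVA_TOOL_OPTIONS"
```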
JDBC error: unable to acquire JDBC connection
This error occurs when the connection pool is undersized for the operations to be executed on the database. If it occurs, the suggested action is to enlarge the connection pool via the config property spring.datasource.hikari.maximumPoolSize.
Is it possible to disable polling on the input files folder?
Yes. To achieve this, set the variable ACQ_BATCH_SCHEDULED to false. In this way the Batch Service will process one file per execution.
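In shell form, this is a single export before launching the service:

```shell
# Disable scheduled polling; each execution then processes a single file.
export ACQ_BATCH_SCHEDULED=false
echo "$ACQ_BATCH_SCHEDULED"
```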
Testing
Will the CSTAR-CLI tool (indicated in the UAT test phase) have to be installed on the same machines where the Batch Service will be installed?
CSTAR-CLI contains both shell and python scripts.
An actual installation is needed only for generating input and example output files. This procedure can be executed on a second machine, and the generated files can then be copied to the machine running the Batch Service.
Shell scripts are designed to be lightweight and should be executed directly on the machine running the Batch Service; this ensures accurate results.
If I get a certificate expired error, what should I do?
When encountering the message “To connect to api.uat.cstar.pagopa.it insecurely, use `--no-check-certificate’.” while testing 001-mAuth-check-UAT/script.sh, remove the row that starts with the wget
command and insert the new command below:
The same applies if you encounter the error while testing scripts/002-API-key-check-UAT/script.sh: remove the row that starts with the wget
command and insert:
Certificate renewal
Generate the CSR file from the template as described here
After the request is sent, wait to receive a mail with the signed certificate
Run the scripts to verify PagoPa integration, follow the steps here
Once the scripts pass, prepare the JKS file as described here
Update the environment variables (file setenv):
Restart the Batch Service instance. Make sure that only one instance is running.
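The key and CSR generation step can be sketched with openssl; the subject fields and filenames below are made up, so use the CSR template referenced in this guide for the real request. The JKS preparation is outlined in comments.

```shell
# Sketch of the key/CSR generation step (subject and filenames are made up;
# use the CSR template referenced in this guide for the real request).
openssl req -new -newkey rsa:2048 -nodes \
  -keyout private.key -out request.csr \
  -subj "/C=IT/O=ExampleAcquirer/CN=example.acquirer.it"
ls private.key request.csr
# After receiving the signed certificate (e.g. certificate.pem), build the JKS:
# openssl pkcs12 -export -in certificate.pem -inkey private.key \
#   -out keystore.p12 -passout pass:changeit
# keytool -importkeystore -srckeystore keystore.p12 -srcstoretype PKCS12 \
#   -destkeystore keystore.jks -srcstorepass changeit -deststorepass changeit
```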