We got the following error: Packet for query is too large (56997142 > 33554432). You can change this value on the server by setting the 'max_allowed_packet' variable.
Can anyone help us in this?
I believe you are getting this error because of a MySQL limitation. Look in /etc/my.cnf; there is an entry max_allowed_packet=32M (the default). 32 MB equals 33554432 bytes, which matches the limit in the error. You will need to raise the value to about 60M.
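For reference, the change described above would look something like this in /etc/my.cnf (a sketch; the exact section and value are assumptions based on a typical MySQL setup, so adjust to your environment):

```ini
[mysqld]
# Raise the per-packet limit from the 32M default so large
# statements/rows (here up to ~60 MB) are accepted.
max_allowed_packet=64M
```

The variable can also be changed at runtime with SET GLOBAL max_allowed_packet=67108864; (for new connections), but the my.cnf entry is needed for the setting to survive a server restart.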
Thank you, Seenu, for the reply. But as part of performance testing we had only around 112 requests/sec for a service, which I assume amounts to less than 32 MB. Can you please add your comments with other possible reasons?
We will need to know which assertion is throwing this error, along with the full policy and the message you are sending when the error occurs.
What does your policy look like?
What are you doing within the policy?
Principal Consultant, CA API Management Presales
Email: Derek.Orr@ca.com
CA API Management Community: https://communities.ca.com/community/ca-api-management-community
Did you need further assistance on this one?
Thank you, Seenu. The issue has been resolved.
As per the MySQL documentation:
A communication packet is a single SQL statement sent to the MySQL server, a single row that is sent to the client, or a binary log event sent from a master replication server to a slave.
Do you have a JDBC query in your policy that tries to fetch a large amount of data?
The actual alert for this issue is: Could not execute JDBC batch update; SQL [insert into audit_detail_params (audit_detail_goid, position, value) values (?, ?, ?)]; Packet for query is too large (89857591 > 33554432).
There is no JDBC query in our policy. I need to find the root cause of where and why this query is getting fired. Kindly suggest.
The SQL statement is for auditing. There could be an Add Audit Detail assertion in your policy saving a large message, or an Audit Message in Policy assertion saving a (large) request/response, or it may be due to your custom audit sink policy, etc.
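As a sanity check, the packet sizes from the two error messages in this thread can be compared against the 32 MB default limit (a minimal sketch; the byte counts are taken directly from the errors quoted above):

```python
# Default max_allowed_packet in this thread's /etc/my.cnf (32M).
DEFAULT_MAX_ALLOWED_PACKET = 32 * 1024 * 1024  # 33554432 bytes

# Packet sizes reported by the two errors in this thread.
reported_packets = {
    "initial error": 56997142,
    "audit_detail_params insert": 89857591,
}

def to_mb(nbytes: int) -> float:
    """Convert a byte count to mebibytes."""
    return nbytes / (1024 * 1024)

for name, size in reported_packets.items():
    print(f"{name}: {size} bytes = {to_mb(size):.1f} MB, "
          f"exceeds 32 MB limit: {size > DEFAULT_MAX_ALLOWED_PACKET}")
```

Both packets (roughly 54 MB and 86 MB) exceed the 32 MB default, which is consistent with a single oversized audited message or request/response being written in one INSERT, rather than with the overall request rate.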