Labels: status: waiting-for-feedback (we need additional information before we can continue), status: waiting-for-triage (an issue we've not yet triaged)
Description
Bug Report
Versions
- Driver: org.postgresql:r2dbc-postgresql 1.0.7.RELEASE
- Database: PostgreSQL 16.6 (Debian 16.6-1.pgdg120+1) on x86_64-pc-linux-gnu, compiled by gcc (Debian 12.2.0-14) 12.2.0, 64-bit
- Java: 21.0.1-tem
- OS: Linux Ubuntu 22.04 LTS
Current Behavior
When running:
@Autowired
lateinit var dbClient: org.springframework.r2dbc.core.DatabaseClient
...
dbClient.sql(
    """
    DROP INDEX IF EXISTS some_index;
    CREATE INDEX CONCURRENTLY some_index ON some_table(some_field);
    """.trimIndent()
).then().block()
...
the call fails with the exception CREATE INDEX CONCURRENTLY cannot run inside a transaction block, even though there is no open transaction during SQL execution:
DEBUG io.r2dbc.postgresql.client.ReactorNettyClient - Response: ReadyForQuery{transactionStatus=IDLE}
Stack trace
CREATE INDEX CONCURRENTLY cannot run inside a transaction block
io.r2dbc.postgresql.ExceptionFactory$PostgresqlNonTransientResourceException: [25001] CREATE INDEX CONCURRENTLY cannot run inside a transaction block
at app//io.r2dbc.postgresql.ExceptionFactory.createException(ExceptionFactory.java:109)
at app//io.r2dbc.postgresql.ExceptionFactory.createException(ExceptionFactory.java:65)
at app//io.r2dbc.postgresql.ExceptionFactory.handleErrorResponse(ExceptionFactory.java:132)
at app//io.r2dbc.postgresql.PostgresqlResult.lambda$getRowsUpdated$0(PostgresqlResult.java:70)
at app//reactor.core.publisher.FluxHandleFuseable$HandleFuseableSubscriber.onNext(FluxHandleFuseable.java:179)
at app//reactor.core.publisher.FluxWindowPredicate$WindowFlux.drainRegular(FluxWindowPredicate.java:670)
at app//reactor.core.publisher.FluxWindowPredicate$WindowFlux.drain(FluxWindowPredicate.java:748)
at app//reactor.core.publisher.FluxWindowPredicate$WindowFlux.onNext(FluxWindowPredicate.java:790)
at app//reactor.core.publisher.FluxWindowPredicate$WindowPredicateMain.onNext(FluxWindowPredicate.java:241)
at app//reactor.core.publisher.FluxContextWriteRestoringThreadLocals$ContextWriteRestoringThreadLocalsSubscriber.onNext(FluxContextWriteRestoringThreadLocals.java:118)
at app//io.r2dbc.postgresql.util.FluxDiscardOnCancel$FluxDiscardOnCancelSubscriber.onNext(FluxDiscardOnCancel.java:91)
at app//reactor.core.publisher.FluxContextWrite$ContextWriteSubscriber.onNext(FluxContextWrite.java:107)
at app//reactor.core.publisher.FluxContextWriteRestoringThreadLocals$ContextWriteRestoringThreadLocalsSubscriber.onNext(FluxContextWriteRestoringThreadLocals.java:118)
at app//reactor.core.publisher.FluxCreate$BufferAsyncSink.drain(FluxCreate.java:880)
at app//reactor.core.publisher.FluxCreate$BufferAsyncSink.next(FluxCreate.java:805)
at app//reactor.core.publisher.FluxCreate$SerializedFluxSink.next(FluxCreate.java:163)
at app//io.r2dbc.postgresql.client.ReactorNettyClient$Conversation.emit(ReactorNettyClient.java:696)
at app//io.r2dbc.postgresql.client.ReactorNettyClient$BackendMessageSubscriber.emit(ReactorNettyClient.java:948)
at app//io.r2dbc.postgresql.client.ReactorNettyClient$BackendMessageSubscriber.onNext(ReactorNettyClient.java:822)
at app//io.r2dbc.postgresql.client.ReactorNettyClient$BackendMessageSubscriber.onNext(ReactorNettyClient.java:728)
at app//reactor.core.publisher.FluxHandle$HandleSubscriber.onNext(FluxHandle.java:129)
at app//reactor.core.publisher.FluxPeekFuseable$PeekConditionalSubscriber.onNext(FluxPeekFuseable.java:854)
at app//reactor.core.publisher.FluxMap$MapConditionalSubscriber.onNext(FluxMap.java:224)
at app//reactor.core.publisher.FluxMap$MapConditionalSubscriber.onNext(FluxMap.java:224)
at app//reactor.netty.channel.FluxReceive.drainReceiver(FluxReceive.java:292)
at app//reactor.netty.channel.FluxReceive.onInboundNext(FluxReceive.java:401)
at app//reactor.netty.channel.ChannelOperations.onInboundNext(ChannelOperations.java:435)
at app//reactor.netty.channel.ChannelOperationsHandler.channelRead(ChannelOperationsHandler.java:115)
at app//io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:444)
at app//io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:420)
at app//io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:412)
at app//io.netty.handler.codec.ByteToMessageDecoder.fireChannelRead(ByteToMessageDecoder.java:346)
at app//io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:318)
at app//io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:444)
at app//io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:420)
at app//io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:412)
at app//io.netty.handler.logging.LoggingHandler.channelRead(LoggingHandler.java:280)
at app//io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:442)
at app//io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:420)
at app//io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:412)
at app//io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1357)
at app//io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:440)
at app//io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:420)
at app//io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:868)
at app//io.netty.channel.epoll.AbstractEpollStreamChannel$EpollStreamUnsafe.epollInReady(AbstractEpollStreamChannel.java:799)
at app//io.netty.channel.epoll.EpollEventLoop.processReady(EpollEventLoop.java:501)
at app//io.netty.channel.epoll.EpollEventLoop.run(EpollEventLoop.java:399)
at app//io.netty.util.concurrent.SingleThreadEventExecutor$4.run(SingleThreadEventExecutor.java:998)
at app//io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74)
at app//io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
at java.base@21.0.1/java.lang.Thread.run(Thread.java:1583)
Table schema
Input Code
CREATE TABLE IF NOT EXISTS some_table (id INT PRIMARY KEY, some_field VARCHAR);
Steps to reproduce
Input Code
@Autowired
lateinit var dbClient: org.springframework.r2dbc.core.DatabaseClient
...
dbClient.sql(
    """
    DROP INDEX IF EXISTS some_index;
    CREATE INDEX CONCURRENTLY some_index ON some_table(some_field);
    """.trimIndent()
).then().block()
...
Expected behavior/code
No exception when executing multiple SQL statements, including CREATE INDEX CONCURRENTLY, in a single call.
Possible Solution
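A workaround sketch, assuming the error comes from the two statements being submitted as one multi-statement batch, which PostgreSQL executes inside an implicit transaction block: issue each statement as its own sql() call, and make sure the code does not run inside a Spring-managed transaction (e.g. under @Transactional). The index and table names below are the ones from this report.

@Autowired
lateinit var dbClient: org.springframework.r2dbc.core.DatabaseClient

// Workaround sketch: one statement per sql() call, so CREATE INDEX CONCURRENTLY
// is not part of a multi-statement batch.
dbClient.sql("DROP INDEX IF EXISTS some_index")
    .then()
    .then(
        dbClient.sql("CREATE INDEX CONCURRENTLY some_index ON some_table(some_field)")
            .then()
    )
    .block()

Chaining with then(...) keeps the DROP INDEX completed before CREATE INDEX CONCURRENTLY starts, which preserves the ordering of the original batch.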
Additional context