Why doesn't the import_created_at filter work?
I have a Spring Boot application with MongoDB as the database.
I have a Mongo collection items
with 2 documents:
{
"_id": {
"product_id": "11",
"contract_id": {
"$numberLong": "1"
}
},
"contract_id": {
"$numberLong": "1"
},
"update": {
"import_created_at": {
"$numberLong": "1661784425743"
},
"product_id": "11",
"status": "COMPLETED"
},
"_class": "com.documents.ItemDoc"
}
{
"_id": {
"product_id": "22",
"contract_id": {
"$numberLong": "1"
}
},
"contract_id": {
"$numberLong": "1"
},
"update": {
"import_created_at": {
"$numberLong": "1661784425999"
},
"product_id": "22",
"status": "COMPLETED"
},
"_class": "com.documents.ItemDoc"
}
I have the entity class for items:
@Data
@Builder
@NoArgsConstructor
@AllArgsConstructor
@Document(collection = "items")
public class ItemEntity {
@Id
private ItemId id;
@Indexed
@Field(name = "contract_id")
private Long contractId;
@Field(name = "product_id")
private String productId;
@Field(name = "update")
private Update update;
}
and
@Data
@Builder
@NoArgsConstructor
@AllArgsConstructor
public class Update {
@Field(name = "import_created_at")
private Long importCreatedAt;
@Field(name = "product_id")
private String productId;
@Field(name = "status")
private String status;
}
and
@Data
@Builder
@NoArgsConstructor
@AllArgsConstructor
public class ItemId implements Serializable {
@Field(name = "product_id")
private String productId;
@Field(name = "contract_id")
private Long sellerContractId;
}
and
@Builder
@Data
@AllArgsConstructor
@NoArgsConstructor
@Document
public class ItemEntityFacet {
private List<ItemEntity> itemAggregationResult;
}
and
@Data
@Builder
@NoArgsConstructor
@AllArgsConstructor
public class ItemFilters {
private List<String> productIds;
private List<Long> lastUpdates;
}
and
@Getter
@NoArgsConstructor
public class ItemMatchOperation {
private Map<String, Object> conditions;
private AggregationOperation matchOperation;
@Builder
public ItemMatchOperation(Map<String, Object> conditions) {
this.conditions = conditions;
}
private ItemMatchOperation(Map<String, Object> conditions, AggregationOperation operation) {
this.conditions = conditions;
this.matchOperation = operation;
}
public static class ItemMatchOperationBuilder {
private AggregationOperation facetOperation;
public ItemMatchOperation build() {
Validate.notEmpty(conditions, "conditions cannot be null or empty!");
buildOperation();
return new ItemMatchOperation(conditions, facetOperation);
}
private void buildOperation() {
facetOperation = context -> new Document("$match",buildMatchConditions());
}
private Document buildMatchConditions() {
Document document = new Document();
conditions.forEach(document::append);
return document;
}
}
}
I have the repository class:
@Slf4j
@Repository
@RequiredArgsConstructor
public class ItemsRepositoryImpl implements ItemsRepository {
private final ReactiveMongoTemplate mongoTemplate;
public Flux<ItemEntity> findAllByFilters(Long contractId, ItemFilters filters, Pageable pageable) {
Aggregation aggregation = getAggregation(contractId, filters, pageable);
return mongoTemplate.aggregate(aggregation, "items",
ItemEntityFacet.class)
.map(ItemEntityFacet::getItemAggregationResult)
.filter(itemEntities -> !itemEntities.isEmpty())
.flatMap(Flux::fromIterable);
}
private Aggregation getAggregation(Long contractId, ItemFilters filters, Pageable pageable) {
List<AggregationOperation> aggregationOperationList = new ArrayList<>();
if (CollectionUtils.isNotEmpty(filters.getProductIds())) {
aggregationOperationList.add(getProductIdMatchOperation(contractId,
filters.getProductIds()));
}
if (CollectionUtils.isNotEmpty(filters.getLastUpdates())) {
aggregationOperationList.add(getLastUpdatesMatchOperation(filters.getLastUpdates()));
}
aggregationOperationList.add(getContractMatchOperation(contractId));
aggregationOperationList.add(getFacetOperation(pageable));
return Aggregation.newAggregation(aggregationOperationList);
}
private AggregationOperation getProductIdMatchOperation(Long contractId, List<String> productIds) {
return ItemMatchOperation.builder()
.conditions(Map.of("_id", new Document("$in", getDocIdsToMatch(contractId, productIds))))
.build()
.getMatchOperation();
}
private List<Document> getDocIdsToMatch(Long contractId, List<String> productIds) {
return productIds.stream().map(productId -> new Document("product_id", productId)
.append("contract_id", contractId))
.collect(Collectors.toList());
}
private AggregationOperation getLastUpdatesMatchOperation(List<Long> lastUpdates) {
return ItemMatchOperation.builder()
.conditions(Map.of("update", new Document("$in", getDocToMatchLastUpdates(lastUpdates))))
.build()
.getMatchOperation();
}
private List<Document> getDocToMatchLastUpdates(List<Long> lastUpdates) {
return lastUpdates.stream().map(lastUpdate -> new Document("import_created_at", lastUpdate))
.collect(Collectors.toList());
}
private AggregationOperation getContractMatchOperation(Long contractId) {
return ItemMatchOperation.builder()
.conditions(Map.of("contract_id", contractId))
.build()
.getMatchOperation();
}
...
}
When I call the repository method findAllByFilters without filters
(only by contractId=1), it returns 2 documents as expected.
When I add productId as a filter and query with contractId=1 and productId=11, it returns one document as expected.
But when I query with contractId=1 and lastUpdate=1661784425743, it returns nothing, even though it should return the first document. What is wrong here?
Per the comments, the code above generates a query similar to:
{
"aggregate" : "collection",
"pipeline" : [
{ "$match" :
{
"update": { "$in" : [{ "import_created_at" : 1661784425743 }] }
}
}
]
}
This syntax attempts an equality match against the entire embedded document, requiring an exact match of the whole subdocument (including field order). You can see a demonstration of this in this Mongo Playground, where the document does not match for precisely this reason.
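To see why the embedded-document filter fails, here is a rough analogy in plain Java (stdlib only). Note it is only an analogy: `java.util.Map` equality ignores key order, whereas MongoDB's embedded-document match is stricter still and is order-sensitive.

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class ExactMatchDemo {

    // Whole-document equality: the filter must equal the entire stored subdocument.
    static boolean wholeDocumentMatch(Map<String, Object> stored, Map<String, Object> filter) {
        return stored.equals(filter);
    }

    // Dot-notation semantics: compare only the one nested field.
    static boolean dottedFieldMatch(Map<String, Object> stored, Object value) {
        return value.equals(stored.get("import_created_at"));
    }

    public static void main(String[] args) {
        // Stored "update" subdocument from the first item
        Map<String, Object> stored = new LinkedHashMap<>();
        stored.put("import_created_at", 1661784425743L);
        stored.put("product_id", "11");
        stored.put("status", "COMPLETED");

        // The filter document the aggregation builds: only one field
        Map<String, Object> filter = new LinkedHashMap<>();
        filter.put("import_created_at", 1661784425743L);

        System.out.println(wholeDocumentMatch(stored, filter)); // false: stored has extra fields
        System.out.println(dottedFieldMatch(stored, 1661784425743L)); // true: only the nested field is compared
    }
}
```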
Instead, you will want to use dot notation to query the nested field. The syntax for the pipeline
would be:
[
{
$match: {
"update.import_created_at": {
$in: [
1661784425743
]
}
}
}
]
The associated Mongo Playground example shows both documents being returned as expected.
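Applied to the repository code above, the helper that builds the lastUpdates condition can target the dotted path instead of wrapping each value in an embedded document. A minimal sketch, using a hypothetical `LinkedHashMap`-based stand-in for `org.bson.Document` so the example compiles standalone (in the real repository, keep using `org.bson.Document`):

```java
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

public class DotNotationFix {

    // Stand-in for org.bson.Document, just so this sketch is self-contained.
    static class Document extends LinkedHashMap<String, Object> {
        Document(String key, Object value) { put(key, value); }
    }

    // Corrected condition: match on the dotted path "update.import_created_at".
    // The raw Long values go straight into $in; the per-value
    // { "import_created_at": ... } wrapper documents are no longer needed.
    static Map<String, Object> lastUpdatesCondition(List<Long> lastUpdates) {
        return Map.of("update.import_created_at", new Document("$in", lastUpdates));
    }

    public static void main(String[] args) {
        System.out.println(lastUpdatesCondition(List.of(1661784425743L)));
    }
}
```

This condition map can then be passed to the ItemMatchOperation builder in place of the `Map.of("update", ...)` one.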