I have a class Autor and a class Dokument in a bidirectional 1-N (List) relationship (one Autor may have written several Dokumente), and I want to persist them to Cloud SQL. The ORM is specified in package.jdo and package-cloudsql.orm.
The Autor class:
public class Autor {
    private String aid;
    private String vorname;
    private String nachname;
    private List<Dokument> dokumente;

    public Autor() {
    }

    public Autor(String vorname, String nachname) {
        this.vorname = vorname;
        this.nachname = nachname;
        this.dokumente = new ArrayList<Dokument>();
    }

    public String getAID() {
        return aid;
    }

    public String getVorname() {
        return vorname;
    }

    public String getNachname() {
        return nachname;
    }

    public List<Dokument> getDokumente() {
        return this.dokumente;
    }

    public void setAID(String aid) {
        this.aid = aid;
    }

    public void setVorname(String vorname) {
        this.vorname = vorname;
    }

    public void setNachname(String nachname) {
        this.nachname = nachname;
    }

    public void addDokument(Dokument dokument) {
        this.dokumente.add(dokument);
    }

    public void deleteDokument(Dokument dokument) {
        this.dokumente.remove(dokument);
    }

    public void deleteAllDokumente() {
        this.dokumente.clear();
    }
}
The Dokument class:
public class Dokument {
    private String did;
    private Autor autor;
    private String titel;
    private String text;
    private Date datum;

    public Dokument() {
    }

    public Dokument(Autor autor, String titel, String text, Date datum) {
        this.autor = autor;
        this.titel = titel;
        this.text = text;
        this.datum = datum;
    }

    public Dokument(String titel, String text, Date datum) {
        this.autor = null;
        this.titel = titel;
        this.text = text;
        this.datum = datum;
    }

    public String getDid() {
        return did;
    }

    public Autor getAutor() {
        return autor;
    }

    public String getTitel() {
        return titel;
    }

    public String getText() {
        return text;
    }

    public Date getDatum() {
        return datum;
    }

    public void setDID(String did) {
        this.did = did;
    }

    public void setAutor(Autor autor) {
        this.autor = autor;
    }

    public void setTitel(String titel) {
        this.titel = titel;
    }

    public void setText(String text) {
        this.text = text;
    }

    public void setDatum(Date date) {
        this.datum = date;
    }
}
package.jdo contains the following:
<class name="Autor" detachable="true" identity-type="application">
<field name="aid" primary-key="true" persistence-modifier="persistent" value-strategy="identity"/>
<field name="vorname"/>
<field name="nachname" />
<field name="dokumente"/>
</class>
<class name="Dokument" detachable="true" identity-type="application">
<field name="did" primary-key="true" persistence-modifier="persistent" value-strategy="identity"/>
<field name="autor"/>
<field name="titel"/>
<field name="text"/>
<field name="datum"/>
</class>
package-cloudsql.orm contains the following:
<class name="Autor" detachable="true" persistence-modifier="persistence-capable" table="Autor">
<field name="aid" primary-key="true" persistence-modifier="persistent" value-strategy="identity">
<column name="aid" jdbc-type="bigint" length="20"/>
</field>
<field name="vorname" persistence-modifier="persistent">
<column name="vorname"/>
</field>
<field name="nachname" persistence-modifier="persistent">
<column name="nachname"/>
</field>
<field name="dokumente" persistence-modifier="persistent" mapped-by="autor">
<collection element-type="de.hdm.studienarbeit3.dokumente.Dokument"/>
</field>
</class>
<class name="Dokument" detachable="true" persistence-modifier="persistence-capable" table="Dokument">
<field name="did" primary-key="true" persistence-modifier="persistent" value-strategy="identity">
<column name="did" jdbc-type="bigint" length="20" />
</field>
<field name="autor" persistence-modifier="persistent" default-fetch-group="true">
<column name="autor" jdbc-type="bigint" length="20"/>
<foreign-key name="DOKUMENTAUTOR_FK" delete-action="restrict"/>
</field>
<field name="titel" persistence-modifier="persistent">
<column name="titel"/>
</field>
<field name="text" persistence-modifier="persistent">
<column name="text"/>
</field>
<field name="datum" persistence-modifier="persistent">
<column name="datum"/>
</field>
</class>
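For context, the post never shows how pmf is obtained. Below is a minimal sketch, assuming a programmatic setup (the real project may well use a jdoconfig.xml instead), of a PersistenceManagerFactory configured so that DataNucleus picks the "cloudsql" mapping, i.e. the package-cloudsql.orm file; the driver, connection URL, user and password values are placeholders, not taken from the post:

import java.util.Properties;

import javax.jdo.JDOHelper;
import javax.jdo.PersistenceManagerFactory;

public class PMF {
    // Hypothetical helper; the actual project may configure this differently.
    public static PersistenceManagerFactory create() {
        Properties props = new Properties();
        props.setProperty("javax.jdo.PersistenceManagerFactoryClass",
                "org.datanucleus.api.jdo.JDOPersistenceManagerFactory");
        // Placeholder connection data; on App Engine production the Cloud SQL
        // driver and URL (com.google.appengine.api.rdbms.AppEngineDriver,
        // jdbc:google:rdbms://...) would be used instead of plain MySQL.
        props.setProperty("javax.jdo.option.ConnectionDriverName", "com.mysql.jdbc.Driver");
        props.setProperty("javax.jdo.option.ConnectionURL", "jdbc:mysql://127.0.0.1:3306/studienarbeit");
        props.setProperty("javax.jdo.option.ConnectionUserName", "user");
        props.setProperty("javax.jdo.option.ConnectionPassword", "password");
        // "cloudsql" selects package-cloudsql.orm as the ORM mapping file.
        props.setProperty("javax.jdo.option.Mapping", "cloudsql");
        return JDOHelper.getPersistenceManagerFactory(props);
    }
}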
So I want to add a second (third, and so on) Dokument to an existing Autor. But every time I persist a Dokument, it also creates a new Autor with the attributes of the Autor I selected. For example, I have an Autor with id 1 named "Hans Maier" and want to create another Dokument with him as the Autor, but the new Dokument ends up with an Autor with id 2 and an unwanted duplicate "Hans Maier". I want the new Dokument to be linked to Autor 1 instead.
My servlet contains the following code:
if (autoraid=="") {
if (vorname == "" | nachname =="") {
resp.getWriter().println("Bitte Autor auswählen oder Namen vollständig ausfüllen.");
} else {
Autor autor = new Autor(vorname, nachname);
Dokument dokument = new Dokument(autor, titel, text, datum);
autor.addDokument(dokument);
DokumentDAO dokumentDao = new DokumentDAO(pmf);
dokumentDao.addDokument(autor, dokument);
}
} else {
AutorDAO autorDao = new AutorDAO(pmf);
Autor autor = autorDao.getAutor(autoraid);
Dokument dokument = new Dokument(autor, titel, text, datum);
autor.addDokument(dokument);
DokumentDAO dokumentDao = new DokumentDAO(pmf);
dokumentDao.addDokument(autor, dokument);
}
autorDao.getAutor returns the Autor it looks up by key (this works), and dokumentDao.addDokument() persists the given Dokument with pm.makePersistent(dokument).
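AutorDAO itself is not shown in the post; here is a minimal sketch of what getAutor might look like, assuming it looks the Autor up by its aid key and hands back a detached copy (both assumptions, since the actual implementation is not given):

import javax.jdo.PersistenceManager;
import javax.jdo.PersistenceManagerFactory;

public class AutorDAO {
    private final PersistenceManagerFactory pmf;

    public AutorDAO(PersistenceManagerFactory pmf) {
        this.pmf = pmf;
    }

    // Hypothetical sketch; the real implementation may differ.
    public Autor getAutor(String aid) {
        PersistenceManager pm = pmf.getPersistenceManager();
        try {
            // Look the Autor up by its application-identity key field (aid).
            Autor autor = pm.getObjectById(Autor.class, aid);
            // Return a detached copy so it can still be used after pm is closed
            // (the class is declared detachable="true" in package.jdo).
            return pm.detachCopy(autor);
        } finally {
            pm.close();
        }
    }
}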
What am I missing or doing wrong? The DataNucleus documentation (http://www.datanucleus.org/products/accessplatform/jdo/orm/relationships.html) says that when using a List in a bidirectional relation I have to set both sides, which I am doing by setting the Autor in the Dokument constructor and via Autor.addDokument(), but it doesn't work.
If I read dokument.getAutor().getAID() before persisting, it returns the correct ID of the Autor I want it linked to, but in the database it ends up as a new Autor. How can I get this to work?
I have also read this: http://www.onjava.com/pub/a/onjava/excerpt/chap_07/index2.html - its createBook method does exactly what I want, so I additionally tested doing it like this
public void addDokument(Autor autor, Dokument dokument) {
    PersistenceManager pm = pmf.getPersistenceManager();
    Transaction tx = pm.currentTransaction();
    try {
        tx.begin();
        dokument.setAutor(autor);
        autor.addDokument(dokument);
        pm.makePersistent(dokument);
        tx.commit();
    } catch (Exception e) {
        System.out.println("Exception: " + e.getMessage());
    } finally {
        if (tx.isActive()) {
            tx.rollback();
        }
        pm.close();
    }
}
inside a transaction, but with the same result: a new Autor is created.
Any help would be greatly appreciated.
The log says:
Feb 18, 2014 1:23:33 PM org.datanucleus.store.rdbms.mapping.MappedTypeManager addMappedType
Schwerwiegend: User-defined type mapping class "org.datanucleus.store.mapped.mapping.LocalDateMapping" was not found. Please check the mapping file class specifications and your CLASSPATH. The class must be in the CLASSPATH.
Feb 18, 2014 1:23:33 PM org.datanucleus.store.rdbms.mapping.MappedTypeManager addMappedType
Schwerwiegend: User-defined type mapping class "org.datanucleus.store.mapped.mapping.LocalDateTimeMapping" was not found. Please check the mapping file class specifications and your CLASSPATH. The class must be in the CLASSPATH.
Feb 18, 2014 1:23:33 PM org.datanucleus.store.rdbms.mapping.MappedTypeManager addMappedType
Schwerwiegend: User-defined type mapping class "org.datanucleus.store.mapped.mapping.LocalTimeMapping" was not found. Please check the mapping file class specifications and your CLASSPATH. The class must be in the CLASSPATH.
Feb 18, 2014 1:23:39 PM org.datanucleus.store.rdbms.query.ForwardQueryResult closingConnection
Information: Reading in results for query "SELECT FROM de.hdm.studienarbeit3.dokumente.Autor ORDER BY nachname asc" since the connection used is closing
Feb 18, 2014 1:23:52 PM org.datanucleus.store.rdbms.mapping.MappedTypeManager addMappedType
Schwerwiegend: User-defined type mapping class "org.datanucleus.store.mapped.mapping.LocalDateMapping" was not found. Please check the mapping file class specifications and your CLASSPATH. The class must be in the CLASSPATH.
Feb 18, 2014 1:23:52 PM org.datanucleus.store.rdbms.mapping.MappedTypeManager addMappedType
Schwerwiegend: User-defined type mapping class "org.datanucleus.store.mapped.mapping.LocalDateTimeMapping" was not found. Please check the mapping file class specifications and your CLASSPATH. The class must be in the CLASSPATH.
Feb 18, 2014 1:23:52 PM org.datanucleus.store.rdbms.mapping.MappedTypeManager addMappedType
Schwerwiegend: User-defined type mapping class "org.datanucleus.store.mapped.mapping.LocalTimeMapping" was not found. Please check the mapping file class specifications and your CLASSPATH. The class must be in the CLASSPATH.
org.datanucleus.store.mapped.mapping.* used to be in datanucleus-core (at least in version 3.1.3), but now in 3.2.9 it is no longer there? Where did it go? datanucleus-rdbms is also at version 3.2.9.
System.getProperty("java.class.path")的输出
[workspace]\studienarbeit3\war\WEB-INF\classes
[workspace]\studienarbeit3\war\WEB-INF\lib\mysql-connector-java-5.1.28-bin.jar
[eclipse]\plugins\com.google.appengine.eclipse.sdkbundle_1.8.9\appengine-java-sdk-1.8.9\lib\shared\appengine-local-runtime-shared.jar
[eclipse]\plugins\com.google.appengine.eclipse.sdkbundle_1.8.9\appengine-java-sdk-1.8.9\lib\shared\el-api.jar
[eclipse]\plugins\com.google.appengine.eclipse.sdkbundle_1.8.9\appengine-java-sdk-1.8.9\lib\shared\jsp\repackaged-appengine-ant-1.7.1.jar
[eclipse]\plugins\com.google.appengine.eclipse.sdkbundle_1.8.9\appengine-java-sdk-1.8.9\lib\shared\jsp\repackaged-appengine-ant-launcher-1.7.1.jar
[eclipse]\plugins\com.google.appengine.eclipse.sdkbundle_1.8.9\appengine-java-sdk-1.8.9\lib\shared\jsp\repackaged-appengine-jasper-6.0.29.jar
[eclipse]\plugins\com.google.appengine.eclipse.sdkbundle_1.8.9\appengine-java-sdk-1.8.9\lib\shared\jsp\repackaged-appengine-jasper-el-6.0.29.jar
[eclipse]\plugins\com.google.appengine.eclipse.sdkbundle_1.8.9\appengine-java-sdk-1.8.9\lib\shared\jsp\repackaged-appengine-tomcat-juli-6.0.29.jar
[eclipse]\plugins\com.google.appengine.eclipse.sdkbundle_1.8.9\appengine-java-sdk-1.8.9\lib\shared\jsp-api.jar
[eclipse]\plugins\com.google.appengine.eclipse.sdkbundle_1.8.9\appengine-java-sdk-1.8.9\lib\shared\servlet-api.jar
[eclipse]\plugins\com.google.appengine.eclipse.sdkbundle_1.8.9\appengine-java-sdk-1.8.9\lib\appengine-tools-api.jar
[eclipse]\plugins\com.google.appengine.eclipse.sdkbundle_1.8.9\appengine-java-sdk-1.8.9\lib\opt\user\appengine-api-labs\v1\appengine-api-labs.jar
[eclipse]\plugins\com.google.appengine.eclipse.sdkbundle_1.8.9\appengine-java-sdk-1.8.9\lib\opt\user\appengine-endpoints\v1\appengine-endpoints-deps.jar
[eclipse]\plugins\com.google.appengine.eclipse.sdkbundle_1.8.9\appengine-java-sdk-1.8.9\lib\opt\user\appengine-endpoints\v1\appengine-endpoints.jar
[eclipse]\plugins\com.google.appengine.eclipse.sdkbundle_1.8.9\appengine-java-sdk-1.8.9\lib\opt\user\jsr107\v1\appengine-jsr107cache-1.8.9.jar
[eclipse]\plugins\com.google.appengine.eclipse.sdkbundle_1.8.9\appengine-java-sdk-1.8.9\lib\opt\user\jsr107\v1\jsr107cache-1.1.jar
[eclipse]\plugins\com.google.appengine.eclipse.sdkbundle_1.8.9\appengine-java-sdk-1.8.9\lib\user\appengine-api-1.0-sdk-1.8.9.jar
[eclipse]\plugins\com.google.appengine.eclipse.sdkbundle_1.8.9\appengine-java-sdk-1.8.9\lib\opt\user\datanucleus\v3\asm-4.0.jar
[eclipse]\plugins\com.google.appengine.eclipse.sdkbundle_1.8.9\appengine-java-sdk-1.8.9\lib\opt\user\datanucleus\v3\datanucleus-api-jdo-3.2.8.jar
[eclipse]\plugins\com.google.appengine.eclipse.sdkbundle_1.8.9\appengine-java-sdk-1.8.9\lib\opt\user\datanucleus\v3\datanucleus-appengine-3.0.0.jar
[eclipse]\plugins\com.google.appengine.eclipse.sdkbundle_1.8.9\appengine-java-sdk-1.8.9\lib\opt\user\datanucleus\v3\datanucleus-core-3.2.9.jar
[eclipse]\plugins\com.google.appengine.eclipse.sdkbundle_1.8.9\appengine-java-sdk-1.8.9\lib\opt\user\datanucleus\v3\datanucleus-rdbms-3.2.9.jar
[eclipse]\plugins\com.google.appengine.eclipse.sdkbundle_1.8.9\appengine-java-sdk-1.8.9\lib\opt\user\datanucleus\v3\jdo-api-3.1-rc1.jar
[eclipse]\plugins\com.google.appengine.eclipse.sdkbundle_1.8.9\appengine-java-sdk-1.8.9\lib\opt\user\datanucleus\v3\jta-1.1.jar
[workspace]\studienarbeit3\war\WEB-INF\lib\javax.servlet.jsp.jstl-1.2.1.jar
[workspace]\studienarbeit3\war\WEB-INF\lib\javax.servlet.jsp.jstl-api-1.2.1.jar
[eclipse]\plugins\com.google.appengine.eclipse.sdkbundle_1.8.9\appengine-java-sdk-1.8.9\lib\impl\google_sql.jar
[eclipse]\plugins\com.google.appengine.eclipse.sdkbundle_1.8.9\appengine-java-sdk-1.8.9\lib\agent\appengine-agent.jar
As Google's compatibility documentation says, their GAE/Datastore plugin (datanucleus-appengine) is only compatible with DataNucleus 3.1. There is an SVN build of their plugin (for use with DataNucleus 3.2/3.3) in the DataNucleus Maven repository (ask Google why they haven't bothered to release it; it has been sitting in their SVN for a year or more). Your other thread says that you need DataNucleus 3.2+ to work with Google Cloud SQL. In any case, it should be obvious why you cannot mix and match DataNucleus jar versions; decide which version you are using and get the matching jars.
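One quick way to confirm such a mixed classpath is to check which jar (if any) actually supplies a given class at runtime. A minimal sketch, using only class names already reported in the log above:

// Prints the jar (or directory) a class would be loaded from, or "not on classpath".
// Useful for verifying whether org.datanucleus.store.mapped.* still exists in the
// DataNucleus jars currently on the classpath.
public class WhichJar {
    public static void main(String[] args) {
        String[] names = {
            "org.datanucleus.store.mapped.mapping.LocalDateMapping",  // reported missing in the log
            "org.datanucleus.store.rdbms.mapping.MappedTypeManager"   // the class doing the reporting
        };
        for (String name : names) {
            java.net.URL url = WhichJar.class.getClassLoader()
                    .getResource(name.replace('.', '/') + ".class");
            System.out.println(name + " -> " + (url != null ? url : "not on classpath"));
        }
    }
}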