I am reading the excellent "Wrox Professional Oracle WebLogic Server"; it recommends always assigning all services (Proxy Services in OSB) a Work Manager with a Max Threads Constraint equal to the maximum connections in the pool of any data source the service uses. So far so good.
What I didn't know is that in older (pre-12) versions of WebLogic, if you give the Max Threads Constraint the same name as the data source, the maximum number of threads adjusts automatically to match the Maximum Capacity of the data source. This is cool.
In WebLogic 12, you can do this explicitly in the console:
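The rationale behind this sizing rule can be sketched with a hedged Python simulation (all names and numbers are invented for illustration, this is not WebLogic code): if more worker threads are admitted than the pool has connections, the surplus threads just block waiting for a connection; capping the threads at the pool's Maximum Capacity removes that contention.

```python
import threading

POOL_SIZE = 3  # stands in for the data source's Maximum Capacity

def run(max_threads=None, n_requests=10):
    """Return how many workers found the simulated connection pool exhausted."""
    pool = threading.Semaphore(POOL_SIZE)            # the JDBC connection pool
    cap = threading.Semaphore(max_threads) if max_threads else None
    start = threading.Barrier(n_requests)            # force requests to collide
    all_tried = threading.Event()
    lock = threading.Lock()
    blocked, tried = [], [0]

    def worker():
        if cap:                                      # Work Manager admits the thread
            with cap:
                if not pool.acquire(blocking=False): # never happens: cap == pool size
                    with lock:
                        blocked.append(1)
                    pool.acquire()
                pool.release()
        else:                                        # no Max Threads Constraint
            start.wait()
            got = pool.acquire(blocking=False)
            with lock:
                if not got:
                    blocked.append(1)                # this thread would block on the pool
                tried[0] += 1
                if tried[0] == n_requests:
                    all_tried.set()
            all_tried.wait()                         # hold the connection until all tried
            if got:
                pool.release()

    threads = [threading.Thread(target=worker) for _ in range(n_requests)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return len(blocked)

print(run())                        # 7: ten threads fight over three connections
print(run(max_threads=POOL_SIZE))   # 0: admitted threads always find a connection
```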
Saturday, March 30, 2013
Thursday, March 28, 2013
JSP compiler, difference between Tomcat and WebLogic
In Tomcat I had some rather nasty EL code to get a BLOB value from a table:

<c:set var="sqlStatement2" value="select DATA_VALUE from ${datatable} where MSG_GUID = '${theMSGGUIDvalue}' "/>
<sql:query var="msgguidDataResult" dataSource="${domainds}">
  ${sqlStatement2}
</sql:query>

In Tomcat this worked; in WebLogic it gave:

Syntax error in expression. Encountered "(". Expected one of : "}", ".", "[", ">", "gt", "<", "lt", ">=", "ge", "<=", "le", "==", "eq", "!=", "ne", "&&", "and", "||", "or", "*", "+", "-", "?", "/", "div", "%", "mod",

It turns out that in WebLogic you must change the syntax to:

<c:set var="DATA_VALUE" value="${msgguidDataResult.rowsByIndex[0][0]}" scope="request"/>

It's just one of those things...
Monday, March 25, 2013
weblogic keep generated java from jsp
In your weblogic.xml, enter this:

<weblogic-web-app xmlns="http://www.bea.com/ns/weblogic/90" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
  <jsp-descriptor>
    <keepgenerated>true</keepgenerated>
    <working-dir>/opt/oracle/apps/wlobelix/tmp</working-dir>
    <backward-compatible>true</backward-compatible>
    <debug>true</debug>
  </jsp-descriptor>
</weblogic-web-app>

Make sure that /opt/oracle/apps/wlobelix/tmp exists and is writable by the Unix user running WLS.
Redeploy and restart; if needed, delete the WebLogic server tmp directory before restarting. In the Configuration tab of the web application, make sure the checkboxes are active:
BIG WARNING: remember to delete the content of the /opt/oracle/apps/wlobelix/tmp folder at each new deployment, otherwise you might not get the latest JSP version.
Sunday, March 24, 2013
Book review: The Unmentionable World of Human Waste, and Silent Spring
Extraordinarily well written and witty, this book reveals how bad sanitation (i.e. lack of toilets) is the leading cause of child death in the world, being the root cause of diarrhea, which kills far more than AIDS, cholera and malaria together.
The author has traveled the world investigating the cultural and political aspects of sanitation, and addresses this rarely mentioned topic with great technical knowledge.
http://en.wikipedia.org/wiki/Silent_Spring
Silent Spring is an old book but still very relevant, on the topic of the poisoning of all creatures by human waste, mainly pesticides, herbicides (i.e. biocides) and fertilizers. A must-read for all ecologists.
Labels:
books
Saturday, March 23, 2013
WLST: redeploy application
connect('Pierluigi', 'weblogic1', 't3://acme.com:7001')
redeploy('wlobelix')
exit()
A lot better than clicking around in a console...
Labels:
WLST
Eclipse: The project cannot be built until build path errors are resolved
but of course there is no build path error, and refreshing, closing and reopening the project, or swearing at Eclipse and its mother will not help.
What does help is restarting Eclipse. Just like Windows, it takes a few restarts a day to keep it working. Two applications written by uncoordinated morons.
Labels:
eclipse
Java compiler level does not match the version of the installed Java project facet.
I have created a webapp artifact with maven:
mvn archetype:generate -DgroupId=com.acme.osb -DartifactId=wlobelix -DarchetypeArtifactId=maven-archetype-webapp -DinteractiveMode=false
mvn eclipse:eclipse -Dwtpversion=2.0
When I open the project with Eclipse, I get:
Java compiler level does not match the version of the installed Java project facet.
Faceted Project Problem (Java Version Mismatch)
Don't panic: it's normal with Maven that things take ten times longer than they should and require a lot of manual hacks and desperate googling.
Just right-click on the project in Eclipse, open Project Facets, and set Java to 1.6 (I assume your Workspace default is 1.6). For some strange reason Maven defaults to 1.4. My grandmother - deceased in 2009 at the age of 97 - used to code in 1.6, so I guess that Maven authors are older than her - maybe they are Trilobites, who ruled the Earth some 500 million years ago. Welcome to the 21st century, pals.
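A more permanent fix (a sketch; the plugin version shown is an assumption typical for that era) is to pin the compiler level in the pom.xml, so that a later `mvn eclipse:eclipse` regenerates the project with the right facet instead of the 1.4 default:

```xml
<build>
  <plugins>
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-compiler-plugin</artifactId>
      <version>2.3.2</version>
      <configuration>
        <source>1.6</source>
        <target>1.6</target>
      </configuration>
    </plugin>
  </plugins>
</build>
```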
Labels:
maven,
mavensucks
Thursday, March 21, 2013
java package 'weblogic.time' has no attribute 'localtime'
In a WLST module I am trying to print the current time with:
time.strftime("%Y%m%d_%H%M%S", time.localtime())
When I run it within a WLST session, it works. When I run it from a script, it fails with the error
java package 'weblogic.time' has no attribute 'localtime'
I guess that some WLST operations import weblogic.time under the covers.
The only workaround I found is:
from time import localtime, strftime
strftime("%Y%m%d_%H%M%S", localtime())
This is REALLY silly, but this is real life.
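A slightly more defensive variant of the workaround (a sketch; runs in plain Python and should behave the same in Jython/WLST): bind the names inside a function, so even a later shadowing of the top-level `time` name cannot break the script.

```python
def timestamp():
    # Import inside the function: even if some WLST call shadows the global
    # `time` name with the weblogic.time Java package, this local binding wins.
    from time import localtime, strftime
    return strftime("%Y%m%d_%H%M%S", localtime())

print(timestamp())  # e.g. 20130328_114233
```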
Labels:
WLST
Monday, March 18, 2013
Linux and swap
Run top, then press O (uppercase letter O), then p and Enter.
This will show you all processes sorted by SWAP used:

PID   USER PR NI VIRT  RES  SHR  S %CPU %MEM TIME+    SWAP COMMAND
16103 soa3 23  0 3109m 1.3g 34m  S  0.0  8.3 12:02.82 1.7g java
4091  soa3 19  0 3115m 1.5g 34m  S  0.0  9.7 11:35.79 1.5g java
12193 soa  20  0 1122m 293m 8764 S  0.0  1.8  0:23.88 828m java
25674 soa3 21  0 2152m 1.3g 50m  S  0.0  8.5  7:38.07 786m java
17344 soa2 21  0 2355m 1.7g 51m  S  0.0 10.6 16:16.17 652m java
3435  soa2 21  0 3326m 2.6g 35m  S  0.0 16.7 43:47.83 639m java

top computes SWAP as VIRT - RES.
(see here for more details)
To examine how swap is configured on your machine (it can be a file or a partition):
/sbin/swapon -s

Filename                  Type      Size    Used Priority
/dev/mapper/rootvg-swaplv partition 8388600 188  -1

You can also cat /proc/swaps to view the same info.
Do "cat /etc/fstab"; you should see a line to mount the partition on boot:
/dev/rootvg/swaplv swap swap defaults 0 0
Check your swappiness this way:
cat /proc/sys/vm/swappiness
60
Use also vmstat 5 to trace si and so (swap-in and swap-out).
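The VIRT - RES arithmetic can be checked against the listing above with a small Python helper (a sketch mimicking top's m/g suffixes, not top's actual code):

```python
UNITS = {"k": 1, "m": 1024, "g": 1024 * 1024}  # top memory fields are in KiB

def to_kib(field):
    """Convert a top memory field like '3109m' or '1.3g' to KiB."""
    suffix = field[-1].lower()
    if suffix in UNITS:
        return float(field[:-1]) * UNITS[suffix]
    return float(field)  # bare number: already KiB

def swap_estimate(virt, res):
    """SWAP as top computes it: VIRT - RES."""
    return to_kib(virt) - to_kib(res)

# First row of the listing: VIRT 3109m, RES 1.3g -> SWAP ~ 1.7g
gib = swap_estimate("3109m", "1.3g") / (1024 * 1024)
print(round(gib, 1))  # 1.7
```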
Labels:
swap
java.lang.Runtime.getRuntime().exec leaking pipes
I do this in a WebApp in Tomcat:

Process proc = java.lang.Runtime.getRuntime().exec("generateSQLDeveloperConnections.sh");
proc.waitFor();

but apparently, according to this post: http://mark.koli.ch/2011/01/leaky-pipes-remember-to-close-your-streams-when-using-javas-runtimegetruntimeexec.html, this can leak resources.
To investigate, I run the PRICELESS lsof on the Tomcat pid 12193:
/usr/sbin/lsof -a -p 12193
and I notice, after each run of getRuntime().exec, 3 extra pipes:

java 12193 soa 76w FIFO 0,6 0t0 14925870 pipe
java 12193 soa 77r FIFO 0,6 0t0 14925871 pipe
java 12193 soa 79r FIFO 0,6 0t0 14925872 pipe

which are never closed.
I will definitely add the code

import static org.apache.commons.io.IOUtils.closeQuietly;

Process p = null;
try {
    p = Runtime.getRuntime().exec(...);
    // Do something with p.
} finally {
    if (p != null) {
        closeQuietly(p.getOutputStream());
        closeQuietly(p.getInputStream());
        closeQuietly(p.getErrorStream());
    }
}

and see if this takes care of the issue.
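For what it's worth, the same pitfall exists in Python (a sketch, unrelated to the webapp above): subprocess.Popen also opens three pipes per child, and using it as a context manager is what guarantees they get closed.

```python
import subprocess
import sys

# Popen as a context manager closes stdin/stdout/stderr when the block exits:
# the moral equivalent of the closeQuietly() calls in the Java finally block.
with subprocess.Popen(
    [sys.executable, "-c", "print('done')"],
    stdin=subprocess.PIPE, stdout=subprocess.PIPE, stderr=subprocess.PIPE,
) as proc:
    out, err = proc.communicate()

print(out.decode().strip())  # done
print(proc.stdout.closed)    # True: no leaked FIFOs for lsof to find
```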
Labels:
lsof
Saturday, March 16, 2013
Google Reader is dead, long live Google reader
I was taken aback reading that Google Reader will be shut down in July... it was a very good friend, and I really feel betrayed by Google.
So now I go hunting for the next Google Reader, possibly with the same minimalistic interface.
I am trying to use "The Old Reader" but it seems overloaded at the moment, as everybody is jumping out of the Google Reader window. It will take some weeks for the dust to settle.
Here are some posts on the topic:
http://www.mercurynews.com/business/ci_22798499/google-reader-alternatives-good-meh-and-great
http://blog.superfeedr.com/state-of-readers/
http://download.html.it/articoli/google-reader-chiude-ecco-le-alternative (in Italian)
Tuesday, March 12, 2013
OSB: chunked transfer encoding
Sending a request from SoapUI in an automated (testrunner) way, we can see:
Accept-Encoding : gzip,deflate
the response comes back with
Response Headers: Transfer-Encoding : chunked
For explanations, see http://en.wikipedia.org/wiki/HTTP_compression#Client.2FServer_compression_scheme_negotiation
and http://en.wikipedia.org/wiki/Chunked_transfer_encoding
"HTTP servers sometimes use compression (gzip) or deflate methods to optimize transmission. How both chunked and gzip encoding interact is dictated by the two-staged encoding of HTTP: first the content stream is encoded as (Content-Encoding: gzip), after which the resulting byte stream is encoded for transfer using another encoder (Transfer-Encoding: chunked). This means that in case both compression and chunked encoding are enabled, the chunk encoding itself is not compressed, and the data in each chunk should not be compressed individually. The remote endpoint can decode the incoming stream by first decoding it with the Transfer-Encoding, followed by the specified Content-Encoding."
How can this "chunked" happen?
Is there any HTTP Business Service along the path?
If yes, we should check if it has got Chunked Streaming Mode disabled:
http://docs.oracle.com/cd/E17904_01/doc.1111/e15866/transports.htm
"Use Chunked Streaming Mode: Select this option if you want to use HTTP chunked transfer encoding to send messages.
Note: Do not use chunked streaming mode if you use the Follow HTTP Redirects option. Redirection and authentication cannot be handled automatically in chunked mode."
Also check that Chunking Threshold is disabled in SOAPUI :
http://www.soapui.org/Working-with-soapUI/preferences.html
More on the topic:
http://blog.ipnweb.com/2012/09/use-chunked-streaming-mode-in-osb-11g.html
Understanding Chunked Streaming
Oracle documentation states that the Chunked Streaming Mode property should be selected "if you want to use HTTP chunked transfer encoding to send messages." You normally want to enable chunked streaming if possible (with my problem above, it is not possible).
Chunked transfer encoding is an HTTP 1.1 specification, and allows clients to parse dynamic data immediately after the first chunk is read. Note that the Oracle documentation also states not to enable chunked streaming if you use the Follow HTTP Redirects option, as redirection and authentication cannot be handled automatically in chunked mode.
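The two-stage ordering quoted above can be demonstrated with a tiny Python sketch (a hand-rolled decoder for illustration, not a full HTTP parser): the body is gzipped first (Content-Encoding), then chunked (Transfer-Encoding), and the receiver must undo the chunking before the gunzip.

```python
import gzip
import io

def dechunk(raw):
    """Decode an HTTP/1.1 chunked body: hex size line, chunk bytes, CRLF,
    repeated until a zero-size chunk terminates the stream."""
    buf, out = io.BytesIO(raw), b""
    while True:
        size = int(buf.readline().split(b";")[0], 16)  # ignore chunk extensions
        if size == 0:
            break
        out += buf.read(size)
        buf.read(2)  # consume the CRLF trailing each chunk
    return out

# Sender side: gzip the content first, then split it into two chunks.
payload = gzip.compress(b"hello chunked world")
body = b"%x\r\n" % len(payload[:10]) + payload[:10] + b"\r\n"
body += b"%x\r\n" % len(payload[10:]) + payload[10:] + b"\r\n0\r\n\r\n"

# Receiver side reverses the order: de-chunk first, then gunzip.
print(gzip.decompress(dechunk(body)).decode())  # hello chunked world
```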
Monday, March 11, 2013
htop, slightly better than top
I found out by chance that it's available on our RHEL box (kernel 2.6):
http://htop.sourceforge.net/
It's more user-friendly than top; I think I will use it in the future.
Labels:
top
Saturday, March 9, 2013
Recommended books about Shoah
The first is my favourite book, the first book I would grab if I were to be sent into exile on a desert island. It's more a meditation on human nature than a detailed account of the Buna-Monowitz work camp: http://en.wikipedia.org/wiki/If_This_Is_a_Man (Ist das ein Mensch?, Se questo è un uomo) by Primo Levi,
probably one of the most intelligent men of his generation, a chemist and a man of letters. They say he committed suicide in 1987 but it's bullshit: he was under medication, lost his balance and fell. The companion book "The Truce" is also well worth reading, telling how after the liberation by the Red Army he came back from Russia to Italy basically on foot.
Another book I am reading is From the Ashes of Sobibor, by Thomas Blatt; the literary style is very simple but it really captures you. From it you learn that most Ukrainian and Polish people, mostly Catholic, were very happy to denounce and rob Jews; the exceptions were very rare.
Then there is Five Chimneys, a sober and detailed account of a woman's struggle for survival in Birkenau. Maybe not as good as the previous book, because it focuses exclusively on the details of the concentration camp, leaving aside any political analysis, but still a very valuable and sobering source of information.
Excellent also is Inside the Gas Chambers: Eight Months in the Sonderkommando of Auschwitz, by Shlomo Venezia, a dry and concise account of the struggle to stay human in the camp.
Worth reading is Adam Czerniaków's diary from the Warsaw Ghetto, a stern day-to-day account of the German occupation of Warsaw and the segregation of the Jews in the Ghetto.
Labels:
books
Chronic calf spasm, flat feet
I suffer (a lot) from this debilitating condition, mainly due to flat feet.
I found this excellent thread http://www.letsrun.com/forum/flat_read.php?thread=1448905, which goes on forever, so I'll try to consolidate the possible treatments here:
ASHTANGA YOGA, SUN SALUTATION :
TheStick https://www.thestick.com/
Diet: have Potassium Chloride, bananas, avoid coffee and alcohol
RICE (Rest, Ice, Compression, Elevation): cycles of hot and cold with massage using heating balm creams
deep tissue massage, advanced release technique
use the right (?) kind of running shoes - sometimes arch support helps
run/walk bare foot
warm up (bicycle) before exercise
strengthen your calves and adductors:
Tuesday, March 5, 2013
Triggers in Oracle DB
CREATE TABLE "PVTEST" ("PVNAME" VARCHAR2(20 BYTE));
CREATE TABLE "PVLOG" ("PVSUMMARY" VARCHAR2(20 BYTE));

CREATE OR REPLACE TRIGGER PVTEST_TRIGGER
BEFORE DELETE OR INSERT OR UPDATE ON PVTEST
FOR EACH ROW
BEGIN
  insert into PVLOG (PVSUMMARY) values ('pippo');
END;
With this trigger, any modification will generate an extra entry in the PVLOG table.
Now I want to distinguish the operation being done:
CREATE OR REPLACE TRIGGER PVTEST_TRIGGER
BEFORE DELETE OR INSERT OR UPDATE ON PVTEST
FOR EACH ROW
BEGIN
  IF DELETING THEN
    insert into PVLOG (PVSUMMARY) values ('DELETING');
  END IF;
  IF INSERTING THEN
    insert into PVLOG (PVSUMMARY) values ('INSERTING');
  END IF;
  IF UPDATING THEN
    insert into PVLOG (PVSUMMARY) values ('UPDATING');
  END IF;
END;
/

In the case of an UPDATE, you have the two variables :OLD and :NEW pointing to the old and new record.
The real pity is that there doesn't seem to be a way to declare a trigger for ANY table in the schema and retrieve the table being affected through a :TABLE variable. You can define SCHEMA TRIGGERS, but they cannot be defined to catch INSERT, DELETE or UPDATE events.
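The same per-event idea can be tried out in minutes with SQLite from Python (a sketch, not Oracle: SQLite has no DELETING/INSERTING/UPDATING predicates, so it needs one trigger per event, though it does offer OLD and NEW like Oracle's :OLD and :NEW):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE PVTEST (PVNAME TEXT);
CREATE TABLE PVLOG  (PVSUMMARY TEXT);
CREATE TRIGGER PVTEST_INS BEFORE INSERT ON PVTEST
BEGIN INSERT INTO PVLOG VALUES ('INSERTING'); END;
CREATE TRIGGER PVTEST_UPD BEFORE UPDATE ON PVTEST
BEGIN INSERT INTO PVLOG VALUES ('UPDATING'); END;
CREATE TRIGGER PVTEST_DEL BEFORE DELETE ON PVTEST
BEGIN INSERT INTO PVLOG VALUES ('DELETING'); END;
""")

# Every modification of PVTEST leaves a row in PVLOG, as with the Oracle trigger.
conn.execute("INSERT INTO PVTEST VALUES ('pippo')")
conn.execute("UPDATE PVTEST SET PVNAME = 'pluto'")
conn.execute("DELETE FROM PVTEST")

log_rows = [r[0] for r in conn.execute("SELECT PVSUMMARY FROM PVLOG")]
print(log_rows)  # ['INSERTING', 'UPDATING', 'DELETING']
```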
Axis 1.6.1 or Flash Builder 4.6: Unsupported content Simple Content
if you get "log4j:WARN Please initialize the log4j system properly",
create the file log4j.properties in C:\apps\axis2-1.6.1 (AXIS_HOME):
log4j.rootLogger=debug, stdout
# Direct log messages to stdout
log4j.appender.stdout=org.apache.log4j.ConsoleAppender
log4j.appender.stdout.Target=System.out
log4j.appender.stdout.layout=org.apache.log4j.PatternLayout
log4j.appender.stdout.layout.ConversionPattern=%d{yyyy-MM-dd HH:mm:ss} %-5p %c{1}:%L - %m%n

After that, I see that by running:
wsdl2java -d adb -uri yourwsdluri
I get
2013-03-05 11:41:46 DEBUG ProjectResourceBundle:70 - org.apache.axis2.schema.i18n.resource::handleGetObject(schema.unsupportedcontenterror)
Exception in thread "main" org.apache.axis2.wsdl.codegen.CodeGenerationException: java.lang.RuntimeException: java.lang.reflect.InvocationTargetException
    at org.apache.axis2.wsdl.codegen.CodeGenerationEngine.generate(CodeGenerationEngine.java:293)
    at org.apache.axis2.wsdl.WSDL2Code.main(WSDL2Code.java:35)
    at org.apache.axis2.wsdl.WSDL2Java.main(WSDL2Java.java:24)
Caused by: java.lang.RuntimeException: java.lang.reflect.InvocationTargetException
    at org.apache.axis2.wsdl.codegen.extension.SimpleDBExtension.engage(SimpleDBExtension.java:53)
    at org.apache.axis2.wsdl.codegen.CodeGenerationEngine.generate(CodeGenerationEngine.java:246)
    ... 2 more
Caused by: java.lang.reflect.InvocationTargetException
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:597)
    at org.apache.axis2.wsdl.codegen.extension.SimpleDBExtension.engage(SimpleDBExtension.java:50)
    ... 3 more
Caused by: org.apache.axis2.schema.SchemaCompilationException: Unsupported content Simple Content !
    at org.apache.axis2.schema.SchemaCompiler.copyMetaInfoHierarchy(SchemaCompiler.java:1412)
    at org.apache.axis2.schema.SchemaCompiler.processComplexContent(SchemaCompiler.java:1278)
    at org.apache.axis2.schema.SchemaCompiler.processContentModel(SchemaCompiler.java:1227)
    at org.apache.axis2.schema.SchemaCompiler.processComplexType(SchemaCompiler.java:1171)
    at org.apache.axis2.schema.SchemaCompiler.processNamedComplexSchemaType(SchemaCompiler.java:1091)
    at org.apache.axis2.schema.SchemaCompiler.processSchema(SchemaCompiler.java:1005)
    at org.apache.axis2.schema.SchemaCompiler.processElement(SchemaCompiler.java:644)
    at org.apache.axis2.schema.SchemaCompiler.processElement(SchemaCompiler.java:614)
    at org.apache.axis2.schema.SchemaCompiler.compile(SchemaCompiler.java:422)
    at org.apache.axis2.schema.SchemaCompiler.compile(SchemaCompiler.java:381)
    at org.apache.axis2.schema.SchemaCompiler.compile(SchemaCompiler.java:381)
    at org.apache.axis2.schema.SchemaCompiler.compile(SchemaCompiler.java:381)
    at org.apache.axis2.schema.SchemaCompiler.compile(SchemaCompiler.java:291)
    at org.apache.axis2.schema.ExtensionUtility.invoke(ExtensionUtility.java:102)
    ... 8 more
while if I use -d xmlbeans, all goes fine.
I see this code:
XmlSchemaType type = resolvedSchema.getTypeByName(baseTypeName);
XmlSchemaComplexType complexType = (XmlSchemaComplexType) type;
XmlSchemaContentModel content = complexType.getContentModel();
....
if (content instanceof XmlSchemaSimpleContent) {
    throw new SchemaCompilationException(SchemaCompilerMessages.getMessage("schema.unsupportedcontenterror", "Simple Content"));
This org.apache.axis2.schema.SchemaCompiler is in axis2-adb-codegen-1.6.1.jar
I have downloaded the source code, and added:
try {
    log.warn("copyMetaInfoHierarchy " + baseTypeName.toString() + " " + parentSchema.toString());
    System.out.println("copyMetaInfoHierarchy " + baseTypeName.toString() + " " + parentSchema.toString());
} catch (Throwable t) {
    log.error("copyMetaInfoHierarchy", t);
    t.printStackTrace();
}
at the beginning of copyMetaInfoHierarchy (I also had to make sure that parentSchema is declared as org.apache.ws.commons.schema.XmlSchema).
I found out that the offending element is a
<xsd:complexType name="ItemClassificationCodeType">
  <xsd:simpleContent>
    <xsd:extension base="tns:CodeType" />
  </xsd:simpleContent>
</xsd:complexType>

<xsd:complexType name="CodeType">
  <xsd:simpleContent>
    <xsd:extension base="xsd:normalizedString">
      <xsd:attribute name="listID" type="xsd:token" use="optional">
      </xsd:attribute>
      <xsd:attribute name="listAgencyID" type="xsd:anyURI" use="optional">
      </xsd:attribute>
    </xsd:extension>
  </xsd:simpleContent>
</xsd:complexType>
I think adb doesn't like the simpleContent http://www.w3schools.com/schema/el_simpleContent.asp, and this is the JIRA issue (July 2012) :
https://issues.apache.org/jira/browse/AXIS2-5357
it looks like even Axis2 1.6.2 (April 2012) doesn't handle this syntax.
I bet my ass that Flash Builder is using Axis2 under the covers.
Labels:
Axis2
Monday, March 4, 2013
OSB: project level vs resource level export
In a sbconfig.jar, the ExportInfo entry contains a metadata section <imp:properties xmlns:imp="http://www.bea.com/wli/config/importexport"> holding a "projectLevelExport" property (true/false).
This value affects the way a project is imported with ALSBConfigurationMBean.importUploaded: http://docs.oracle.com/cd/E17904_01/apirefs.1111/e15033/com/bea/wli/sb/management/importexport/ALSBImportPlan.html,
If the config jar file was exported at the resource level: if a resource is not in the jar file but is in the domain: Skip (no-op).
If the config jar file was exported at the project level (this behavior affects only the resources in the projects found in the jar file): if a resource is not in the jar file but is in the domain: Delete.
There is even a method to test this value http://docs.oracle.com/cd/E17904_01/apirefs.1111/e15033/com/bea/wli/sb/management/importexport/ALSBJarInfo.html#isProjectLevelExport__
There is no way to set it, so you MUST generate the sbconfig.jar with projectLevelExport already set to true.
The command by which we export the resources (project) is:
java -Xms384m -Xmx768m -XX:MaxPermSize=256M \
  -Dosgi.bundlefile.limit=1000 -Dosgi.nl=en_US \
  -Dosb.home=${osb_home} -Dweblogic.home=${weblogic_home} \
  -Dharvester.home=${harvester_home} \
  -Dsun.lang.ClassLoader.allowArraySyntax=true \
  -jar ${eclipse_home}/plugins/org.eclipse.equinox.launcher_1.1.1.R36x_v20101122_1400.jar \
  -data ${bamboo_builddir} -application com.bea.alsb.core.ConfigExport \
  -configProject ${configProject} -configJar ${tempdir}sbconfig.jar \
  -configSubProjects "${PROJECT}" -includeDependencies true 2> ${errorlog}
http://docs.oracle.com/cd/E21764_01/doc.1111/e15866/tasks.htm#i1130159
I really have no clue on how to control the value of that parameter. All I see is that by manually changing it from
imp:property name="projectLevelExport" value="false"
to
imp:property name="projectLevelExport" value="true"
in the ExportInfo, the way the project is imported changes accordingly.
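Since the API only exposes a getter, one way to automate the manual edit above is to rewrite the ExportInfo entry while copying the jar (a sketch I have not run against a real sbconfig.jar; the entry name and attribute layout are assumptions based on the snippet above):

```python
import re
import zipfile

def set_project_level_export(jar_in, jar_out, value=True):
    """Copy an sbconfig.jar, flipping the projectLevelExport property on the way.

    Zip entries cannot be edited in place, so every entry is copied and the
    ExportInfo entry (name assumed from the export format above) is patched.
    """
    flag = b"true" if value else b"false"
    with zipfile.ZipFile(jar_in) as src, zipfile.ZipFile(jar_out, "w") as dst:
        for item in src.infolist():
            data = src.read(item.filename)
            if item.filename.endswith("ExportInfo"):
                data = re.sub(
                    rb'(name="projectLevelExport"\s+value=")(?:true|false)(")',
                    rb"\g<1>" + flag + rb"\g<2>",
                    data,
                )
            dst.writestr(item.filename, data)
```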
Labels:
osbimportexport