Re: Hdfs agent not created
incubator-ranger-user | Mahesh Sankaran

org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: BLOCK* processReport: Received first block report from DatanodeStorage[DS-7989baef-c501-4a7a-b586-0f943444e099,DISK,NORMAL] after starting up or becoming active. Its block contents are no longer considered stale
2015-01-14 11:03:13,966 INFO BlockStateChange: BLOCK* processReport: from storage DS-7989baef-c501-4a7a-b586-0f943444e099 node DatanodeRegistration(10.10.10.63, datanodeUuid=e3c24b88-cb98-4a74-8c5f-fee8dba99898, infoPort=50075, ipcPort=50020, storageInfo=lv=-56;cid=CID-46a6e78b-efc4-4dc2-aabe-076bf811d759;nsid=270630615;c=0), blocks: 0, hasStaleStorages: false, processing time: 11 msecs
2015-01-14 11:03:38,349 INFO org.apache.hadoop.hdfs.server.blockmanagement.CacheReplicationMonitor: Rescanning after 30000 milliseconds
2015-01-14 11:03:38,350 INFO org.apache.hadoop.hdfs.server.blockmanagement.CacheReplicationMonitor: Scanned 0 directive(s) and 0 block(s) in 1 millisecond(s).
2015-01-14 11:03:57,100 INFO logs: Aliases are enabled

Thanks
Mahesh.S


On Wed, Jan 14, 2015 at 10:41 AM, Gautam Borad <gborad@gmail.com> wrote:

Hi Mahesh,
We will need the namenode logs to debug this further. Can you restart the namenode and paste its logs somewhere for us to analyze? Thanks.

On Wed, Jan 14, 2015 at 10:31 AM, Mahesh Sankaran <sankarmahesh37@gmail.com> wrote:

Hi Ramesh,
I didn't see any exception in the hdfs logs. My problem is that the agent for hdfs is not created.

Regards,
Mahesh.S

On Tue, Jan 13, 2015 at 8:50 PM, Ramesh Mani <rmani@hortonworks.com> wrote:

Hi Mahesh,
The error you are seeing is just a notice that the parent folder of the resource you are creating doesn't have read permission for the user for whom you are creating the policy.
When you start the hdfs namenode and secondary namenode, do you see any exception in the hdfs logs?

Regards,
Ramesh

On Jan 13, 2015, at 4:13 AM, Mahesh Sankaran <sankarmahesh37@gmail.com> wrote:

Hi all,

I successfully configured Ranger admin and usersync. Now I am trying to configure the hdfs plugin. My steps were the following:

1. Created repository testhdfs.
2. cd /usr/local
3. sudo tar zxf ~/dev/ranger/target/ranger-0.4.0-hdfs-plugin.tar.gz
4. sudo ln -s ranger-0.4.0-hdfs-plugin ranger-hdfs-plugin
5. cd ranger-hdfs-plugin
6. vi install.properties
   POLICY_MGR_URL=http://IP:6080
   REPOSITORY_NAME=testhdfs
   XAAUDIT.DB.HOSTNAME=localhost
   XAAUDIT.DB.DATABASE_NAME=ranger
   XAAUDIT.DB.USER_NAME=rangerlogger
   XAAUDIT.DB.PASSWORD=rangerlogger
7. cd /usr/local/hadoop
8. ln -s /usr/local/hadoop/etc/hadoop conf
9. export HADOOP_HOME=/usr/local/hadoop
10. cd /usr/local/ranger-hdfs-plugin
11. ./enable-hdfs-plugin.sh
12. cp /usr/local/hadoop/lib/* /usr/local/hadoop/share/hadoop/hdfs/lib/
13. vi xasecure-audit.xml
    <property>
        <name>xasecure.audit.jpa.javax.persistence.jdbc.url</name>
        <value>jdbc:mysql://localhost/ranger</value>
    </property>
    <property>
        <name>xasecure.audit.jpa.javax.persistence.jdbc.user</name>
        <value>rangerlogger</value>
    </property>
    <property>
        <name>xasecure.audit.jpa.javax.persistence.jdbc.password</name>
        <value>rangerlogger</value>
    </property>
14. Restarted hadoop.

[The commands in steps 2-12 are collected into a single shell sketch after this thread.]

When I look at the Ranger Admin web interface -> Audit -> Agents, the agent is not created. Did I miss any steps?

NOTE: I am not using HDP.

Here is my xa_portal.log:

2015-01-13 15:16:45,901 [localhost-startStop-1] INFO org.springframework.core.io.support.PropertiesLoaderSupport (PropertiesLoaderSupport.java:177) - Loading properties file from class path resource [xa_default.properties]
2015-01-13 15:16:45,932 [localhost-startStop-1] INFO org.springframework.core.io.support.PropertiesLoaderSupport (PropertiesLoaderSupport.java:177) - Loading properties file from class path resource [xa_system.properties]
2015-01-13 15:16:45,965 [localhost-startStop-1] INFO org.springframework.core.io.support.PropertiesLoaderSupport (PropertiesLoaderSupport.java:177) - Loading properties file from class path resource [xa_custom.properties]
2015-01-13 15:16:45,978 [localhost-startStop-1] INFO org.springframework.core.io.support.PropertiesLoaderSupport (PropertiesLoaderSupport.java:177) - Loading properties file from class path resource [xa_ldap.properties]
2015-01-13 15:16:46,490 [localhost-startStop-1] WARN org.apache.hadoop.util.NativeCodeLoader (NativeCodeLoader.java:62) - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
2015-01-13 15:16:47,417 [localhost-startStop-1] INFO org.springframework.core.io.support.PropertiesLoaderSupport (PropertiesLoaderSupport.java:177) - Loading properties file from class path resource [db_message_bundle.properties]
2015-01-13 15:17:13,721 [http-bio-6080-exec-8] INFO org.apache.ranger.security.listener.SpringEventListener (SpringEventListener.java:69) - Login Successful:admin | Ip Address:10.10.10.53 | sessionId=830B2C1BC6F34346950710576AD40A12
2015-01-13 15:17:14,362 [http-bio-6080-exec-8] INFO org.apache.ranger.biz.SessionMgr (SessionMgr.java:334) - admin is a valid user
2015-01-13 15:17:14,491 [http-bio-6080-exec-10] INFO org.apache.ranger.biz.SessionMgr (SessionMgr.java:140) - Login Success: loginId=admin, sessionId=10, sessionId=830B2C1BC6F34346950710576AD40A12, requestId=10.10.10.53
2015-01-13 15:17:16,517 [http-bio-6080-exec-2] INFO org.apache.ranger.service.filter.RangerRESTAPIFilter (RangerRESTAPIFilter.java:246) - Done rearranging. loopCount=0
2015-01-13 15:17:16,518 [http-bio-6080-exec-2] INFO org.apache.ranger.service.filter.RangerRESTAPIFilter (RangerRESTAPIFilter.java:254) - Loaded 0 API methods.
2015-01-13 15:27:58,797 [http-bio-6080-exec-10] INFO org.apache.ranger.rest.UserREST (UserREST.java:186) - create:nfsnobody@bigdata
2015-01-13 15:30:32,173 [localhost-startStop-1] INFO org.springframework.core.io.support.PropertiesLoaderSupport (PropertiesLoaderSupport.java:177) - Loading properties file from class path resource [xa_default.properties]
2015-01-13 15:30:32,179 [localhost-startStop-1] INFO org.springframework.core.io.support.PropertiesLoaderSupport (PropertiesLoaderSupport.java:177) - Loading properties file from class path resource [xa_system.properties]
2015-01-13 15:30:32,180 [localhost-startStop-1] INFO org.springframework.core.io.support.PropertiesLoaderSupport (PropertiesLoaderSupport.java:177) - Loading properties file from class path resource [xa_custom.properties]
2015-01-13 15:30:32,180 [localhost-startStop-1] INFO org.springframework.core.io.support.PropertiesLoaderSupport (PropertiesLoaderSupport.java:177) - Loading properties file from class path resource [xa_ldap.properties]
2015-01-13 15:30:33,049 [localhost-startStop-1] WARN org.apache.hadoop.util.NativeCodeLoader (NativeCodeLoader.java:62) - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
2015-01-13 15:30:34,179 [localhost-startStop-1] INFO org.springframework.core.io.support.PropertiesLoaderSupport (PropertiesLoaderSupport.java:177) - Loading properties file from class path resource [db_message_bundle.properties]
2015-01-13 15:30:44,588 [http-bio-6080-exec-1] INFO org.apache.ranger.service.filter.RangerRESTAPIFilter (RangerRESTAPIFilter.java:246) - Done rearranging. loopCount=0
2015-01-13 15:30:44,589 [http-bio-6080-exec-1] INFO org.apache.ranger.service.filter.RangerRESTAPIFilter (RangerRESTAPIFilter.java:254) - Loaded 0 API methods.
2015-01-13 15:31:18,236 [http-bio-6080-exec-5] INFO org.apache.ranger.security.listener.SpringEventListener (SpringEventListener.java:69) - Login Successful:admin | Ip Address:10.10.10.53 | sessionId=881E59FF1E0E5F2940A0CECC3826FAA0
2015-01-13 15:31:18,270 [http-bio-6080-exec-5] INFO org.apache.ranger.biz.SessionMgr (SessionMgr.java:334) - admin is a valid user
2015-01-13 15:31:18,326 [http-bio-6080-exec-4] INFO org.apache.ranger.biz.SessionMgr (SessionMgr.java:140) - Login Success: loginId=admin, sessionId=11, sessionId=881E59FF1E0E5F2940A0CECC3826FAA0, requestId=10.10.10.53
2015-01-13 15:46:42,554 [http-bio-6080-exec-8] INFO org.apache.ranger.security.listener.SpringEventListener (SpringEventListener.java:69) - Login Successful:admin | Ip Address:10.10.10.53 | sessionId=375249EFD0513D997E0BDF64A288DFCD
2015-01-13 15:46:42,559 [http-bio-6080-exec-8] INFO org.apache.ranger.biz.SessionMgr (SessionMgr.java:334) - admin is a valid user
2015-01-13 15:46:43,858 [http-bio-6080-exec-8] INFO org.apache.ranger.biz.SessionMgr (SessionMgr.java:140) - Login Success: loginId=admin, sessionId=12, sessionId=375249EFD0513D997E0BDF64A288DFCD, requestId=10.10.10.53
2015-01-13 15:47:00,201 [http-bio-6080-exec-2] INFO apache.ranger.hadoop.client.config.BaseClient (BaseClient.java:104) - Init Login: security not enabled, using username
2015-01-13 15:47:00,291 [http-bio-6080-exec-2] WARN org.apache.hadoop.fs.FileSystem (FileSystem.java:327) - "bigdata:9000" is a deprecated filesystem name. Use "hdfs://bigdata:9000/" instead.
2015-01-13 15:52:54,052 [http-bio-6080-exec-2] ERROR org.apache.ranger.db.RangerDaoManager (RangerDaoManager.java:53) - RangerDaoManager.getEntityManager(loggingPU)
2015-01-13 16:03:06,816 [http-bio-6080-exec-2] INFO apache.ranger.hadoop.client.config.BaseClient (BaseClient.java:104) - Init Login: security not enabled, using username
2015-01-13 16:03:06,874 [http-bio-6080-exec-2] WARN org.apache.hadoop.fs.FileSystem (FileSystem.java:327) - "bigdata:9000" is a deprecated filesystem name. Use "hdfs://bigdata:9000/" instead.
2015-01-13 16:03:20,740 [http-bio-6080-exec-4] WARN org.apache.hadoop.fs.FileSystem (FileSystem.java:327) - "bigdata:9000" is a deprecated filesystem name. Use "hdfs://bigdata:9000/" instead.
2015-01-13 16:03:20,790 [http-bio-6080-exec-4] WARN org.apache.hadoop.fs.FileSystem (FileSystem.java:327) - "bigdata:9000" is a deprecated filesystem name. Use "hdfs://bigdata:9000/" instead.
2015-01-13 16:03:48,636 [http-bio-6080-exec-4] WARN org.apache.hadoop.fs.FileSystem (FileSystem.java:327) - "bigdata:9000" is a deprecated filesystem name. Use "hdfs://bigdata:9000/" instead.
2015-01-13 16:03:48,680 [http-bio-6080-exec-4] WARN org.apache.hadoop.fs.FileSystem (FileSystem.java:327) - "bigdata:9000" is a deprecated filesystem name. Use "hdfs://bigdata:9000/" instead.
2015-01-13 16:03:51,062 [http-bio-6080-exec-4] WARN org.apache.hadoop.fs.FileSystem (FileSystem.java:327) - "bigdata:9000" is a deprecated filesystem name. Use "hdfs://bigdata:9000/" instead.
2015-01-13 16:03:51,110 [http-bio-6080-exec-4] WARN org.apache.hadoop.fs.FileSystem (FileSystem.java:327) - "bigdata:9000" is a deprecated filesystem name. Use "hdfs://bigdata:9000/" instead.
2015-01-13 16:03:57,174 [http-bio-6080-exec-8] INFO org.apache.ranger.common.RESTErrorUtil (RESTErrorUtil.java:64) - Request failed. SessionId=12, loginId=admin, logMessage=Mahesh may not have read permission on parent folder. Do you want to save this policy?
javax.ws.rs.WebApplicationException
    at org.apache.ranger.common.RESTErrorUtil.createRESTException(RESTErrorUtil.java:55)
    at org.apache.ranger.common.RESTErrorUtil.createRESTException(RESTErrorUtil.java:264)
    at org.apache.ranger.service.XResourceService.checkAccess(XResourceService.java:546)
    at org.apache.ranger.biz.AssetMgr.createXResource(AssetMgr.java:241)
    at org.apache.ranger.rest.AssetREST.createXResource(AssetREST.java:214)
    at org.apache.ranger.rest.AssetREST$$FastClassByCGLIB$$8cffcb6d.invoke(<generated>)
    at net.sf.cglib.proxy.MethodProxy.invoke(MethodProxy.java:191)
    at org.springframework.aop.framework.Cglib2AopProxy$CglibMethodInvocation.invokeJoinpoint(Cglib2AopProxy.java:689)
    at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:150)
    at org.springframework.transaction.interceptor.TransactionInterceptor.invoke(TransactionInterceptor.java:110)
    at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:172)
    at org.springframework.aop.framework.Cglib2AopProxy$DynamicAdvisedInterceptor.intercept(Cglib2AopProxy.java:622)
    at org.apache.ranger.rest.AssetREST$$EnhancerByCGLIB$$65ef778b.createXResource(<generated>)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
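For quick reference, the plugin-side commands from the original post can be collapsed into one shell session. This is only a sketch of what the thread describes, assuming the same paths, the Ranger 0.4.0 tarball location, and the install.properties values quoted above; it is not an authoritative installation procedure.

    # Sketch of the Ranger 0.4.0 HDFS plugin setup described in the post.
    # Assumed paths: Hadoop in /usr/local/hadoop, tarball under ~/dev/ranger/target.
    cd /usr/local
    sudo tar zxf ~/dev/ranger/target/ranger-0.4.0-hdfs-plugin.tar.gz
    sudo ln -s ranger-0.4.0-hdfs-plugin ranger-hdfs-plugin

    # Edit ranger-hdfs-plugin/install.properties with the values from the post:
    #   POLICY_MGR_URL=http://IP:6080
    #   REPOSITORY_NAME=testhdfs
    #   XAAUDIT.DB.HOSTNAME=localhost
    #   XAAUDIT.DB.DATABASE_NAME=ranger
    #   XAAUDIT.DB.USER_NAME=rangerlogger
    #   XAAUDIT.DB.PASSWORD=rangerlogger

    # Make the Hadoop configuration visible to the enable script, then run it.
    cd /usr/local/hadoop
    ln -s /usr/local/hadoop/etc/hadoop conf
    export HADOOP_HOME=/usr/local/hadoop
    cd /usr/local/ranger-hdfs-plugin
    ./enable-hdfs-plugin.sh

    # Copy the plugin jars where the NameNode loads them, then edit
    # xasecure-audit.xml with the JDBC settings and restart Hadoop.
    cp /usr/local/hadoop/lib/* /usr/local/hadoop/share/hadoop/hdfs/lib/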

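Since the symptom is an empty Audit -> Agents tab, two quick checks follow from the configuration above. These are hedged suggestions rather than steps taken in the thread, and the jar-name pattern in the first check is an assumption (exact file names depend on the build).

    # Check that the plugin libraries copied in step 12 are actually on the
    # NameNode's HDFS classpath (jar names vary; 'ranger|xasecure' is a guess).
    ls /usr/local/hadoop/share/hadoop/hdfs/lib/ | grep -iE 'ranger|xasecure'

    # Check that the audit database credentials from xasecure-audit.xml work
    # from the NameNode host, using the values quoted in the thread.
    mysql -h localhost -u rangerlogger -prangerlogger ranger -e "SHOW TABLES;"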

  5. Re: Hdfs agent not created

    incubator-ranger-user | 2 years ago | Mahesh Sankaran

  6. Re: Hdfs agent not created

    apache.org | 11 months ago


    Root Cause Analysis

    1. javax.ws.rs.WebApplicationException

      Request failed. SessionId=12, loginId=admin, logMessage=Mahesh may not have read permission on parent folder. Do you want to save this policy?

      at org.apache.ranger.common.RESTErrorUtil.createRESTException(RESTErrorUtil.java:55)
      at org.apache.ranger.common.RESTErrorUtil.createRESTException(RESTErrorUtil.java:264)
      at org.apache.ranger.service.XResourceService.checkAccess(XResourceService.java:546)
      at org.apache.ranger.biz.AssetMgr.createXResource(AssetMgr.java:241)
      at org.apache.ranger.rest.AssetREST.createXResource(AssetREST.java:214)
      at org.apache.ranger.rest.AssetREST$$FastClassByCGLIB$$8cffcb6d.invoke(<generated>)
      at net.sf.cglib.proxy.MethodProxy.invoke(MethodProxy.java:191)
      at org.springframework.aop.framework.Cglib2AopProxy$CglibMethodInvocation.invokeJoinpoint(Cglib2AopProxy.java:689)
      at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:150)
      at org.springframework.transaction.interceptor.TransactionInterceptor.invoke(TransactionInterceptor.java:110)
      at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:172)
      at org.springframework.aop.framework.Cglib2AopProxy$DynamicAdvisedInterceptor.intercept(Cglib2AopProxy.java:622)
      at org.apache.ranger.rest.AssetREST$$EnhancerByCGLIB$$65ef778b.createXResource(<generated>)
      at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
      at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
      at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
      at java.lang.reflect.Method.invoke(Method.java:606)

      at javax.servlet.http.HttpServlet.service()
    2. JavaServlet
      HttpServlet.service
      1. javax.servlet.http.HttpServlet.service(HttpServlet.java:727)
      1 frame
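The javax.ws.rs.WebApplicationException in the root cause above is raised by Ranger Admin's policy-save check (the XResourceService.checkAccess frame), and its message reports that the user named in the policy (Mahesh) may not have read permission on the parent folder of the resource path; the "Do you want to save this policy?" phrasing suggests it is a confirmation prompt rather than a hard failure. If the underlying HDFS permission does need checking, a minimal sketch with standard HDFS shell commands (the path /apps/data below is a placeholder, not one taken from the thread):

    # show the permission bits on the parent directory itself (placeholder path)
    hdfs dfs -ls -d /apps/data

    # optionally grant read and execute to others on that directory, run as the HDFS superuser
    sudo -u hdfs hdfs dfs -chmod o+rx /apps/data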